Apr 16 22:11:05.400981 ip-10-0-141-169 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 22:11:05.400992 ip-10-0-141-169 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 22:11:05.400999 ip-10-0-141-169 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 22:11:05.401271 ip-10-0-141-169 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 22:11:15.646704 ip-10-0-141-169 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 22:11:15.646722 ip-10-0-141-169 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 5af1906e6954453abb2f8459e00529a6 --
Apr 16 22:13:46.334710 ip-10-0-141-169 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 22:13:46.776884 ip-10-0-141-169 kubenswrapper[2565]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:13:46.776884 ip-10-0-141-169 kubenswrapper[2565]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 22:13:46.776884 ip-10-0-141-169 kubenswrapper[2565]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:13:46.776884 ip-10-0-141-169 kubenswrapper[2565]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 22:13:46.776884 ip-10-0-141-169 kubenswrapper[2565]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
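
Note on the records above: the 22:11 failures come from systemd itself, before the kubelet binary ever runs. An environment file referenced by kubelet.service is missing, so the 'start-pre' step cannot be launched and the unit fails with result 'resources'; the follow-up restart cannot even be scheduled because crio.service is not loaded on that boot. After the reboot (the -- Boot ... -- marker) the service starts, and the "Flag ... has been deprecated" notices are advisory only: each flag maps to a field in the KubeletConfiguration file named by --config, which the FLAG dump below shows as /etc/kubernetes/kubelet.conf. A minimal sketch of how to confirm the failure on the node, assuming shell access; beyond kubelet.service and crio.service, no unit or file names are confirmed by this log:

  systemctl cat kubelet.service | grep -E 'EnvironmentFile|ExecStartPre'  # which env file and pre-start step the unit declares
  systemctl list-unit-files 'crio*'                                       # whether crio.service exists on this boot
  journalctl -b -1 -u kubelet.service --no-pager | tail -n 20             # the failed attempts from the previous boot
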
Apr 16 22:13:46.779206 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.779109 2565 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 22:13:46.781527 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781511 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:46.781527 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781527 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:46.781590 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781530 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:46.781590 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781535 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:46.781590 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781538 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:46.781590 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781541 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:46.781590 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781544 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:46.781590 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781546 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:46.781590 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781549 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:46.781590 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781552 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:46.781590 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781554 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:46.781590 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781557 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:46.781590 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781561 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:46.781590 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781564 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:46.781590 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781567 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:46.781590 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781569 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:46.781590 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781572 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:46.781590 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781574 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:46.781590 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781577 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:46.781590 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781580 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:46.781590 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781582 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:46.782065 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781585 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:46.782065 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781589 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:13:46.782065 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781593 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:46.782065 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781597 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:46.782065 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781600 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:46.782065 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781603 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:46.782065 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781606 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:46.782065 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781608 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:46.782065 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781611 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:46.782065 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781613 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:46.782065 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781616 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:46.782065 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781618 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:46.782065 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781621 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:46.782065 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781623 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:46.782065 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781626 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:46.782065 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781629 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:46.782065 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781631 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:46.782065 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781634 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:46.782065 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781636 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:46.782065 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781639 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:46.782676 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781644 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:46.782676 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781647 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:46.782676 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781650 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:46.782676 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781652 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:46.782676 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781655 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:46.782676 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781658 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:46.782676 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781660 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:46.782676 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781663 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:46.782676 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781665 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:46.782676 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781668 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:46.782676 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781670 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:46.782676 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781673 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:46.782676 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781676 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:46.782676 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781679 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:46.782676 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781682 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:46.782676 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781684 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:46.782676 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781687 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:46.782676 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781690 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:46.782676 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781692 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:46.782676 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781695 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:46.783168 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781697 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:46.783168 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781700 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:46.783168 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781702 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:46.783168 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781705 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:46.783168 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781708 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:46.783168 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781710 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:46.783168 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781712 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:46.783168 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781715 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:46.783168 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781718 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:46.783168 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781720 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:46.783168 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781723 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:46.783168 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781726 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:46.783168 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781728 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:46.783168 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781732 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:46.783168 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781735 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:46.783168 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781738 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:46.783168 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781740 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:46.783168 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781743 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:46.783168 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781745 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:46.783168 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781748 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:46.783678 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781750 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:46.783678 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781753 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:46.783678 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781755 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:46.783678 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781758 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:46.783678 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.781761 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:46.783678 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782156 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:46.783678 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782161 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:46.783678 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782164 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:46.783678 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782167 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:46.783678 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782170 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:46.783678 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782172 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:46.783678 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782175 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:46.783678 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782178 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:46.783678 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782181 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:46.783678 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782183 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:46.783678 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782187 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:46.783678 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782189 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:46.783678 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782192 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:46.783678 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782195 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:46.783678 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782197 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:46.784151 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782201 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:46.784151 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782204 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:46.784151 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782207 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:46.784151 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782209 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:46.784151 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782212 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:46.784151 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782214 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:46.784151 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782218 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:46.784151 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782220 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:46.784151 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782223 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:46.784151 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782225 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:46.784151 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782228 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:46.784151 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782230 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:46.784151 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782233 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:46.784151 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782235 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:46.784151 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782238 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:46.784151 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782240 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:46.784151 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782245 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:13:46.784151 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782249 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:46.784151 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782252 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:46.784670 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782255 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:46.784670 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782258 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:46.784670 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782261 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:46.784670 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782264 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:46.784670 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782267 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:46.784670 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782269 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:46.784670 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782272 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:46.784670 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782275 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:46.784670 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782277 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:46.784670 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782281 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:46.784670 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782283 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:46.784670 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782285 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:46.784670 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782288 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:46.784670 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782290 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:46.784670 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782293 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:46.784670 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782295 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:46.784670 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782298 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:46.784670 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782300 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:46.784670 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782302 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:46.784670 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782305 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:46.784670 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782307 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:46.785190 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782311 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:46.785190 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782313 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:46.785190 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782316 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:46.785190 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782318 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:46.785190 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782321 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:46.785190 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782324 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:46.785190 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782327 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:46.785190 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782330 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:46.785190 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782332 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:46.785190 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782335 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:46.785190 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782337 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:46.785190 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782340 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:46.785190 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782342 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:46.785190 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782345 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:46.785190 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782347 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:46.785190 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782350 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:46.785190 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782352 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:46.785190 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782355 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:46.785190 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782357 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:46.785190 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782360 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:46.785697 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782362 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:46.785697 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782366 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:46.785697 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782368 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:46.785697 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782370 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:46.785697 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782373 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:46.785697 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782375 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:46.785697 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782379 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:46.785697 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782382 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:46.785697 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782384 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:46.785697 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782387 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:46.785697 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.782390 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:46.785697 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783101 2565 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 22:13:46.785697 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783111 2565 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 22:13:46.785697 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783118 2565 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 22:13:46.785697 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783122 2565 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 22:13:46.785697 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783127 2565 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 22:13:46.785697 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783130 2565 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 22:13:46.785697 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783135 2565 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 22:13:46.785697 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783140 2565 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 22:13:46.785697 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783143 2565 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 22:13:46.785697 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783146 2565 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 22:13:46.786196 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783150 2565 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 22:13:46.786196 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783153 2565 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 22:13:46.786196 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783157 2565 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 22:13:46.786196 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783159 2565 flags.go:64] FLAG: --cgroup-root=""
Apr 16 22:13:46.786196 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783163 2565 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 22:13:46.786196 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783166 2565 flags.go:64] FLAG: --client-ca-file=""
Apr 16 22:13:46.786196 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783169 2565 flags.go:64] FLAG: --cloud-config=""
Apr 16 22:13:46.786196 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783172 2565 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 22:13:46.786196 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783176 2565 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 22:13:46.786196 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783180 2565 flags.go:64] FLAG: --cluster-domain=""
Apr 16 22:13:46.786196 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783183 2565 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 22:13:46.786196 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783186 2565 flags.go:64] FLAG: --config-dir=""
Apr 16 22:13:46.786196 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783189 2565 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 22:13:46.786196 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783193 2565 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 22:13:46.786196 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783206 2565 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 22:13:46.786196 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783209 2565 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 22:13:46.786196 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783212 2565 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 22:13:46.786196 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783216 2565 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 22:13:46.786196 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783219 2565 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 22:13:46.786196 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783222 2565 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 22:13:46.786196 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783225 2565 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 22:13:46.786196 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783228 2565 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 22:13:46.786196 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783231 2565 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 22:13:46.786196 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783235 2565 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 22:13:46.786196 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783238 2565 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 22:13:46.786813 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783241 2565 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 22:13:46.786813 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783244 2565 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 22:13:46.786813 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783251 2565 flags.go:64] FLAG: --enable-server="true"
Apr 16 22:13:46.786813 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783254 2565 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 22:13:46.786813 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783259 2565 flags.go:64] FLAG: --event-burst="100"
Apr 16 22:13:46.786813 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783262 2565 flags.go:64] FLAG: --event-qps="50"
Apr 16 22:13:46.786813 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783265 2565 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 22:13:46.786813 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783269 2565 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 22:13:46.786813 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783272 2565 flags.go:64] FLAG: --eviction-hard=""
Apr 16 22:13:46.786813 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783276 2565 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 22:13:46.786813 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783279 2565 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 22:13:46.786813 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783282 2565 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 22:13:46.786813 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783285 2565 flags.go:64] FLAG: --eviction-soft=""
Apr 16 22:13:46.786813 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783288 2565 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 22:13:46.786813 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783291 2565 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 22:13:46.786813 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783295 2565 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 22:13:46.786813 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783298 2565 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 22:13:46.786813 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783301 2565 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 22:13:46.786813 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783304 2565 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 22:13:46.786813 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783307 2565 flags.go:64] FLAG: --feature-gates=""
Apr 16 22:13:46.786813 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783311 2565 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 22:13:46.786813 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783315 2565 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 22:13:46.786813 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783318 2565 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 22:13:46.786813 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783322 2565 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 22:13:46.786813 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783325 2565 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 22:13:46.786813 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783328 2565 flags.go:64] FLAG: --help="false"
Apr 16 22:13:46.787457 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783331 2565 flags.go:64] FLAG: --hostname-override="ip-10-0-141-169.ec2.internal"
Apr 16 22:13:46.787457 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783334 2565 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 22:13:46.787457 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783337 2565 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 16 22:13:46.787457 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783340 2565 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 16 22:13:46.787457 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783343 2565 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 16 22:13:46.787457 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783347 2565 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 16 22:13:46.787457 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783350 2565 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 16 22:13:46.787457 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783352 2565 flags.go:64] FLAG: --image-service-endpoint=""
Apr 16 22:13:46.787457 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783357 2565 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 16 22:13:46.787457 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783360 2565 flags.go:64] FLAG: --kube-api-burst="100"
Apr 16 22:13:46.787457 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783363 2565 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 16 22:13:46.787457 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783366 2565 flags.go:64] FLAG: --kube-api-qps="50"
Apr 16 22:13:46.787457 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783369 2565 flags.go:64] FLAG: --kube-reserved=""
Apr 16 22:13:46.787457 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783372 2565 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 16 22:13:46.787457 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783375 2565 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 16 22:13:46.787457 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783378 2565 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 16 22:13:46.787457 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783381 2565 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 16 22:13:46.787457 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783384 2565 flags.go:64] FLAG: --lock-file=""
Apr 16 22:13:46.787457 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783387 2565 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 16 22:13:46.787457 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783390 2565 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 16 22:13:46.787457 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783393 2565 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 16 22:13:46.787457 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783400 2565 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 16 22:13:46.787457 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783403 2565 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 16 22:13:46.788009 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783421 2565 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 16 22:13:46.788009 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783424 2565 flags.go:64] FLAG: --logging-format="text"
Apr 16 22:13:46.788009 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783427 2565 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 16 22:13:46.788009 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783430 2565 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 16 22:13:46.788009 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783433 2565 flags.go:64] FLAG: --manifest-url=""
Apr 16 22:13:46.788009 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783437 2565 flags.go:64] FLAG: --manifest-url-header=""
Apr 16 22:13:46.788009 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783442 2565 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 16 22:13:46.788009 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783445 2565 flags.go:64] FLAG: --max-open-files="1000000"
Apr 16 22:13:46.788009 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783449 2565 flags.go:64] FLAG: --max-pods="110"
Apr 16 22:13:46.788009 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783452 2565 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 16 22:13:46.788009 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783455 2565 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 16 22:13:46.788009 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783458 2565 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 16 22:13:46.788009 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783461 2565 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 16 22:13:46.788009 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783464 2565 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 16 22:13:46.788009 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783467 2565 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 16 22:13:46.788009 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783470 2565 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 16 22:13:46.788009 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783478 2565 flags.go:64] FLAG: --node-status-max-images="50"
Apr 16 22:13:46.788009 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783482 2565 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 16 22:13:46.788009 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783485 2565 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 16 22:13:46.788009 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783489 2565 flags.go:64] FLAG: --pod-cidr=""
Apr 16 22:13:46.788009 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783492 2565 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 16 22:13:46.788009 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783502 2565 flags.go:64] FLAG: --pod-manifest-path=""
Apr 16 22:13:46.788009 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783505 2565 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 16 22:13:46.788009 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783508 2565 flags.go:64] FLAG: --pods-per-core="0"
Apr 16 22:13:46.788593 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783511 2565 flags.go:64] FLAG: --port="10250"
Apr 16 22:13:46.788593 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783514 2565 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 16 22:13:46.788593 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783517 2565 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-09961190f56f0f709"
Apr 16 22:13:46.788593 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783520 2565 flags.go:64] FLAG: --qos-reserved=""
Apr 16 22:13:46.788593 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783523 2565 flags.go:64] FLAG: --read-only-port="10255"
Apr 16 22:13:46.788593 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783526 2565 flags.go:64] FLAG: --register-node="true"
Apr 16 22:13:46.788593 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783531 2565 flags.go:64] FLAG: --register-schedulable="true"
Apr 16 22:13:46.788593 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783534 2565 flags.go:64] FLAG: --register-with-taints=""
Apr 16 22:13:46.788593 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783537 2565 flags.go:64] FLAG: --registry-burst="10"
Apr 16 22:13:46.788593 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783540 2565 flags.go:64] FLAG: --registry-qps="5"
Apr 16 22:13:46.788593 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783543 2565 flags.go:64] FLAG: --reserved-cpus=""
Apr 16 22:13:46.788593 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783546 2565 flags.go:64] FLAG: --reserved-memory=""
Apr 16 22:13:46.788593 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783550 2565 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 16 22:13:46.788593 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783553 2565 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 16 22:13:46.788593 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783556 2565 flags.go:64] FLAG: --rotate-certificates="false"
Apr 16 22:13:46.788593 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783559 2565 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 16 22:13:46.788593 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783562 2565 flags.go:64] FLAG: --runonce="false"
Apr 16 22:13:46.788593 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783565 2565 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 16 22:13:46.788593 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783568 2565 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 16 22:13:46.788593 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783571 2565 flags.go:64] FLAG: --seccomp-default="false"
Apr 16 22:13:46.788593 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783574 2565 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 16 22:13:46.788593 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783577 2565 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 16 22:13:46.788593 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783580 2565 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 16 22:13:46.788593 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783584 2565 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 16 22:13:46.788593 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783586 2565 flags.go:64] FLAG: --storage-driver-password="root"
Apr 16 22:13:46.788593 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783589 2565 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 16 22:13:46.789204 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783595 2565 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 16 22:13:46.789204 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783598 2565 flags.go:64] FLAG: --storage-driver-user="root"
Apr 16 22:13:46.789204 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783601 2565 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 16 22:13:46.789204 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783604 2565 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 16 22:13:46.789204 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783607 2565 flags.go:64] FLAG: --system-cgroups=""
Apr 16 22:13:46.789204 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783610 2565 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 16 22:13:46.789204 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783615 2565 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 16 22:13:46.789204 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783618 2565 flags.go:64] FLAG: --tls-cert-file=""
Apr 16 22:13:46.789204 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783621 2565 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 16 22:13:46.789204 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783625 2565 flags.go:64] FLAG: --tls-min-version=""
Apr 16 22:13:46.789204 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783628 2565 flags.go:64] FLAG: --tls-private-key-file=""
Apr 16 22:13:46.789204 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783631 2565 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 16 22:13:46.789204 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783636 2565 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 16 22:13:46.789204 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783639 2565 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 16 22:13:46.789204 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783642 2565 flags.go:64] FLAG: --v="2"
Apr 16 22:13:46.789204 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783651 2565 flags.go:64] FLAG: --version="false"
Apr 16 22:13:46.789204 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783655 2565 flags.go:64] FLAG: --vmodule=""
Apr 16 22:13:46.789204 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783659 2565 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 16 22:13:46.789204 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.783662 2565 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 16 22:13:46.789204 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783761 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:46.789204 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783765 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:46.789204 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783769 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:46.789204 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783771 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:46.789204 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783774 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:46.789788 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783777 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:46.789788 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783779 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:46.789788 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783782 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:46.789788 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783784 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:46.789788 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783787 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:46.789788 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783789 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:46.789788 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783792 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:46.789788 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783794 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:46.789788 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783798 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:46.789788 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783801 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:46.789788 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783803 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:46.789788 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783806 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:46.789788 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783808 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:46.789788 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783811 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:46.789788 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783813 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:46.789788 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783816 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:46.789788 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783818 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:46.789788 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783821 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:46.789788 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783825 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:13:46.790315 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783829 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:46.790315 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783836 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:46.790315 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783839 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:46.790315 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783842 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:46.790315 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783845 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:46.790315 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783848 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:46.790315 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783850 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:46.790315 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783853 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:46.790315 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783856 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:46.790315 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783858 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:46.790315 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783861 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:46.790315 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783865 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:46.790315 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783868 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:46.790315 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783871 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:46.790315 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783874 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:46.790315 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783876 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:46.790315 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783879 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:46.790315 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783881 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:46.790315 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783884 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:46.790315 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783886 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:46.790862 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783889 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:46.790862 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783892 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:46.790862 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783894 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:46.790862 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783897 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:46.790862 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783900 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:46.790862 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783902 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:46.790862 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783905 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:46.790862 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783907 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:46.790862 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783910 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:46.790862 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783913 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:46.790862 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783915 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:46.790862 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783918 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:46.790862 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.783920 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:46.790862 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.784711 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:46.790862 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.784716 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:46.790862 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.784720 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:46.790862 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.784723 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:46.790862 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.784727 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:46.790862 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.784729 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:46.790862 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.784732 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:46.791548 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.784736 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:46.791548 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.784739 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:46.791548 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.784741 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:46.791548 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.784744 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:46.791548 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.784749 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:46.791548 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.784753 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 22:13:46.791548 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.784755 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 22:13:46.791548 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.784758 2565 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 22:13:46.791548 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.784761 2565 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 22:13:46.791548 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.784763 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 22:13:46.791548 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.784766 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 22:13:46.791548 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.784768 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 22:13:46.791548 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.784771 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 22:13:46.791548 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.784774 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 22:13:46.791548 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.784776 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 22:13:46.791548 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.784779 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 22:13:46.791548 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.784782 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 22:13:46.791548 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.784785 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 22:13:46.791548 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.784787 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 22:13:46.792238 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.784790 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 22:13:46.792238 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.784793 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 22:13:46.792238 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.784795 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 22:13:46.792238 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.784803 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 22:13:46.792238 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.791308 2565 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 22:13:46.792238 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.791352 2565 server.go:532] 
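The warnings above come from the kubelet applying a shared, cluster-wide feature-gate spec that includes OpenShift-only gate names this binary was not compiled with; unknown names are warned about and skipped rather than treated as fatal, and the effective map logged at feature_gate.go:384 carries only the gates the kubelet itself recognizes. The same spec is applied again below, which is why the warning burst repeats. A minimal, stdlib-only Go sketch of that parse-and-warn behavior (not the kubelet's actual feature_gate.go; the known set and message text are illustrative assumptions):

    package main

    import (
        "fmt"
        "strconv"
        "strings"
    )

    // known stands in for the gates this binary was compiled with; the
    // OpenShift-specific names in the log (MachineConfigNodes, NewOLM, ...)
    // are absent, which is exactly why the kubelet warns on them.
    var known = map[string]bool{
        "ImageVolume": true,
        "KMSv1":       true,
        "NodeSwap":    true,
    }

    // parseGates handles the "Name=bool,Name=bool" feature-gate syntax:
    // unknown names produce a warning and are skipped, so one shared spec
    // can carry gates meant for other cluster components.
    func parseGates(spec string) map[string]bool {
        out := map[string]bool{}
        for _, kv := range strings.Split(spec, ",") {
            name, val, ok := strings.Cut(kv, "=")
            if !ok {
                continue
            }
            if !known[name] {
                fmt.Printf("W unrecognized feature gate: %s\n", name)
                continue
            }
            b, err := strconv.ParseBool(val)
            if err != nil {
                fmt.Printf("W invalid value %q for feature gate %s\n", val, name)
                continue
            }
            out[name] = b
        }
        return out
    }

    func main() {
        fmt.Println(parseGates("ImageVolume=true,KMSv1=true,MachineConfigNodes=true"))
    }

Running this prints one warning for MachineConfigNodes and a map of the two recognized gates, mirroring the skip-don't-fail behavior visible in the entries above and below.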
"Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 22:13:46.792238 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791534 2565 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 22:13:46.792238 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791544 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 22:13:46.792238 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791550 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 22:13:46.792238 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791558 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 22:13:46.792238 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791565 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 22:13:46.792238 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791569 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 22:13:46.792238 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791580 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 22:13:46.792238 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791585 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 22:13:46.792238 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791589 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 22:13:46.792709 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791593 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 22:13:46.792709 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791597 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 22:13:46.792709 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791601 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 22:13:46.792709 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791606 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 22:13:46.792709 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791610 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 22:13:46.792709 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791614 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 22:13:46.792709 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791618 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 22:13:46.792709 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791622 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 22:13:46.792709 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791626 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 22:13:46.792709 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791634 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 22:13:46.792709 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791644 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 22:13:46.792709 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791648 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 22:13:46.792709 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791652 2565 feature_gate.go:328] unrecognized feature gate: 
BootcNodeManagement Apr 16 22:13:46.792709 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791656 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 22:13:46.792709 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791660 2565 feature_gate.go:328] unrecognized feature gate: Example Apr 16 22:13:46.792709 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791664 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 22:13:46.792709 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791668 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 22:13:46.792709 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791672 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 22:13:46.792709 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791677 2565 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 22:13:46.792709 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791681 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 22:13:46.793237 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791685 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 22:13:46.793237 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791689 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 22:13:46.793237 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791698 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 22:13:46.793237 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791703 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 22:13:46.793237 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791708 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 22:13:46.793237 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791712 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 22:13:46.793237 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791716 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 22:13:46.793237 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791721 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 22:13:46.793237 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791725 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 22:13:46.793237 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791729 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 22:13:46.793237 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791733 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 22:13:46.793237 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791737 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 22:13:46.793237 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791741 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 22:13:46.793237 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791745 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 22:13:46.793237 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791749 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 22:13:46.793237 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791758 
2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 22:13:46.793237 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791763 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 22:13:46.793237 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791767 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 22:13:46.793237 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791771 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 22:13:46.793780 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791775 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 22:13:46.793780 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791782 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 22:13:46.793780 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791788 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 22:13:46.793780 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791793 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 22:13:46.793780 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791798 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 22:13:46.793780 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791802 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 22:13:46.793780 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791807 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 22:13:46.793780 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791811 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 22:13:46.793780 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791815 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 22:13:46.793780 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791824 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 22:13:46.793780 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791828 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 22:13:46.793780 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791832 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 22:13:46.793780 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791837 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 22:13:46.793780 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791841 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 22:13:46.793780 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791845 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 22:13:46.793780 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791850 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 22:13:46.793780 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791855 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 22:13:46.793780 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791865 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 22:13:46.793780 ip-10-0-141-169 kubenswrapper[2565]: W0416 
22:13:46.791869 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 22:13:46.794304 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791873 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 22:13:46.794304 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791878 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 22:13:46.794304 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791887 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 22:13:46.794304 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791891 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 22:13:46.794304 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791895 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 22:13:46.794304 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791903 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 22:13:46.794304 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791907 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 22:13:46.794304 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791911 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 22:13:46.794304 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791915 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 22:13:46.794304 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791919 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 22:13:46.794304 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791924 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 22:13:46.794304 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791928 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 22:13:46.794304 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791932 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 22:13:46.794304 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791937 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 22:13:46.794304 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791946 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 22:13:46.794304 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791950 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 22:13:46.794304 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791954 2565 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 22:13:46.794304 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791958 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 22:13:46.794304 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.791962 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 22:13:46.794789 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.791970 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true 
Apr 16 22:13:46.794789 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792531 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:46.794789 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792543 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:46.794789 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792547 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:46.794789 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792550 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:46.794789 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792553 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:46.794789 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792556 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:46.794789 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792559 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:46.794789 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792562 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:46.794789 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792565 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:46.794789 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792568 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:46.794789 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792570 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:46.794789 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792573 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:46.794789 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792576 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:46.794789 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792578 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:46.794789 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792581 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:46.795210 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792584 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:46.795210 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792589 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:46.795210 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792594 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:46.795210 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792598 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:46.795210 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792601 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:46.795210 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792604 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:46.795210 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792606 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:46.795210 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792609 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:46.795210 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792612 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:46.795210 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792614 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:46.795210 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792617 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:46.795210 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792619 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:46.795210 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792622 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:46.795210 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792626 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:13:46.795210 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792630 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:46.795210 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792632 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:46.795210 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792635 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:46.795210 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792639 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:46.795210 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792642 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:46.795693 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792645 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:46.795693 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792647 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:46.795693 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792650 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:46.795693 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792652 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:46.795693 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792655 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:46.795693 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792657 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:46.795693 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792660 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:46.795693 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792663 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:46.795693 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792665 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:46.795693 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792669 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:46.795693 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792672 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:46.795693 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792674 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:46.795693 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792677 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:46.795693 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792679 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:46.795693 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792682 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:46.795693 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792684 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:46.795693 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792687 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:46.795693 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792689 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:46.795693 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792692 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:46.796187 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792695 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:46.796187 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792697 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:46.796187 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792700 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:46.796187 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792703 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:46.796187 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792705 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:46.796187 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792708 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:46.796187 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792710 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:46.796187 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792713 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:46.796187 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792716 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:46.796187 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792719 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:46.796187 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792722 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:46.796187 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792725 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:46.796187 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792728 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:46.796187 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792731 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:46.796187 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792734 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:46.796187 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792736 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:46.796187 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792739 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:46.796187 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792742 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:46.796187 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792745 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:46.796187 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792748 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:46.796187 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792750 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:46.796810 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792752 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:46.796810 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792755 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:46.796810 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792758 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:46.796810 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792761 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:46.796810 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792763 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:46.796810 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792766 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:46.796810 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792768 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:46.796810 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792771 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:46.796810 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792773 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:46.796810 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792776 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:46.796810 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792778 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:46.796810 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.792782 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:46.796810 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.792788 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 22:13:46.796810 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.793658 2565 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 22:13:46.797257 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.797243 2565 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 22:13:46.798265 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.798253 2565 server.go:1019] "Starting client certificate rotation"
Apr 16 22:13:46.798367 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.798352 2565 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 22:13:46.798403 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.798393 2565 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 22:13:46.823219 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.823190 2565 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
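The gate spec is evidently applied three times in quick succession (note the three identical feature_gate.go:384 summaries above), so every unknown gate is warned about three times. When triaging a dump like this it helps to collapse the repetition first; a small illustrative Go filter along these lines (any log tool works equally well) counts each distinct gate once:

    package main

    import (
        "bufio"
        "fmt"
        "os"
        "regexp"
        "sort"
    )

    // Reads a journal dump on stdin (e.g. journalctl -u kubelet | go run main.go)
    // and prints one line per distinct unrecognized gate with its repeat count.
    func main() {
        re := regexp.MustCompile(`unrecognized feature gate: (\S+)`)
        counts := map[string]int{}
        sc := bufio.NewScanner(os.Stdin)
        sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be long
        for sc.Scan() {
            for _, m := range re.FindAllStringSubmatch(sc.Text(), -1) {
                counts[m[1]]++
            }
        }
        names := make([]string, 0, len(counts))
        for n := range counts {
            names = append(names, n)
        }
        sort.Strings(names)
        for _, n := range names {
            fmt.Printf("%3d %s\n", counts[n], n)
        }
    }

With the spam collapsed, the startup sequence that follows (CSR bootstrap, CRI validation, container manager setup) is much easier to read.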
new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 22:13:46.827558 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.827535 2565 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 22:13:46.840424 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.840390 2565 log.go:25] "Validated CRI v1 runtime API" Apr 16 22:13:46.846145 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.846128 2565 log.go:25] "Validated CRI v1 image API" Apr 16 22:13:46.849256 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.849237 2565 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 22:13:46.853967 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.853940 2565 fs.go:135] Filesystem UUIDs: map[1318d6e4-e0e5-45f2-aff0-7ab4bb12722c:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 7bbdf20f-ce3b-4496-b15f-7d6e1eebc17d:/dev/nvme0n1p3] Apr 16 22:13:46.854063 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.853963 2565 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 22:13:46.855397 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.855376 2565 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 22:13:46.861230 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.861106 2565 manager.go:217] Machine: {Timestamp:2026-04-16 22:13:46.858847835 +0000 UTC m=+0.404589254 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099577 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec23a82143c89d1e8232606465934b7d SystemUUID:ec23a821-43c8-9d1e-8232-606465934b7d BootID:5af1906e-6954-453a-bb2f-8459e00529a6 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:24:6f:57:af:75 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:24:6f:57:af:75 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:7e:18:e3:26:0d:45 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 
Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 22:13:46.861230 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.861216 2565 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 16 22:13:46.861390 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.861339 2565 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 22:13:46.862458 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.862430 2565 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 22:13:46.862627 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.862460 2565 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-141-169.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 22:13:46.862713 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.862641 2565 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 22:13:46.862713 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.862652 2565 
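The nodeConfig entry above shows how the hard-eviction settings arrive as structured thresholds: memory.available as an absolute 100Mi quantity, and the nodefs/imagefs signals as fractions (0.1, 0.05, 0.15). A hedged Go sketch of parsing the flag syntax behind such entries (the Threshold fields and parseHardEviction helper are illustrative, not the kubelet's own types):

    package main

    import (
        "fmt"
        "strconv"
        "strings"
    )

    // Threshold mirrors the shape of the HardEvictionThresholds entries in
    // the nodeConfig above: a signal plus either an absolute quantity or a
    // percentage expressed as a fraction.
    type Threshold struct {
        Signal     string
        Quantity   string  // e.g. "100Mi" (left unparsed in this sketch)
        Percentage float64 // e.g. 0.10 for "10%"
    }

    // parseHardEviction handles the comma-separated "signal<value" syntax
    // used by flags such as --eviction-hard.
    func parseHardEviction(spec string) ([]Threshold, error) {
        var out []Threshold
        for _, part := range strings.Split(spec, ",") {
            sig, val, ok := strings.Cut(part, "<")
            if !ok {
                return nil, fmt.Errorf("missing operator in %q", part)
            }
            t := Threshold{Signal: sig}
            if strings.HasSuffix(val, "%") {
                p, err := strconv.ParseFloat(strings.TrimSuffix(val, "%"), 64)
                if err != nil {
                    return nil, err
                }
                t.Percentage = p / 100
            } else {
                t.Quantity = val
            }
            out = append(out, t)
        }
        return out, nil
    }

    func main() {
        ts, err := parseHardEviction("memory.available<100Mi,nodefs.available<10%,nodefs.inodesFree<5%")
        fmt.Println(ts, err)
    }

Feeding it the values visible in this log reproduces the 100Mi / 0.1 / 0.05 figures in the HardEvictionThresholds array.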
container_manager_linux.go:306] "Creating device plugin manager" Apr 16 22:13:46.862713 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.862672 2565 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 22:13:46.863504 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.863493 2565 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 22:13:46.864845 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.864832 2565 state_mem.go:36] "Initialized new in-memory state store" Apr 16 22:13:46.864975 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.864964 2565 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 22:13:46.868218 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.868206 2565 kubelet.go:491] "Attempting to sync node with API server" Apr 16 22:13:46.868281 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.868226 2565 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 22:13:46.868281 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.868247 2565 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 22:13:46.868281 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.868262 2565 kubelet.go:397] "Adding apiserver pod source" Apr 16 22:13:46.868281 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.868275 2565 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 22:13:46.869440 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.869427 2565 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 22:13:46.869516 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.869451 2565 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 22:13:46.872575 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.872561 2565 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 22:13:46.874114 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.874098 2565 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 22:13:46.876071 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.876057 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 22:13:46.876159 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.876077 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 22:13:46.876159 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.876087 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 22:13:46.876159 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.876096 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 22:13:46.876159 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.876106 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 22:13:46.876159 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.876114 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 22:13:46.876159 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.876123 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 22:13:46.876159 ip-10-0-141-169 kubenswrapper[2565]: I0416 
22:13:46.876132 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 22:13:46.876159 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.876142 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 22:13:46.876159 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.876151 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 22:13:46.876486 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.876165 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 22:13:46.876486 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.876178 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 22:13:46.876486 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.876343 2565 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7856t" Apr 16 22:13:46.877111 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.877100 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 22:13:46.877161 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.877115 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 22:13:46.880832 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.880816 2565 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 22:13:46.880928 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.880859 2565 server.go:1295] "Started kubelet" Apr 16 22:13:46.880977 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.880941 2565 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 22:13:46.881094 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.881044 2565 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 22:13:46.881193 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.881178 2565 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 22:13:46.881797 ip-10-0-141-169 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 22:13:46.882323 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.882308 2565 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 22:13:46.882423 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:46.882386 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-141-169.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 22:13:46.882496 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.882485 2565 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 22:13:46.882644 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.882492 2565 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-141-169.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 22:13:46.882644 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:46.882498 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 22:13:46.884539 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.884521 2565 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7856t"
Apr 16 22:13:46.888455 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:46.887502 2565 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-169.ec2.internal.18a6f60a3d2ee6c6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-169.ec2.internal,UID:ip-10-0-141-169.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-141-169.ec2.internal,},FirstTimestamp:2026-04-16 22:13:46.880829126 +0000 UTC m=+0.426570572,LastTimestamp:2026-04-16 22:13:46.880829126 +0000 UTC m=+0.426570572,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-169.ec2.internal,}"
Apr 16 22:13:46.888573 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.888516 2565 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 22:13:46.888941 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.888928 2565 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 22:13:46.889738 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.889719 2565 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 22:13:46.889825 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.889779 2565 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 22:13:46.889825 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.889792 2565 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 22:13:46.889931 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.889903 2565 reconstruct.go:97] "Volume reconstruction finished"
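csr-7856t is approved (csr.go:274 above) and then issued (csr.go:270), which is the node-bootstrap handshake completing; the system:anonymous "forbidden" errors in between appear to come from requests made before the issued client certificate is loaded. To watch this from the cluster side, one might list CSRs with client-go along these lines (a sketch, assuming k8s.io/client-go and k8s.io/apimachinery in go.mod; the kubeconfig path is purely a placeholder, not taken from this log):

    package main

    import (
        "context"
        "fmt"
        "log"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        // Placeholder kubeconfig path for illustration only.
        cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/admin.kubeconfig")
        if err != nil {
            log.Fatal(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            log.Fatal(err)
        }
        // certificates.k8s.io/v1 is the API group behind the csr.go messages.
        csrs, err := cs.CertificatesV1().CertificateSigningRequests().List(context.Background(), metav1.ListOptions{})
        if err != nil {
            log.Fatal(err)
        }
        for _, csr := range csrs.Items {
            fmt.Printf("%s requestor=%s conditions=%d\n", csr.Name, csr.Spec.Username, len(csr.Status.Conditions))
        }
    }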
2565 reconstruct.go:97] "Volume reconstruction finished" Apr 16 22:13:46.889931 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.889910 2565 reconciler.go:26] "Reconciler: start to sync state" Apr 16 22:13:46.890652 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:46.890625 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-169.ec2.internal\" not found" Apr 16 22:13:46.892342 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.892321 2565 factory.go:55] Registering systemd factory Apr 16 22:13:46.892457 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.892389 2565 factory.go:223] Registration of the systemd container factory successfully Apr 16 22:13:46.892709 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.892695 2565 factory.go:153] Registering CRI-O factory Apr 16 22:13:46.892792 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.892712 2565 factory.go:223] Registration of the crio container factory successfully Apr 16 22:13:46.892792 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.892766 2565 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 22:13:46.892792 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.892791 2565 factory.go:103] Registering Raw factory Apr 16 22:13:46.892944 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.892807 2565 manager.go:1196] Started watching for new ooms in manager Apr 16 22:13:46.893506 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.893487 2565 manager.go:319] Starting recovery of all containers Apr 16 22:13:46.893933 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:46.893907 2565 kubelet.go:1618] "Image garbage collection failed once. 
Apr 16 22:13:46.894065 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.894045 2565 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:13:46.897807 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:46.897764 2565 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-141-169.ec2.internal\" not found" node="ip-10-0-141-169.ec2.internal"
Apr 16 22:13:46.903678 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:46.903505 2565 helpers.go:245] readString: Failed to read "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service/memory.max": read /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service/memory.max: no such device
Apr 16 22:13:46.905428 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.905400 2565 manager.go:324] Recovery completed
Apr 16 22:13:46.909624 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.909611 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 22:13:46.911943 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.911928 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-169.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 22:13:46.912002 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.911956 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-169.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 22:13:46.912002 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.911966 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-169.ec2.internal" event="NodeHasSufficientPID"
Apr 16 22:13:46.912421 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.912395 2565 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 22:13:46.912503 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.912425 2565 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 22:13:46.912503 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.912445 2565 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 22:13:46.914392 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.914380 2565 policy_none.go:49] "None policy: Start"
Apr 16 22:13:46.914448 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.914402 2565 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 22:13:46.914448 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.914444 2565 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 22:13:46.963048 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.963021 2565 manager.go:341] "Starting Device Plugin manager"
Apr 16 22:13:46.963200 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:46.963058 2565 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 22:13:46.963200 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.963068 2565 server.go:85] "Starting device plugin registration server"
Apr 16 22:13:46.963391 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.963377 2565 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 22:13:46.963473 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.963390 2565 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 22:13:46.963473 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.963462 2565 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 22:13:46.963548 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.963537 2565 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 22:13:46.963600 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:46.963547 2565 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 22:13:46.964146 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:46.964123 2565 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 22:13:46.964239 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:46.964162 2565 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-169.ec2.internal\" not found"
Apr 16 22:13:47.011990 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.011951 2565 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 22:13:47.013321 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.013303 2565 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 22:13:47.013394 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.013334 2565 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 22:13:47.013394 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.013354 2565 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 22:13:47.013394 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.013360 2565 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 22:13:47.013551 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:47.013395 2565 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 22:13:47.015793 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.015774 2565 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:13:47.063712 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.063638 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 22:13:47.065729 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.065704 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-169.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 22:13:47.065813 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.065737 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-169.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 22:13:47.065813 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.065754 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-169.ec2.internal" event="NodeHasSufficientPID"
Apr 16 22:13:47.065813 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.065777 2565 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-169.ec2.internal"
Apr 16 22:13:47.073604 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.073587 2565 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-169.ec2.internal"
Apr 16 22:13:47.073651 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:47.073620 2565 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-141-169.ec2.internal\": node \"ip-10-0-141-169.ec2.internal\" not found"
\"ip-10-0-141-169.ec2.internal\": node \"ip-10-0-141-169.ec2.internal\" not found" Apr 16 22:13:47.093196 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:47.093173 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-169.ec2.internal\" not found" Apr 16 22:13:47.113832 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.113803 2565 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-169.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-141-169.ec2.internal"] Apr 16 22:13:47.113897 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.113881 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:13:47.115760 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.115735 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-169.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:13:47.115887 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.115772 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-169.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:13:47.115887 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.115784 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-169.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:13:47.117180 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.117168 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:13:47.117345 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.117332 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-169.ec2.internal" Apr 16 22:13:47.117385 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.117362 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:13:47.118390 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.118373 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-169.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:13:47.118390 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.118381 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-169.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:13:47.118562 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.118403 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-169.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:13:47.118562 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.118426 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-169.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:13:47.118562 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.118434 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-169.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:13:47.118562 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.118438 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-169.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:13:47.119691 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.119675 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-169.ec2.internal" Apr 16 22:13:47.119775 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.119703 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:13:47.120678 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.120661 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-169.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:13:47.120760 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.120689 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-169.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:13:47.120760 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.120699 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-169.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:13:47.142187 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:47.142158 2565 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-169.ec2.internal\" not found" node="ip-10-0-141-169.ec2.internal" Apr 16 22:13:47.145651 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:47.145629 2565 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-169.ec2.internal\" not found" node="ip-10-0-141-169.ec2.internal" Apr 16 22:13:47.192262 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.192232 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a0fd34b83b170bf49f4fc19c252e8bbc-config\") pod \"kube-apiserver-proxy-ip-10-0-141-169.ec2.internal\" (UID: \"a0fd34b83b170bf49f4fc19c252e8bbc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-169.ec2.internal" Apr 16 22:13:47.192262 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.192265 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/973b90e1e6a733f4365ef807b1bee69b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-169.ec2.internal\" (UID: \"973b90e1e6a733f4365ef807b1bee69b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-169.ec2.internal" Apr 16 22:13:47.192395 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.192283 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/973b90e1e6a733f4365ef807b1bee69b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-169.ec2.internal\" (UID: \"973b90e1e6a733f4365ef807b1bee69b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-169.ec2.internal" Apr 16 22:13:47.193264 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:47.193247 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-169.ec2.internal\" not found" Apr 16 22:13:47.292634 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.292602 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/973b90e1e6a733f4365ef807b1bee69b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-169.ec2.internal\" (UID: \"973b90e1e6a733f4365ef807b1bee69b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-169.ec2.internal" Apr 
16 22:13:47.292634 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.292636 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/973b90e1e6a733f4365ef807b1bee69b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-169.ec2.internal\" (UID: \"973b90e1e6a733f4365ef807b1bee69b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-169.ec2.internal" Apr 16 22:13:47.292778 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.292659 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a0fd34b83b170bf49f4fc19c252e8bbc-config\") pod \"kube-apiserver-proxy-ip-10-0-141-169.ec2.internal\" (UID: \"a0fd34b83b170bf49f4fc19c252e8bbc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-169.ec2.internal" Apr 16 22:13:47.292778 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.292691 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/973b90e1e6a733f4365ef807b1bee69b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-169.ec2.internal\" (UID: \"973b90e1e6a733f4365ef807b1bee69b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-169.ec2.internal" Apr 16 22:13:47.292778 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.292691 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a0fd34b83b170bf49f4fc19c252e8bbc-config\") pod \"kube-apiserver-proxy-ip-10-0-141-169.ec2.internal\" (UID: \"a0fd34b83b170bf49f4fc19c252e8bbc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-169.ec2.internal" Apr 16 22:13:47.292778 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.292719 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/973b90e1e6a733f4365ef807b1bee69b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-169.ec2.internal\" (UID: \"973b90e1e6a733f4365ef807b1bee69b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-169.ec2.internal" Apr 16 22:13:47.293611 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:47.293594 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-169.ec2.internal\" not found" Apr 16 22:13:47.394370 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:47.394289 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-169.ec2.internal\" not found" Apr 16 22:13:47.444511 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.444476 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-169.ec2.internal" Apr 16 22:13:47.448261 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.448236 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-169.ec2.internal" Apr 16 22:13:47.495104 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:47.495077 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-169.ec2.internal\" not found" Apr 16 22:13:47.595586 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:47.595567 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-169.ec2.internal\" not found" Apr 16 22:13:47.696041 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:47.695983 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-169.ec2.internal\" not found" Apr 16 22:13:47.742284 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.742255 2565 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:47.796768 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:47.796737 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-169.ec2.internal\" not found" Apr 16 22:13:47.798886 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.798868 2565 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 22:13:47.799008 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.798992 2565 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 22:13:47.799064 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.799030 2565 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 22:13:47.799110 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.799034 2565 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 22:13:47.886188 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.886149 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 22:08:46 +0000 UTC" deadline="2027-11-13 15:56:56.018351452 +0000 UTC" Apr 16 22:13:47.886188 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.886181 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13817h43m8.132173805s" Apr 16 22:13:47.889315 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.889294 2565 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 22:13:47.897857 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:47.897831 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-169.ec2.internal\" not found" Apr 16 22:13:47.904336 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.904312 2565 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" 
type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 22:13:47.943351 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:47.943320 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0fd34b83b170bf49f4fc19c252e8bbc.slice/crio-995135b954a446e83afcd9327aef71f1fff3656fa65895a0b89d7dd0997fea3b WatchSource:0}: Error finding container 995135b954a446e83afcd9327aef71f1fff3656fa65895a0b89d7dd0997fea3b: Status 404 returned error can't find the container with id 995135b954a446e83afcd9327aef71f1fff3656fa65895a0b89d7dd0997fea3b Apr 16 22:13:47.943650 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.943632 2565 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-tpxst" Apr 16 22:13:47.947404 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:47.947350 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod973b90e1e6a733f4365ef807b1bee69b.slice/crio-02c663a0987612504dbc3ac19ae29403dca63017b1ff1a69b604ad408596415b WatchSource:0}: Error finding container 02c663a0987612504dbc3ac19ae29403dca63017b1ff1a69b604ad408596415b: Status 404 returned error can't find the container with id 02c663a0987612504dbc3ac19ae29403dca63017b1ff1a69b604ad408596415b Apr 16 22:13:47.948008 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.947989 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:13:47.950914 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.950835 2565 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-tpxst" Apr 16 22:13:47.995024 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:47.994988 2565 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:48.016286 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.016223 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-169.ec2.internal" event={"ID":"973b90e1e6a733f4365ef807b1bee69b","Type":"ContainerStarted","Data":"02c663a0987612504dbc3ac19ae29403dca63017b1ff1a69b604ad408596415b"} Apr 16 22:13:48.017158 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.017123 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-169.ec2.internal" event={"ID":"a0fd34b83b170bf49f4fc19c252e8bbc","Type":"ContainerStarted","Data":"995135b954a446e83afcd9327aef71f1fff3656fa65895a0b89d7dd0997fea3b"} Apr 16 22:13:48.089363 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.089323 2565 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-169.ec2.internal" Apr 16 22:13:48.100181 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.100160 2565 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 22:13:48.101126 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.101111 2565 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-169.ec2.internal" Apr 16 22:13:48.109382 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.109365 2565 warnings.go:110] "Warning: 
metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 22:13:48.870097 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.870066 2565 apiserver.go:52] "Watching apiserver" Apr 16 22:13:48.878088 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.878061 2565 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 22:13:48.880295 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.880231 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-bdcch","openshift-ovn-kubernetes/ovnkube-node-tp6wl","kube-system/konnectivity-agent-dt42x","kube-system/kube-apiserver-proxy-ip-10-0-141-169.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zxkh8","openshift-cluster-node-tuning-operator/tuned-dg4lb","openshift-image-registry/node-ca-wpfh2","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-169.ec2.internal","openshift-multus/multus-additional-cni-plugins-8xzc6","openshift-network-operator/iptables-alerter-n5bvs","openshift-dns/node-resolver-tsbd9","openshift-multus/multus-sgsmj","openshift-multus/network-metrics-daemon-qz5vc"] Apr 16 22:13:48.880825 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.880803 2565 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:48.882724 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.882562 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-n5bvs" Apr 16 22:13:48.885036 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.885003 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl" Apr 16 22:13:48.885036 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.885049 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-dt42x" Apr 16 22:13:48.886220 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.886197 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zxkh8" Apr 16 22:13:48.886930 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.886911 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 22:13:48.887028 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.886911 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:13:48.887112 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.887095 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-9sh2z\"" Apr 16 22:13:48.887520 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.887503 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-dg4lb" Apr 16 22:13:48.888442 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.888426 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 22:13:48.888540 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.888521 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 22:13:48.888775 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.888745 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 22:13:48.889135 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.889118 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 22:13:48.889207 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.889177 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 22:13:48.889672 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.889619 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 22:13:48.889772 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.889693 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 22:13:48.889834 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.889814 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 22:13:48.889891 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.889859 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 22:13:48.889937 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.889628 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 22:13:48.890563 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.890062 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 22:13:48.890563 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.890210 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-wpfh2" Apr 16 22:13:48.890563 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.890374 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8xzc6" Apr 16 22:13:48.891237 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.891207 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-2mrwm\"" Apr 16 22:13:48.891320 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.891283 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-k5xc6\"" Apr 16 22:13:48.891910 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.891887 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-pppvr\"" Apr 16 22:13:48.892029 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.892009 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 22:13:48.895965 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.892764 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 22:13:48.895965 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.893263 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-tqlck\"" Apr 16 22:13:48.895965 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.893935 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 22:13:48.895965 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.893986 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-fqgp6\"" Apr 16 22:13:48.895965 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.894103 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:13:48.895965 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.894236 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 22:13:48.895965 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.895515 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 22:13:48.895965 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.895555 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 22:13:48.895965 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.895579 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 22:13:48.895965 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.895770 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 22:13:48.896519 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.896389 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-tsbd9" Apr 16 22:13:48.896573 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.896539 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-vpmb5\"" Apr 16 22:13:48.896573 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.896565 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 22:13:48.896909 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.896892 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bdcch" Apr 16 22:13:48.897007 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.896975 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 22:13:48.897084 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:48.897065 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bdcch" podUID="0b5f6846-8363-4956-b563-df34509912b0" Apr 16 22:13:48.899021 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.899003 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-sgsmj" Apr 16 22:13:48.900271 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.900253 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-czb2r\"" Apr 16 22:13:48.900649 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.900628 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qz5vc" Apr 16 22:13:48.900731 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:48.900698 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qz5vc" podUID="e06e94b1-2063-48f2-b8a7-0d0e4193f064" Apr 16 22:13:48.900980 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.900963 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 22:13:48.901484 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.901466 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 22:13:48.901570 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.901533 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 22:13:48.901682 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.901664 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-systemd-units\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl" Apr 16 22:13:48.901773 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.901757 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-lff65\"" Apr 16 22:13:48.901857 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.901841 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-run-openvswitch\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl" Apr 16 22:13:48.901964 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.901950 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/db688c76-0e69-4f78-bb5a-3e477f2575d3-device-dir\") pod \"aws-ebs-csi-driver-node-zxkh8\" (UID: \"db688c76-0e69-4f78-bb5a-3e477f2575d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zxkh8" Apr 16 22:13:48.902193 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.902170 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3a701b0d-e8cf-4242-a773-67ba88c764b5-host-slash\") pod \"iptables-alerter-n5bvs\" (UID: \"3a701b0d-e8cf-4242-a773-67ba88c764b5\") " pod="openshift-network-operator/iptables-alerter-n5bvs" Apr 16 22:13:48.902300 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.902205 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e1bdc782-5278-4724-89af-c7bf4325aea4-host\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb" Apr 16 22:13:48.902300 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.902263 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl" Apr 16 22:13:48.902300 
ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.902288 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/122992fa-4992-414a-8573-d77e9afd6b29-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8xzc6\" (UID: \"122992fa-4992-414a-8573-d77e9afd6b29\") " pod="openshift-multus/multus-additional-cni-plugins-8xzc6" Apr 16 22:13:48.902480 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.902326 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e1bdc782-5278-4724-89af-c7bf4325aea4-etc-tuned\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb" Apr 16 22:13:48.902480 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.902360 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/71427d26-ce68-4714-9505-292c288a5fdf-env-overrides\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl" Apr 16 22:13:48.902480 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.902389 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/71427d26-ce68-4714-9505-292c288a5fdf-ovn-node-metrics-cert\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl" Apr 16 22:13:48.902480 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.902431 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/122992fa-4992-414a-8573-d77e9afd6b29-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8xzc6\" (UID: \"122992fa-4992-414a-8573-d77e9afd6b29\") " pod="openshift-multus/multus-additional-cni-plugins-8xzc6" Apr 16 22:13:48.902480 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.902459 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3a701b0d-e8cf-4242-a773-67ba88c764b5-iptables-alerter-script\") pod \"iptables-alerter-n5bvs\" (UID: \"3a701b0d-e8cf-4242-a773-67ba88c764b5\") " pod="openshift-network-operator/iptables-alerter-n5bvs" Apr 16 22:13:48.902723 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.902482 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e1bdc782-5278-4724-89af-c7bf4325aea4-etc-sysconfig\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb" Apr 16 22:13:48.902723 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.902504 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e1bdc782-5278-4724-89af-c7bf4325aea4-tmp\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb" Apr 16 22:13:48.902723 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.902538 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-host-run-netns\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl" Apr 16 22:13:48.902723 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.902575 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-etc-openvswitch\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl" Apr 16 22:13:48.902723 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.902612 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-run-systemd\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl" Apr 16 22:13:48.902723 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.902649 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/db688c76-0e69-4f78-bb5a-3e477f2575d3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zxkh8\" (UID: \"db688c76-0e69-4f78-bb5a-3e477f2575d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zxkh8" Apr 16 22:13:48.902723 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.902677 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/db688c76-0e69-4f78-bb5a-3e477f2575d3-etc-selinux\") pod \"aws-ebs-csi-driver-node-zxkh8\" (UID: \"db688c76-0e69-4f78-bb5a-3e477f2575d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zxkh8" Apr 16 22:13:48.902723 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.902710 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e1bdc782-5278-4724-89af-c7bf4325aea4-etc-systemd\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb" Apr 16 22:13:48.903088 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.902733 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e1bdc782-5278-4724-89af-c7bf4325aea4-lib-modules\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb" Apr 16 22:13:48.903088 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.902767 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmjxb\" (UniqueName: \"kubernetes.io/projected/71427d26-ce68-4714-9505-292c288a5fdf-kube-api-access-nmjxb\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl" Apr 16 22:13:48.903088 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.902809 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/122992fa-4992-414a-8573-d77e9afd6b29-system-cni-dir\") pod \"multus-additional-cni-plugins-8xzc6\" (UID: \"122992fa-4992-414a-8573-d77e9afd6b29\") " pod="openshift-multus/multus-additional-cni-plugins-8xzc6" Apr 16 22:13:48.903088 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.902833 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/122992fa-4992-414a-8573-d77e9afd6b29-os-release\") pod \"multus-additional-cni-plugins-8xzc6\" (UID: \"122992fa-4992-414a-8573-d77e9afd6b29\") " pod="openshift-multus/multus-additional-cni-plugins-8xzc6" Apr 16 22:13:48.903088 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.902858 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2327dcdf-f40d-43bf-905a-1404d6e339f7-host\") pod \"node-ca-wpfh2\" (UID: \"2327dcdf-f40d-43bf-905a-1404d6e339f7\") " pod="openshift-image-registry/node-ca-wpfh2" Apr 16 22:13:48.903088 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.902882 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-host-slash\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl" Apr 16 22:13:48.903088 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.902905 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-host-cni-bin\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl" Apr 16 22:13:48.903088 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.902928 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-node-log\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl" Apr 16 22:13:48.903088 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.902951 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/122992fa-4992-414a-8573-d77e9afd6b29-cni-binary-copy\") pod \"multus-additional-cni-plugins-8xzc6\" (UID: \"122992fa-4992-414a-8573-d77e9afd6b29\") " pod="openshift-multus/multus-additional-cni-plugins-8xzc6" Apr 16 22:13:48.903088 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.902979 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/122992fa-4992-414a-8573-d77e9afd6b29-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8xzc6\" (UID: \"122992fa-4992-414a-8573-d77e9afd6b29\") " pod="openshift-multus/multus-additional-cni-plugins-8xzc6" Apr 16 22:13:48.903088 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.903004 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e1bdc782-5278-4724-89af-c7bf4325aea4-var-lib-kubelet\") pod \"tuned-dg4lb\" (UID: 
\"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb" Apr 16 22:13:48.903088 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.903029 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-host-run-ovn-kubernetes\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl" Apr 16 22:13:48.903088 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.903053 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e1bdc782-5278-4724-89af-c7bf4325aea4-etc-sysctl-conf\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb" Apr 16 22:13:48.903828 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.903105 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgzg2\" (UniqueName: \"kubernetes.io/projected/e1bdc782-5278-4724-89af-c7bf4325aea4-kube-api-access-pgzg2\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb" Apr 16 22:13:48.903828 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.903141 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2327dcdf-f40d-43bf-905a-1404d6e339f7-serviceca\") pod \"node-ca-wpfh2\" (UID: \"2327dcdf-f40d-43bf-905a-1404d6e339f7\") " pod="openshift-image-registry/node-ca-wpfh2" Apr 16 22:13:48.903828 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.903168 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-log-socket\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl" Apr 16 22:13:48.903828 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.903202 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/db66a2b0-751f-4d43-8765-6ecfb4ef22eb-agent-certs\") pod \"konnectivity-agent-dt42x\" (UID: \"db66a2b0-751f-4d43-8765-6ecfb4ef22eb\") " pod="kube-system/konnectivity-agent-dt42x" Apr 16 22:13:48.903828 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.903226 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/db66a2b0-751f-4d43-8765-6ecfb4ef22eb-konnectivity-ca\") pod \"konnectivity-agent-dt42x\" (UID: \"db66a2b0-751f-4d43-8765-6ecfb4ef22eb\") " pod="kube-system/konnectivity-agent-dt42x" Apr 16 22:13:48.903828 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.903259 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/122992fa-4992-414a-8573-d77e9afd6b29-cnibin\") pod \"multus-additional-cni-plugins-8xzc6\" (UID: \"122992fa-4992-414a-8573-d77e9afd6b29\") " pod="openshift-multus/multus-additional-cni-plugins-8xzc6" Apr 16 22:13:48.903828 ip-10-0-141-169 kubenswrapper[2565]: I0416 
22:13:48.903291 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5z6k\" (UniqueName: \"kubernetes.io/projected/0b5f6846-8363-4956-b563-df34509912b0-kube-api-access-n5z6k\") pod \"network-check-target-bdcch\" (UID: \"0b5f6846-8363-4956-b563-df34509912b0\") " pod="openshift-network-diagnostics/network-check-target-bdcch"
Apr 16 22:13:48.903828 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.903331 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e1bdc782-5278-4724-89af-c7bf4325aea4-run\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb"
Apr 16 22:13:48.903828 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.903361 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e1bdc782-5278-4724-89af-c7bf4325aea4-sys\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb"
Apr 16 22:13:48.903828 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.903387 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/db688c76-0e69-4f78-bb5a-3e477f2575d3-socket-dir\") pod \"aws-ebs-csi-driver-node-zxkh8\" (UID: \"db688c76-0e69-4f78-bb5a-3e477f2575d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zxkh8"
Apr 16 22:13:48.903828 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.903429 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e1bdc782-5278-4724-89af-c7bf4325aea4-etc-sysctl-d\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb"
Apr 16 22:13:48.903828 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.903455 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-host-kubelet\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:48.903828 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.903483 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-var-lib-openvswitch\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:48.903828 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.903509 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/db688c76-0e69-4f78-bb5a-3e477f2575d3-registration-dir\") pod \"aws-ebs-csi-driver-node-zxkh8\" (UID: \"db688c76-0e69-4f78-bb5a-3e477f2575d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zxkh8"
Apr 16 22:13:48.903828 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.903534 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2k28\" (UniqueName: \"kubernetes.io/projected/db688c76-0e69-4f78-bb5a-3e477f2575d3-kube-api-access-g2k28\") pod \"aws-ebs-csi-driver-node-zxkh8\" (UID: \"db688c76-0e69-4f78-bb5a-3e477f2575d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zxkh8"
Apr 16 22:13:48.903828 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.903558 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxgxq\" (UniqueName: \"kubernetes.io/projected/3a701b0d-e8cf-4242-a773-67ba88c764b5-kube-api-access-zxgxq\") pod \"iptables-alerter-n5bvs\" (UID: \"3a701b0d-e8cf-4242-a773-67ba88c764b5\") " pod="openshift-network-operator/iptables-alerter-n5bvs"
Apr 16 22:13:48.904602 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.903581 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1bdc782-5278-4724-89af-c7bf4325aea4-etc-kubernetes\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb"
Apr 16 22:13:48.904602 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.903615 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpcgr\" (UniqueName: \"kubernetes.io/projected/2327dcdf-f40d-43bf-905a-1404d6e339f7-kube-api-access-vpcgr\") pod \"node-ca-wpfh2\" (UID: \"2327dcdf-f40d-43bf-905a-1404d6e339f7\") " pod="openshift-image-registry/node-ca-wpfh2"
Apr 16 22:13:48.904602 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.903662 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e1bdc782-5278-4724-89af-c7bf4325aea4-etc-modprobe-d\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb"
Apr 16 22:13:48.904602 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.903690 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-host-cni-netd\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:48.904602 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.903714 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/71427d26-ce68-4714-9505-292c288a5fdf-ovnkube-script-lib\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:48.904602 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.903742 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/db688c76-0e69-4f78-bb5a-3e477f2575d3-sys-fs\") pod \"aws-ebs-csi-driver-node-zxkh8\" (UID: \"db688c76-0e69-4f78-bb5a-3e477f2575d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zxkh8"
Apr 16 22:13:48.904602 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.903765 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92x2w\" (UniqueName: \"kubernetes.io/projected/122992fa-4992-414a-8573-d77e9afd6b29-kube-api-access-92x2w\") pod \"multus-additional-cni-plugins-8xzc6\" (UID: \"122992fa-4992-414a-8573-d77e9afd6b29\") " pod="openshift-multus/multus-additional-cni-plugins-8xzc6"
Apr 16 22:13:48.904602 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.903787 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-run-ovn\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:48.904602 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.903811 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/71427d26-ce68-4714-9505-292c288a5fdf-ovnkube-config\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:48.952676 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.952621 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 22:08:47 +0000 UTC" deadline="2027-09-10 07:03:04.601165968 +0000 UTC"
Apr 16 22:13:48.952676 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.952648 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12272h49m15.648522108s"
Apr 16 22:13:48.990583 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:48.990557 2565 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 22:13:49.004537 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.004505 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2327dcdf-f40d-43bf-905a-1404d6e339f7-serviceca\") pod \"node-ca-wpfh2\" (UID: \"2327dcdf-f40d-43bf-905a-1404d6e339f7\") " pod="openshift-image-registry/node-ca-wpfh2"
Apr 16 22:13:49.004678 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.004552 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-log-socket\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.004678 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.004578 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/db66a2b0-751f-4d43-8765-6ecfb4ef22eb-agent-certs\") pod \"konnectivity-agent-dt42x\" (UID: \"db66a2b0-751f-4d43-8765-6ecfb4ef22eb\") " pod="kube-system/konnectivity-agent-dt42x"
Apr 16 22:13:49.004678 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.004605 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/db66a2b0-751f-4d43-8765-6ecfb4ef22eb-konnectivity-ca\") pod \"konnectivity-agent-dt42x\" (UID: \"db66a2b0-751f-4d43-8765-6ecfb4ef22eb\") " pod="kube-system/konnectivity-agent-dt42x"
Apr 16 22:13:49.004678 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.004624 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-log-socket\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.004678 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.004644 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/122992fa-4992-414a-8573-d77e9afd6b29-cnibin\") pod \"multus-additional-cni-plugins-8xzc6\" (UID: \"122992fa-4992-414a-8573-d77e9afd6b29\") " pod="openshift-multus/multus-additional-cni-plugins-8xzc6"
Apr 16 22:13:49.004678 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.004671 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n5z6k\" (UniqueName: \"kubernetes.io/projected/0b5f6846-8363-4956-b563-df34509912b0-kube-api-access-n5z6k\") pod \"network-check-target-bdcch\" (UID: \"0b5f6846-8363-4956-b563-df34509912b0\") " pod="openshift-network-diagnostics/network-check-target-bdcch"
Apr 16 22:13:49.004970 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.004727 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/122992fa-4992-414a-8573-d77e9afd6b29-cnibin\") pod \"multus-additional-cni-plugins-8xzc6\" (UID: \"122992fa-4992-414a-8573-d77e9afd6b29\") " pod="openshift-multus/multus-additional-cni-plugins-8xzc6"
Apr 16 22:13:49.004970 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.004784 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e1bdc782-5278-4724-89af-c7bf4325aea4-run\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb"
Apr 16 22:13:49.004970 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.004812 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e1bdc782-5278-4724-89af-c7bf4325aea4-sys\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb"
Apr 16 22:13:49.004970 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.004836 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/db688c76-0e69-4f78-bb5a-3e477f2575d3-socket-dir\") pod \"aws-ebs-csi-driver-node-zxkh8\" (UID: \"db688c76-0e69-4f78-bb5a-3e477f2575d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zxkh8"
Apr 16 22:13:49.004970 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.004862 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e1bdc782-5278-4724-89af-c7bf4325aea4-etc-sysctl-d\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb"
Apr 16 22:13:49.004970 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.004904 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e1bdc782-5278-4724-89af-c7bf4325aea4-sys\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb"
Apr 16 22:13:49.004970 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.004903 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e1bdc782-5278-4724-89af-c7bf4325aea4-run\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb"
Apr 16 22:13:49.004970 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.004926 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-cni-binary-copy\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.004970 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.004956 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-host-kubelet\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.005379 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.004980 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-var-lib-openvswitch\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.005379 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.004997 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e1bdc782-5278-4724-89af-c7bf4325aea4-etc-sysctl-d\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb"
Apr 16 22:13:49.005379 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005008 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/db688c76-0e69-4f78-bb5a-3e477f2575d3-registration-dir\") pod \"aws-ebs-csi-driver-node-zxkh8\" (UID: \"db688c76-0e69-4f78-bb5a-3e477f2575d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zxkh8"
Apr 16 22:13:49.005379 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005012 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-host-kubelet\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.005379 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005032 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g2k28\" (UniqueName: \"kubernetes.io/projected/db688c76-0e69-4f78-bb5a-3e477f2575d3-kube-api-access-g2k28\") pod \"aws-ebs-csi-driver-node-zxkh8\" (UID: \"db688c76-0e69-4f78-bb5a-3e477f2575d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zxkh8"
Apr 16 22:13:49.005379 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005051 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-var-lib-openvswitch\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.005379 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005057 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxgxq\" (UniqueName: \"kubernetes.io/projected/3a701b0d-e8cf-4242-a773-67ba88c764b5-kube-api-access-zxgxq\") pod \"iptables-alerter-n5bvs\" (UID: \"3a701b0d-e8cf-4242-a773-67ba88c764b5\") " pod="openshift-network-operator/iptables-alerter-n5bvs"
Apr 16 22:13:49.005379 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005076 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/db688c76-0e69-4f78-bb5a-3e477f2575d3-registration-dir\") pod \"aws-ebs-csi-driver-node-zxkh8\" (UID: \"db688c76-0e69-4f78-bb5a-3e477f2575d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zxkh8"
Apr 16 22:13:49.005379 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005084 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1bdc782-5278-4724-89af-c7bf4325aea4-etc-kubernetes\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb"
Apr 16 22:13:49.005379 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005113 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-host-var-lib-cni-multus\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.005379 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005184 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e06e94b1-2063-48f2-b8a7-0d0e4193f064-metrics-certs\") pod \"network-metrics-daemon-qz5vc\" (UID: \"e06e94b1-2063-48f2-b8a7-0d0e4193f064\") " pod="openshift-multus/network-metrics-daemon-qz5vc"
Apr 16 22:13:49.005379 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005202 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2327dcdf-f40d-43bf-905a-1404d6e339f7-serviceca\") pod \"node-ca-wpfh2\" (UID: \"2327dcdf-f40d-43bf-905a-1404d6e339f7\") " pod="openshift-image-registry/node-ca-wpfh2"
Apr 16 22:13:49.005379 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005213 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vpcgr\" (UniqueName: \"kubernetes.io/projected/2327dcdf-f40d-43bf-905a-1404d6e339f7-kube-api-access-vpcgr\") pod \"node-ca-wpfh2\" (UID: \"2327dcdf-f40d-43bf-905a-1404d6e339f7\") " pod="openshift-image-registry/node-ca-wpfh2"
Apr 16 22:13:49.005379 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005236 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e1bdc782-5278-4724-89af-c7bf4325aea4-etc-modprobe-d\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb"
Apr 16 22:13:49.005379 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005207 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1bdc782-5278-4724-89af-c7bf4325aea4-etc-kubernetes\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb"
Apr 16 22:13:49.005379 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005310 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/db66a2b0-751f-4d43-8765-6ecfb4ef22eb-konnectivity-ca\") pod \"konnectivity-agent-dt42x\" (UID: \"db66a2b0-751f-4d43-8765-6ecfb4ef22eb\") " pod="kube-system/konnectivity-agent-dt42x"
Apr 16 22:13:49.005379 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005308 2565 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 22:13:49.006112 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005327 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/db688c76-0e69-4f78-bb5a-3e477f2575d3-socket-dir\") pod \"aws-ebs-csi-driver-node-zxkh8\" (UID: \"db688c76-0e69-4f78-bb5a-3e477f2575d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zxkh8"
Apr 16 22:13:49.006112 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005311 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-host-run-multus-certs\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.006112 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005329 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e1bdc782-5278-4724-89af-c7bf4325aea4-etc-modprobe-d\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb"
Apr 16 22:13:49.006112 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005369 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/969daa2e-581d-4248-8104-2e50544de6b9-tmp-dir\") pod \"node-resolver-tsbd9\" (UID: \"969daa2e-581d-4248-8104-2e50544de6b9\") " pod="openshift-dns/node-resolver-tsbd9"
Apr 16 22:13:49.006112 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005427 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-host-cni-netd\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.006112 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005452 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/71427d26-ce68-4714-9505-292c288a5fdf-ovnkube-script-lib\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.006112 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005472 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/db688c76-0e69-4f78-bb5a-3e477f2575d3-sys-fs\") pod \"aws-ebs-csi-driver-node-zxkh8\" (UID: \"db688c76-0e69-4f78-bb5a-3e477f2575d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zxkh8"
Apr 16 22:13:49.006112 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005494 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-92x2w\" (UniqueName: \"kubernetes.io/projected/122992fa-4992-414a-8573-d77e9afd6b29-kube-api-access-92x2w\") pod \"multus-additional-cni-plugins-8xzc6\" (UID: \"122992fa-4992-414a-8573-d77e9afd6b29\") " pod="openshift-multus/multus-additional-cni-plugins-8xzc6"
Apr 16 22:13:49.006112 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005519 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-hostroot\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.006112 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005522 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-host-cni-netd\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.006112 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005543 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs5vx\" (UniqueName: \"kubernetes.io/projected/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-kube-api-access-xs5vx\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.006112 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005543 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/db688c76-0e69-4f78-bb5a-3e477f2575d3-sys-fs\") pod \"aws-ebs-csi-driver-node-zxkh8\" (UID: \"db688c76-0e69-4f78-bb5a-3e477f2575d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zxkh8"
Apr 16 22:13:49.006112 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005569 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-run-ovn\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.006112 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005628 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-run-ovn\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.006112 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005661 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/71427d26-ce68-4714-9505-292c288a5fdf-ovnkube-config\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.006112 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005708 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-host-run-k8s-cni-cncf-io\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.006112 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005749 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-host-var-lib-kubelet\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.006811 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005781 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-systemd-units\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.006811 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005845 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-run-openvswitch\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.006811 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005873 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/db688c76-0e69-4f78-bb5a-3e477f2575d3-device-dir\") pod \"aws-ebs-csi-driver-node-zxkh8\" (UID: \"db688c76-0e69-4f78-bb5a-3e477f2575d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zxkh8"
Apr 16 22:13:49.006811 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005870 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-systemd-units\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.006811 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005899 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3a701b0d-e8cf-4242-a773-67ba88c764b5-host-slash\") pod \"iptables-alerter-n5bvs\" (UID: \"3a701b0d-e8cf-4242-a773-67ba88c764b5\") " pod="openshift-network-operator/iptables-alerter-n5bvs"
Apr 16 22:13:49.006811 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005925 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e1bdc782-5278-4724-89af-c7bf4325aea4-host\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb"
Apr 16 22:13:49.006811 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005934 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3a701b0d-e8cf-4242-a773-67ba88c764b5-host-slash\") pod \"iptables-alerter-n5bvs\" (UID: \"3a701b0d-e8cf-4242-a773-67ba88c764b5\") " pod="openshift-network-operator/iptables-alerter-n5bvs"
Apr 16 22:13:49.006811 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005949 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-multus-cni-dir\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.006811 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005951 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/db688c76-0e69-4f78-bb5a-3e477f2575d3-device-dir\") pod \"aws-ebs-csi-driver-node-zxkh8\" (UID: \"db688c76-0e69-4f78-bb5a-3e477f2575d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zxkh8"
Apr 16 22:13:49.006811 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005980 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-multus-conf-dir\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.006811 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005936 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-run-openvswitch\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.006811 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005988 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e1bdc782-5278-4724-89af-c7bf4325aea4-host\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb"
Apr 16 22:13:49.006811 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.005996 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/71427d26-ce68-4714-9505-292c288a5fdf-ovnkube-script-lib\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.006811 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006009 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.006811 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006031 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.006811 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006037 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/122992fa-4992-414a-8573-d77e9afd6b29-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8xzc6\" (UID: \"122992fa-4992-414a-8573-d77e9afd6b29\") " pod="openshift-multus/multus-additional-cni-plugins-8xzc6"
Apr 16 22:13:49.006811 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006061 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e1bdc782-5278-4724-89af-c7bf4325aea4-etc-tuned\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb"
Apr 16 22:13:49.007578 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006083 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/71427d26-ce68-4714-9505-292c288a5fdf-env-overrides\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.007578 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006106 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/71427d26-ce68-4714-9505-292c288a5fdf-ovn-node-metrics-cert\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.007578 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006133 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/122992fa-4992-414a-8573-d77e9afd6b29-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8xzc6\" (UID: \"122992fa-4992-414a-8573-d77e9afd6b29\") " pod="openshift-multus/multus-additional-cni-plugins-8xzc6"
Apr 16 22:13:49.007578 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006160 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3a701b0d-e8cf-4242-a773-67ba88c764b5-iptables-alerter-script\") pod \"iptables-alerter-n5bvs\" (UID: \"3a701b0d-e8cf-4242-a773-67ba88c764b5\") " pod="openshift-network-operator/iptables-alerter-n5bvs"
Apr 16 22:13:49.007578 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006182 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e1bdc782-5278-4724-89af-c7bf4325aea4-etc-sysconfig\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb"
Apr 16 22:13:49.007578 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006203 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e1bdc782-5278-4724-89af-c7bf4325aea4-tmp\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb"
Apr 16 22:13:49.007578 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006209 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/71427d26-ce68-4714-9505-292c288a5fdf-ovnkube-config\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.007578 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006215 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/122992fa-4992-414a-8573-d77e9afd6b29-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8xzc6\" (UID: \"122992fa-4992-414a-8573-d77e9afd6b29\") " pod="openshift-multus/multus-additional-cni-plugins-8xzc6"
Apr 16 22:13:49.007578 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006229 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-multus-socket-dir-parent\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.007578 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006257 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-host-run-netns\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.007578 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006283 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-etc-openvswitch\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.007578 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006306 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-system-cni-dir\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.007578 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006300 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e1bdc782-5278-4724-89af-c7bf4325aea4-etc-sysconfig\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb"
Apr 16 22:13:49.007578 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006334 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-os-release\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.007578 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006361 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77kt6\" (UniqueName: \"kubernetes.io/projected/e06e94b1-2063-48f2-b8a7-0d0e4193f064-kube-api-access-77kt6\") pod \"network-metrics-daemon-qz5vc\" (UID: \"e06e94b1-2063-48f2-b8a7-0d0e4193f064\") " pod="openshift-multus/network-metrics-daemon-qz5vc"
Apr 16 22:13:49.007578 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006389 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-run-systemd\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.007578 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006431 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/db688c76-0e69-4f78-bb5a-3e477f2575d3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zxkh8\" (UID: \"db688c76-0e69-4f78-bb5a-3e477f2575d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zxkh8"
Apr 16 22:13:49.008432 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006436 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-etc-openvswitch\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.008432 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006463 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/db688c76-0e69-4f78-bb5a-3e477f2575d3-etc-selinux\") pod \"aws-ebs-csi-driver-node-zxkh8\" (UID: \"db688c76-0e69-4f78-bb5a-3e477f2575d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zxkh8"
Apr 16 22:13:49.008432 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006489 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e1bdc782-5278-4724-89af-c7bf4325aea4-etc-systemd\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb"
Apr 16 22:13:49.008432 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006513 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e1bdc782-5278-4724-89af-c7bf4325aea4-lib-modules\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb"
Apr 16 22:13:49.008432 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006548 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-host-run-netns\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.008432 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006572 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nmjxb\" (UniqueName: \"kubernetes.io/projected/71427d26-ce68-4714-9505-292c288a5fdf-kube-api-access-nmjxb\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.008432 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006600 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-etc-kubernetes\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.008432 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006627 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/122992fa-4992-414a-8573-d77e9afd6b29-system-cni-dir\") pod \"multus-additional-cni-plugins-8xzc6\" (UID: \"122992fa-4992-414a-8573-d77e9afd6b29\") " pod="openshift-multus/multus-additional-cni-plugins-8xzc6"
Apr 16 22:13:49.008432 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006655 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/122992fa-4992-414a-8573-d77e9afd6b29-os-release\") pod \"multus-additional-cni-plugins-8xzc6\" (UID: \"122992fa-4992-414a-8573-d77e9afd6b29\") " pod="openshift-multus/multus-additional-cni-plugins-8xzc6"
Apr 16 22:13:49.008432 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006672 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/71427d26-ce68-4714-9505-292c288a5fdf-env-overrides\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.008432 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006682 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/969daa2e-581d-4248-8104-2e50544de6b9-hosts-file\") pod \"node-resolver-tsbd9\" (UID: \"969daa2e-581d-4248-8104-2e50544de6b9\") " pod="openshift-dns/node-resolver-tsbd9"
Apr 16 22:13:49.008432 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006714 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2327dcdf-f40d-43bf-905a-1404d6e339f7-host\") pod \"node-ca-wpfh2\" (UID: \"2327dcdf-f40d-43bf-905a-1404d6e339f7\") " pod="openshift-image-registry/node-ca-wpfh2"
Apr 16 22:13:49.008432 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006759 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/122992fa-4992-414a-8573-d77e9afd6b29-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8xzc6\" (UID: \"122992fa-4992-414a-8573-d77e9afd6b29\") " pod="openshift-multus/multus-additional-cni-plugins-8xzc6"
Apr 16 22:13:49.008432 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006817 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/db688c76-0e69-4f78-bb5a-3e477f2575d3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zxkh8\" (UID: \"db688c76-0e69-4f78-bb5a-3e477f2575d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zxkh8"
Apr 16 22:13:49.008432 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006829 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e1bdc782-5278-4724-89af-c7bf4325aea4-etc-systemd\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb"
Apr 16 22:13:49.008432 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006856 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3a701b0d-e8cf-4242-a773-67ba88c764b5-iptables-alerter-script\") pod \"iptables-alerter-n5bvs\" (UID: \"3a701b0d-e8cf-4242-a773-67ba88c764b5\") " pod="openshift-network-operator/iptables-alerter-n5bvs"
Apr 16 22:13:49.008432 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006865 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-host-slash\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.009242 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006890 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/db688c76-0e69-4f78-bb5a-3e477f2575d3-etc-selinux\") pod \"aws-ebs-csi-driver-node-zxkh8\" (UID: \"db688c76-0e69-4f78-bb5a-3e477f2575d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zxkh8"
Apr 16 22:13:49.009242 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006912 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/122992fa-4992-414a-8573-d77e9afd6b29-system-cni-dir\") pod \"multus-additional-cni-plugins-8xzc6\" (UID: \"122992fa-4992-414a-8573-d77e9afd6b29\") " pod="openshift-multus/multus-additional-cni-plugins-8xzc6"
Apr 16 22:13:49.009242 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006924 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2327dcdf-f40d-43bf-905a-1404d6e339f7-host\") pod \"node-ca-wpfh2\" (UID: \"2327dcdf-f40d-43bf-905a-1404d6e339f7\") " pod="openshift-image-registry/node-ca-wpfh2"
Apr 16 22:13:49.009242 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006926 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-host-cni-bin\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.009242 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.006892 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-host-cni-bin\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.009242 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.007003 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/122992fa-4992-414a-8573-d77e9afd6b29-os-release\") pod \"multus-additional-cni-plugins-8xzc6\" (UID: \"122992fa-4992-414a-8573-d77e9afd6b29\") " pod="openshift-multus/multus-additional-cni-plugins-8xzc6"
Apr 16 22:13:49.009242 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.007033 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-host-run-netns\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.009242 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.007063 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-run-systemd\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.009242 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.007134 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-cnibin\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.009242 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.007147 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e1bdc782-5278-4724-89af-c7bf4325aea4-lib-modules\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb"
Apr 16 22:13:49.009242 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.007175 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-host-slash\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.009242 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.007182 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-host-var-lib-cni-bin\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.009242 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.007215 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-multus-daemon-config\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.009242 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.007248 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-node-log\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.009242 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.007275 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/122992fa-4992-414a-8573-d77e9afd6b29-cni-binary-copy\") pod \"multus-additional-cni-plugins-8xzc6\" (UID: \"122992fa-4992-414a-8573-d77e9afd6b29\") " pod="openshift-multus/multus-additional-cni-plugins-8xzc6"
Apr 16 22:13:49.009242 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.007311 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-node-log\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.009242 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.007311 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/122992fa-4992-414a-8573-d77e9afd6b29-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8xzc6\" (UID: \"122992fa-4992-414a-8573-d77e9afd6b29\") " pod="openshift-multus/multus-additional-cni-plugins-8xzc6"
Apr 16 22:13:49.009955 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.007445 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e1bdc782-5278-4724-89af-c7bf4325aea4-var-lib-kubelet\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb"
Apr 16 22:13:49.009955 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.007486 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-host-run-ovn-kubernetes\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.009955 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.007536 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e1bdc782-5278-4724-89af-c7bf4325aea4-etc-sysctl-conf\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb"
Apr 16 22:13:49.009955 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.007593 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pgzg2\" (UniqueName: \"kubernetes.io/projected/e1bdc782-5278-4724-89af-c7bf4325aea4-kube-api-access-pgzg2\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb"
Apr 16 22:13:49.009955 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.007627 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlmhc\" (UniqueName: \"kubernetes.io/projected/969daa2e-581d-4248-8104-2e50544de6b9-kube-api-access-mlmhc\") pod \"node-resolver-tsbd9\" (UID: \"969daa2e-581d-4248-8104-2e50544de6b9\") " pod="openshift-dns/node-resolver-tsbd9"
Apr 16 22:13:49.009955 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.007728 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/122992fa-4992-414a-8573-d77e9afd6b29-cni-binary-copy\") pod \"multus-additional-cni-plugins-8xzc6\" (UID: \"122992fa-4992-414a-8573-d77e9afd6b29\") " pod="openshift-multus/multus-additional-cni-plugins-8xzc6"
Apr 16 22:13:49.009955 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.007771 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/71427d26-ce68-4714-9505-292c288a5fdf-host-run-ovn-kubernetes\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.009955 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.007808 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e1bdc782-5278-4724-89af-c7bf4325aea4-var-lib-kubelet\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb"
Apr 16 22:13:49.009955 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.007826 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e1bdc782-5278-4724-89af-c7bf4325aea4-etc-sysctl-conf\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb"
Apr 16 22:13:49.009955 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.008145 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/122992fa-4992-414a-8573-d77e9afd6b29-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8xzc6\" (UID: \"122992fa-4992-414a-8573-d77e9afd6b29\") " pod="openshift-multus/multus-additional-cni-plugins-8xzc6"
Apr 16 22:13:49.009955 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.009174 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e1bdc782-5278-4724-89af-c7bf4325aea4-tmp\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb"
Apr 16 22:13:49.009955 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.009322 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/db66a2b0-751f-4d43-8765-6ecfb4ef22eb-agent-certs\") pod \"konnectivity-agent-dt42x\" (UID: \"db66a2b0-751f-4d43-8765-6ecfb4ef22eb\") " pod="kube-system/konnectivity-agent-dt42x"
Apr 16 22:13:49.009955 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.009767 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/71427d26-ce68-4714-9505-292c288a5fdf-ovn-node-metrics-cert\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.010527 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.010492 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e1bdc782-5278-4724-89af-c7bf4325aea4-etc-tuned\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb"
Apr 16 22:13:49.010866 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:49.010780 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:13:49.010866 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:49.010802 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:13:49.010866 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:49.010817 2565 projected.go:194] Error preparing data for projected volume kube-api-access-n5z6k for pod openshift-network-diagnostics/network-check-target-bdcch: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:49.011050 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:49.010884 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0b5f6846-8363-4956-b563-df34509912b0-kube-api-access-n5z6k podName:0b5f6846-8363-4956-b563-df34509912b0 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:49.510863335 +0000 UTC m=+3.056604763 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-n5z6k" (UniqueName: "kubernetes.io/projected/0b5f6846-8363-4956-b563-df34509912b0-kube-api-access-n5z6k") pod "network-check-target-bdcch" (UID: "0b5f6846-8363-4956-b563-df34509912b0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:49.013366 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.013334 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxgxq\" (UniqueName: \"kubernetes.io/projected/3a701b0d-e8cf-4242-a773-67ba88c764b5-kube-api-access-zxgxq\") pod \"iptables-alerter-n5bvs\" (UID: \"3a701b0d-e8cf-4242-a773-67ba88c764b5\") " pod="openshift-network-operator/iptables-alerter-n5bvs"
Apr 16 22:13:49.013595 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.013552 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2k28\" (UniqueName: \"kubernetes.io/projected/db688c76-0e69-4f78-bb5a-3e477f2575d3-kube-api-access-g2k28\") pod \"aws-ebs-csi-driver-node-zxkh8\" (UID: \"db688c76-0e69-4f78-bb5a-3e477f2575d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zxkh8"
Apr 16 22:13:49.015171 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.015150 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-92x2w\" (UniqueName: \"kubernetes.io/projected/122992fa-4992-414a-8573-d77e9afd6b29-kube-api-access-92x2w\") pod \"multus-additional-cni-plugins-8xzc6\" (UID: \"122992fa-4992-414a-8573-d77e9afd6b29\") " pod="openshift-multus/multus-additional-cni-plugins-8xzc6"
Apr 16 22:13:49.016041 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.016005 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpcgr\" (UniqueName: \"kubernetes.io/projected/2327dcdf-f40d-43bf-905a-1404d6e339f7-kube-api-access-vpcgr\") pod \"node-ca-wpfh2\" (UID: \"2327dcdf-f40d-43bf-905a-1404d6e339f7\") " pod="openshift-image-registry/node-ca-wpfh2"
Apr 16 22:13:49.023356 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.023331 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmjxb\" (UniqueName: \"kubernetes.io/projected/71427d26-ce68-4714-9505-292c288a5fdf-kube-api-access-nmjxb\") pod \"ovnkube-node-tp6wl\" (UID: \"71427d26-ce68-4714-9505-292c288a5fdf\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.023498 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.023448 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgzg2\" (UniqueName: \"kubernetes.io/projected/e1bdc782-5278-4724-89af-c7bf4325aea4-kube-api-access-pgzg2\") pod \"tuned-dg4lb\" (UID: \"e1bdc782-5278-4724-89af-c7bf4325aea4\") " pod="openshift-cluster-node-tuning-operator/tuned-dg4lb"
Apr 16 22:13:49.108128 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.108091 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-multus-socket-dir-parent\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.108299 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.108142 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-system-cni-dir\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.108299 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.108167 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-os-release\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.108299 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.108206 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-77kt6\" (UniqueName: \"kubernetes.io/projected/e06e94b1-2063-48f2-b8a7-0d0e4193f064-kube-api-access-77kt6\") pod \"network-metrics-daemon-qz5vc\" (UID: \"e06e94b1-2063-48f2-b8a7-0d0e4193f064\") " pod="openshift-multus/network-metrics-daemon-qz5vc"
Apr 16 22:13:49.108299 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.108234 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-host-run-netns\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.108299 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.108236 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-multus-socket-dir-parent\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.108299 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.108258 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-etc-kubernetes\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.108299 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.108261 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-system-cni-dir\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.108299 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.108286 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/969daa2e-581d-4248-8104-2e50544de6b9-hosts-file\") pod \"node-resolver-tsbd9\" (UID: \"969daa2e-581d-4248-8104-2e50544de6b9\") " pod="openshift-dns/node-resolver-tsbd9"
Apr 16 22:13:49.108738 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.108310 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-etc-kubernetes\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.108738 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.108339 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-os-release\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.108738 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.108313 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-cnibin\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.108738 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.108357 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-cnibin\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.108738 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.108387 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-host-run-netns\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.108738 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.108388 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-host-var-lib-cni-bin\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.108738 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.108428 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-host-var-lib-cni-bin\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.108738 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.108433 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-multus-daemon-config\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.108738 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.108474 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/969daa2e-581d-4248-8104-2e50544de6b9-hosts-file\") pod \"node-resolver-tsbd9\" (UID: \"969daa2e-581d-4248-8104-2e50544de6b9\") " pod="openshift-dns/node-resolver-tsbd9"
Apr 16 22:13:49.108738 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.108490 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mlmhc\" (UniqueName: \"kubernetes.io/projected/969daa2e-581d-4248-8104-2e50544de6b9-kube-api-access-mlmhc\") pod \"node-resolver-tsbd9\" (UID: \"969daa2e-581d-4248-8104-2e50544de6b9\") " pod="openshift-dns/node-resolver-tsbd9"
Apr 16 22:13:49.108738 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.108536 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-cni-binary-copy\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.108738 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.108567 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-host-var-lib-cni-multus\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.108738 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.108611 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-host-var-lib-cni-multus\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.108738 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.108643 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e06e94b1-2063-48f2-b8a7-0d0e4193f064-metrics-certs\") pod \"network-metrics-daemon-qz5vc\" (UID: \"e06e94b1-2063-48f2-b8a7-0d0e4193f064\") " pod="openshift-multus/network-metrics-daemon-qz5vc"
Apr 16 22:13:49.108738 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.108674 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-host-run-multus-certs\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.108738 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.108696 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/969daa2e-581d-4248-8104-2e50544de6b9-tmp-dir\") pod \"node-resolver-tsbd9\" (UID: \"969daa2e-581d-4248-8104-2e50544de6b9\") " pod="openshift-dns/node-resolver-tsbd9"
Apr 16 22:13:49.108738 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:49.108706 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:49.108738 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.108730 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-hostroot\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.108738 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.108734 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-host-run-multus-certs\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.109527 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.108755 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xs5vx\" (UniqueName: \"kubernetes.io/projected/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-kube-api-access-xs5vx\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.109527 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.108782 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-host-run-k8s-cni-cncf-io\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.109527 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.108792 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-hostroot\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.109527 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.108806 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-host-var-lib-kubelet\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.109527 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.108841 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-multus-cni-dir\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.109527 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.108866 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-multus-conf-dir\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.109527 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.108872 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-host-run-k8s-cni-cncf-io\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.109527 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.108842 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-host-var-lib-kubelet\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.109527 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:49.108892 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e06e94b1-2063-48f2-b8a7-0d0e4193f064-metrics-certs podName:e06e94b1-2063-48f2-b8a7-0d0e4193f064 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:49.608862963 +0000 UTC m=+3.154604434 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e06e94b1-2063-48f2-b8a7-0d0e4193f064-metrics-certs") pod "network-metrics-daemon-qz5vc" (UID: "e06e94b1-2063-48f2-b8a7-0d0e4193f064") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:49.109527 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.108900 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-multus-conf-dir\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.109527 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.108946 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-multus-cni-dir\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.109527 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.109006 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/969daa2e-581d-4248-8104-2e50544de6b9-tmp-dir\") pod \"node-resolver-tsbd9\" (UID: \"969daa2e-581d-4248-8104-2e50544de6b9\") " pod="openshift-dns/node-resolver-tsbd9"
Apr 16 22:13:49.109977 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.109645 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-cni-binary-copy\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.110532 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.110514 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-multus-daemon-config\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.119720 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.119691 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs5vx\" (UniqueName: \"kubernetes.io/projected/33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9-kube-api-access-xs5vx\") pod \"multus-sgsmj\" (UID: \"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9\") " pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.120269 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.120227 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-77kt6\" (UniqueName: \"kubernetes.io/projected/e06e94b1-2063-48f2-b8a7-0d0e4193f064-kube-api-access-77kt6\") pod \"network-metrics-daemon-qz5vc\" (UID: \"e06e94b1-2063-48f2-b8a7-0d0e4193f064\") " pod="openshift-multus/network-metrics-daemon-qz5vc"
Apr 16 22:13:49.120357 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.120320 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlmhc\" (UniqueName: \"kubernetes.io/projected/969daa2e-581d-4248-8104-2e50544de6b9-kube-api-access-mlmhc\") pod \"node-resolver-tsbd9\" (UID: \"969daa2e-581d-4248-8104-2e50544de6b9\") " pod="openshift-dns/node-resolver-tsbd9"
Apr 16 22:13:49.128483 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.128458 2565 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:13:49.198650 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.198612 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-n5bvs"
Apr 16 22:13:49.207452 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.207422 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl"
Apr 16 22:13:49.216179 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.216157 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-dt42x"
Apr 16 22:13:49.223491 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.223472 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zxkh8"
Apr 16 22:13:49.231094 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.231074 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-dg4lb"
Apr 16 22:13:49.239711 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.239693 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-wpfh2"
Apr 16 22:13:49.246189 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.246169 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8xzc6"
Apr 16 22:13:49.254744 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.254722 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-tsbd9"
Apr 16 22:13:49.261359 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.261333 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-sgsmj"
Apr 16 22:13:49.477073 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.476987 2565 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:13:49.513274 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.513238 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n5z6k\" (UniqueName: \"kubernetes.io/projected/0b5f6846-8363-4956-b563-df34509912b0-kube-api-access-n5z6k\") pod \"network-check-target-bdcch\" (UID: \"0b5f6846-8363-4956-b563-df34509912b0\") " pod="openshift-network-diagnostics/network-check-target-bdcch"
Apr 16 22:13:49.513455 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:49.513423 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:13:49.513455 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:49.513446 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:13:49.513455 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:49.513457 2565 projected.go:194] Error preparing data for projected volume kube-api-access-n5z6k for pod openshift-network-diagnostics/network-check-target-bdcch: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:49.513591 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:49.513527 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0b5f6846-8363-4956-b563-df34509912b0-kube-api-access-n5z6k podName:0b5f6846-8363-4956-b563-df34509912b0 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:50.51351192 +0000 UTC m=+4.059253326 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-n5z6k" (UniqueName: "kubernetes.io/projected/0b5f6846-8363-4956-b563-df34509912b0-kube-api-access-n5z6k") pod "network-check-target-bdcch" (UID: "0b5f6846-8363-4956-b563-df34509912b0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:49.534483 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:49.534453 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb688c76_0e69_4f78_bb5a_3e477f2575d3.slice/crio-70a4a72c7dfdd99921b06f70e7ac7838523d68a8524f14eeb8c06ffd6229f9c0 WatchSource:0}: Error finding container 70a4a72c7dfdd99921b06f70e7ac7838523d68a8524f14eeb8c06ffd6229f9c0: Status 404 returned error can't find the container with id 70a4a72c7dfdd99921b06f70e7ac7838523d68a8524f14eeb8c06ffd6229f9c0
Apr 16 22:13:49.537954 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:49.537599 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33e6dd78_7ff6_4e30_aa7d_6f6c4ac533d9.slice/crio-c99c29bef5c3f6501841850ecc9ac6fb63ccb954d819c558d094b013269ed467 WatchSource:0}: Error finding container c99c29bef5c3f6501841850ecc9ac6fb63ccb954d819c558d094b013269ed467: Status 404 returned error can't find the container with id c99c29bef5c3f6501841850ecc9ac6fb63ccb954d819c558d094b013269ed467
Apr 16 22:13:49.537954 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:49.537844 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2327dcdf_f40d_43bf_905a_1404d6e339f7.slice/crio-c3a33184ce4ae0d0592c077a2bfe5ac87330e53b890444b44c034f9435cc05fc WatchSource:0}: Error finding container c3a33184ce4ae0d0592c077a2bfe5ac87330e53b890444b44c034f9435cc05fc: Status 404 returned error can't find the container with id c3a33184ce4ae0d0592c077a2bfe5ac87330e53b890444b44c034f9435cc05fc
Apr 16 22:13:49.541839 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:49.541812 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1bdc782_5278_4724_89af_c7bf4325aea4.slice/crio-9081c185058c2dc3e73464f87a4cf57a85f3622cdda37229c8ccef24dc102699 WatchSource:0}: Error finding container 9081c185058c2dc3e73464f87a4cf57a85f3622cdda37229c8ccef24dc102699: Status 404 returned error can't find the container with id 9081c185058c2dc3e73464f87a4cf57a85f3622cdda37229c8ccef24dc102699
Apr 16 22:13:49.542740 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:49.542717 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71427d26_ce68_4714_9505_292c288a5fdf.slice/crio-dddc526c2b13f03e2b5481f18c0eeba0f7e14b71cf41986c58e260b6d0998430 WatchSource:0}: Error finding container dddc526c2b13f03e2b5481f18c0eeba0f7e14b71cf41986c58e260b6d0998430: Status 404 returned error can't find the container with id dddc526c2b13f03e2b5481f18c0eeba0f7e14b71cf41986c58e260b6d0998430
Apr 16 22:13:49.543907 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:49.543886 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod122992fa_4992_414a_8573_d77e9afd6b29.slice/crio-177891a9eac3d648ec0c6593766c71dd3f043b1c698c781e5ae3c9102b574cc3 WatchSource:0}: Error finding container 177891a9eac3d648ec0c6593766c71dd3f043b1c698c781e5ae3c9102b574cc3: Status 404 returned error can't find the container with id 177891a9eac3d648ec0c6593766c71dd3f043b1c698c781e5ae3c9102b574cc3
Apr 16 22:13:49.544490 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:49.544215 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb66a2b0_751f_4d43_8765_6ecfb4ef22eb.slice/crio-4ee79102078db133a26404dde328881e17928f328b79139001c641566f6e60a9 WatchSource:0}: Error finding container 4ee79102078db133a26404dde328881e17928f328b79139001c641566f6e60a9: Status 404 returned error can't find the container with id 4ee79102078db133a26404dde328881e17928f328b79139001c641566f6e60a9
Apr 16 22:13:49.545144 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:49.545116 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod969daa2e_581d_4248_8104_2e50544de6b9.slice/crio-a104837539190e901cb1765bd188ad692a4200e36fdaeca84040d2a6281f646f WatchSource:0}: Error finding container a104837539190e901cb1765bd188ad692a4200e36fdaeca84040d2a6281f646f: Status 404 returned error can't find the container with id a104837539190e901cb1765bd188ad692a4200e36fdaeca84040d2a6281f646f
Apr 16 22:13:49.546861 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:13:49.546480 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a701b0d_e8cf_4242_a773_67ba88c764b5.slice/crio-3f3cc0295c5754c5cd92bb6546f61f33dc7e3d58673d02e6f9540b597a98d8ad WatchSource:0}: Error finding container 3f3cc0295c5754c5cd92bb6546f61f33dc7e3d58673d02e6f9540b597a98d8ad: Status 404 returned error can't find the container with id 3f3cc0295c5754c5cd92bb6546f61f33dc7e3d58673d02e6f9540b597a98d8ad
Apr 16 22:13:49.614262 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.614110 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e06e94b1-2063-48f2-b8a7-0d0e4193f064-metrics-certs\") pod \"network-metrics-daemon-qz5vc\" (UID: \"e06e94b1-2063-48f2-b8a7-0d0e4193f064\") " pod="openshift-multus/network-metrics-daemon-qz5vc"
Apr 16 22:13:49.614364 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:49.614264 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:49.614364 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:49.614327 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e06e94b1-2063-48f2-b8a7-0d0e4193f064-metrics-certs podName:e06e94b1-2063-48f2-b8a7-0d0e4193f064 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:50.614311714 +0000 UTC m=+4.160053119 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e06e94b1-2063-48f2-b8a7-0d0e4193f064-metrics-certs") pod "network-metrics-daemon-qz5vc" (UID: "e06e94b1-2063-48f2-b8a7-0d0e4193f064") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:49.953833 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.953683 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 22:08:47 +0000 UTC" deadline="2027-10-18 11:45:34.150880125 +0000 UTC"
Apr 16 22:13:49.953833 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:49.953722 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13189h31m44.197161828s"
Apr 16 22:13:50.028753 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:50.025429 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-n5bvs" event={"ID":"3a701b0d-e8cf-4242-a773-67ba88c764b5","Type":"ContainerStarted","Data":"3f3cc0295c5754c5cd92bb6546f61f33dc7e3d58673d02e6f9540b597a98d8ad"}
Apr 16 22:13:50.030619 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:50.029207 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-dt42x" event={"ID":"db66a2b0-751f-4d43-8765-6ecfb4ef22eb","Type":"ContainerStarted","Data":"4ee79102078db133a26404dde328881e17928f328b79139001c641566f6e60a9"}
Apr 16 22:13:50.038122 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:50.038044 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dg4lb" event={"ID":"e1bdc782-5278-4724-89af-c7bf4325aea4","Type":"ContainerStarted","Data":"9081c185058c2dc3e73464f87a4cf57a85f3622cdda37229c8ccef24dc102699"}
Apr 16 22:13:50.046664 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:50.046630 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sgsmj" event={"ID":"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9","Type":"ContainerStarted","Data":"c99c29bef5c3f6501841850ecc9ac6fb63ccb954d819c558d094b013269ed467"}
Apr 16 22:13:50.052112 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:50.051716 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-169.ec2.internal" event={"ID":"a0fd34b83b170bf49f4fc19c252e8bbc","Type":"ContainerStarted","Data":"0ae3397513fc04342fbe095cfa66dc247a3cae037c3a3aac726e988ee0fe2006"}
Apr 16 22:13:50.054881 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:50.054814 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tsbd9" event={"ID":"969daa2e-581d-4248-8104-2e50544de6b9","Type":"ContainerStarted","Data":"a104837539190e901cb1765bd188ad692a4200e36fdaeca84040d2a6281f646f"}
Apr 16 22:13:50.066670 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:50.066624 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8xzc6" event={"ID":"122992fa-4992-414a-8573-d77e9afd6b29","Type":"ContainerStarted","Data":"177891a9eac3d648ec0c6593766c71dd3f043b1c698c781e5ae3c9102b574cc3"}
Apr 16 22:13:50.076163 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:50.076124 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl" event={"ID":"71427d26-ce68-4714-9505-292c288a5fdf","Type":"ContainerStarted","Data":"dddc526c2b13f03e2b5481f18c0eeba0f7e14b71cf41986c58e260b6d0998430"}
Apr 16 22:13:50.077724 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:50.077698 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wpfh2" event={"ID":"2327dcdf-f40d-43bf-905a-1404d6e339f7","Type":"ContainerStarted","Data":"c3a33184ce4ae0d0592c077a2bfe5ac87330e53b890444b44c034f9435cc05fc"}
Apr 16 22:13:50.079614 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:50.079590 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zxkh8" event={"ID":"db688c76-0e69-4f78-bb5a-3e477f2575d3","Type":"ContainerStarted","Data":"70a4a72c7dfdd99921b06f70e7ac7838523d68a8524f14eeb8c06ffd6229f9c0"}
Apr 16 22:13:50.523033 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:50.522524 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n5z6k\" (UniqueName: \"kubernetes.io/projected/0b5f6846-8363-4956-b563-df34509912b0-kube-api-access-n5z6k\") pod \"network-check-target-bdcch\" (UID: \"0b5f6846-8363-4956-b563-df34509912b0\") " pod="openshift-network-diagnostics/network-check-target-bdcch"
Apr 16 22:13:50.523033 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:50.522673 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:13:50.523033 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:50.522695 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:13:50.523033 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:50.522713 2565 projected.go:194] Error preparing data for projected volume kube-api-access-n5z6k for pod openshift-network-diagnostics/network-check-target-bdcch: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:50.523033 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:50.522767 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0b5f6846-8363-4956-b563-df34509912b0-kube-api-access-n5z6k podName:0b5f6846-8363-4956-b563-df34509912b0 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:52.522754478 +0000 UTC m=+6.068495883 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-n5z6k" (UniqueName: "kubernetes.io/projected/0b5f6846-8363-4956-b563-df34509912b0-kube-api-access-n5z6k") pod "network-check-target-bdcch" (UID: "0b5f6846-8363-4956-b563-df34509912b0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:50.624636 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:50.624028 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e06e94b1-2063-48f2-b8a7-0d0e4193f064-metrics-certs\") pod \"network-metrics-daemon-qz5vc\" (UID: \"e06e94b1-2063-48f2-b8a7-0d0e4193f064\") " pod="openshift-multus/network-metrics-daemon-qz5vc"
Apr 16 22:13:50.624636 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:50.624173 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:50.624636 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:50.624233 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e06e94b1-2063-48f2-b8a7-0d0e4193f064-metrics-certs podName:e06e94b1-2063-48f2-b8a7-0d0e4193f064 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:52.624215788 +0000 UTC m=+6.169957209 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e06e94b1-2063-48f2-b8a7-0d0e4193f064-metrics-certs") pod "network-metrics-daemon-qz5vc" (UID: "e06e94b1-2063-48f2-b8a7-0d0e4193f064") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:51.018794 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:51.018759 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qz5vc"
Apr 16 22:13:51.019211 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:51.018951 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qz5vc" podUID="e06e94b1-2063-48f2-b8a7-0d0e4193f064"
Apr 16 22:13:51.019492 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:51.019472 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bdcch"
Apr 16 22:13:51.019797 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:51.019772 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bdcch" podUID="0b5f6846-8363-4956-b563-df34509912b0"
Apr 16 22:13:51.094358 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:51.094320 2565 generic.go:358] "Generic (PLEG): container finished" podID="973b90e1e6a733f4365ef807b1bee69b" containerID="6ef2820bd9da0203aaaf5a737d6916e919068d2d4de4f78f2a7065cc12e844c9" exitCode=0
Apr 16 22:13:51.094525 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:51.094470 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-169.ec2.internal" event={"ID":"973b90e1e6a733f4365ef807b1bee69b","Type":"ContainerDied","Data":"6ef2820bd9da0203aaaf5a737d6916e919068d2d4de4f78f2a7065cc12e844c9"}
Apr 16 22:13:51.111342 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:51.109944 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-169.ec2.internal" podStartSLOduration=3.109925621 podStartE2EDuration="3.109925621s" podCreationTimestamp="2026-04-16 22:13:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:13:50.066814781 +0000 UTC m=+3.612556209" watchObservedRunningTime="2026-04-16 22:13:51.109925621 +0000 UTC m=+4.655667039"
Apr 16 22:13:52.103210 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:52.102518 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-169.ec2.internal" event={"ID":"973b90e1e6a733f4365ef807b1bee69b","Type":"ContainerStarted","Data":"e0cda589c5246d6dec069610b0d1871ceaa245023cd14c79fece6e1835b31fc1"}
Apr 16 22:13:52.541732 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:52.541647 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n5z6k\" (UniqueName: \"kubernetes.io/projected/0b5f6846-8363-4956-b563-df34509912b0-kube-api-access-n5z6k\") pod \"network-check-target-bdcch\" (UID: \"0b5f6846-8363-4956-b563-df34509912b0\") " pod="openshift-network-diagnostics/network-check-target-bdcch"
Apr 16 22:13:52.541938 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:52.541874 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:13:52.541938 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:52.541901 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:13:52.541938 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:52.541915 2565 projected.go:194] Error preparing data for projected volume kube-api-access-n5z6k for pod openshift-network-diagnostics/network-check-target-bdcch: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:52.542161 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:52.541983 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0b5f6846-8363-4956-b563-df34509912b0-kube-api-access-n5z6k podName:0b5f6846-8363-4956-b563-df34509912b0 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:56.541963106 +0000 UTC m=+10.087704518 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-n5z6k" (UniqueName: "kubernetes.io/projected/0b5f6846-8363-4956-b563-df34509912b0-kube-api-access-n5z6k") pod "network-check-target-bdcch" (UID: "0b5f6846-8363-4956-b563-df34509912b0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:52.642259 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:52.642208 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e06e94b1-2063-48f2-b8a7-0d0e4193f064-metrics-certs\") pod \"network-metrics-daemon-qz5vc\" (UID: \"e06e94b1-2063-48f2-b8a7-0d0e4193f064\") " pod="openshift-multus/network-metrics-daemon-qz5vc"
Apr 16 22:13:52.642451 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:52.642384 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:52.642523 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:52.642466 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e06e94b1-2063-48f2-b8a7-0d0e4193f064-metrics-certs podName:e06e94b1-2063-48f2-b8a7-0d0e4193f064 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:56.6424462 +0000 UTC m=+10.188187605 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e06e94b1-2063-48f2-b8a7-0d0e4193f064-metrics-certs") pod "network-metrics-daemon-qz5vc" (UID: "e06e94b1-2063-48f2-b8a7-0d0e4193f064") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:53.014333 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:53.014248 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qz5vc"
Apr 16 22:13:53.014517 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:53.014373 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qz5vc" podUID="e06e94b1-2063-48f2-b8a7-0d0e4193f064"
Apr 16 22:13:53.015831 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:53.015805 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bdcch"
Apr 16 22:13:53.015951 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:53.015930 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bdcch" podUID="0b5f6846-8363-4956-b563-df34509912b0"
Apr 16 22:13:55.015286 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:55.015131 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qz5vc"
Apr 16 22:13:55.019215 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:55.019167 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qz5vc" podUID="e06e94b1-2063-48f2-b8a7-0d0e4193f064"
Apr 16 22:13:55.019215 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:55.019201 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bdcch"
Apr 16 22:13:55.019423 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:55.019353 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bdcch" podUID="0b5f6846-8363-4956-b563-df34509912b0"
Apr 16 22:13:56.575526 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:56.575488 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n5z6k\" (UniqueName: \"kubernetes.io/projected/0b5f6846-8363-4956-b563-df34509912b0-kube-api-access-n5z6k\") pod \"network-check-target-bdcch\" (UID: \"0b5f6846-8363-4956-b563-df34509912b0\") " pod="openshift-network-diagnostics/network-check-target-bdcch"
Apr 16 22:13:56.575992 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:56.575662 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:13:56.575992 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:56.575683 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:13:56.575992 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:56.575696 2565 projected.go:194] Error preparing data for projected volume kube-api-access-n5z6k for pod openshift-network-diagnostics/network-check-target-bdcch: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:56.575992 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:56.575756 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0b5f6846-8363-4956-b563-df34509912b0-kube-api-access-n5z6k podName:0b5f6846-8363-4956-b563-df34509912b0 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:04.575738892 +0000 UTC m=+18.121480303 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-n5z6k" (UniqueName: "kubernetes.io/projected/0b5f6846-8363-4956-b563-df34509912b0-kube-api-access-n5z6k") pod "network-check-target-bdcch" (UID: "0b5f6846-8363-4956-b563-df34509912b0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:56.676953 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:56.676907 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e06e94b1-2063-48f2-b8a7-0d0e4193f064-metrics-certs\") pod \"network-metrics-daemon-qz5vc\" (UID: \"e06e94b1-2063-48f2-b8a7-0d0e4193f064\") " pod="openshift-multus/network-metrics-daemon-qz5vc"
Apr 16 22:13:56.677120 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:56.677050 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:56.677183 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:56.677121 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e06e94b1-2063-48f2-b8a7-0d0e4193f064-metrics-certs podName:e06e94b1-2063-48f2-b8a7-0d0e4193f064 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:04.677099872 +0000 UTC m=+18.222841288 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e06e94b1-2063-48f2-b8a7-0d0e4193f064-metrics-certs") pod "network-metrics-daemon-qz5vc" (UID: "e06e94b1-2063-48f2-b8a7-0d0e4193f064") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:57.015939 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:57.015217 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qz5vc"
Apr 16 22:13:57.015939 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:57.015345 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qz5vc" podUID="e06e94b1-2063-48f2-b8a7-0d0e4193f064"
Apr 16 22:13:57.015939 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:57.015735 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bdcch"
Apr 16 22:13:57.015939 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:57.015839 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bdcch" podUID="0b5f6846-8363-4956-b563-df34509912b0"
Apr 16 22:13:59.013975 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:59.013703 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bdcch"
Apr 16 22:13:59.013975 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:59.013853 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bdcch" podUID="0b5f6846-8363-4956-b563-df34509912b0"
Apr 16 22:13:59.013975 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:13:59.013869 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qz5vc"
Apr 16 22:13:59.014616 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:13:59.013991 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qz5vc" podUID="e06e94b1-2063-48f2-b8a7-0d0e4193f064"
Apr 16 22:14:01.013833 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:01.013795 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qz5vc"
Apr 16 22:14:01.014275 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:01.013960 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qz5vc" podUID="e06e94b1-2063-48f2-b8a7-0d0e4193f064"
Apr 16 22:14:01.014275 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:01.014018 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bdcch"
Apr 16 22:14:01.014275 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:01.014116 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bdcch" podUID="0b5f6846-8363-4956-b563-df34509912b0"
Apr 16 22:14:03.014180 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:03.014143 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bdcch"
Apr 16 22:14:03.014593 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:03.014192 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qz5vc"
Apr 16 22:14:03.014593 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:03.014255 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bdcch" podUID="0b5f6846-8363-4956-b563-df34509912b0"
Apr 16 22:14:03.014593 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:03.014309 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qz5vc" podUID="e06e94b1-2063-48f2-b8a7-0d0e4193f064"
Apr 16 22:14:04.634147 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:04.634107 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n5z6k\" (UniqueName: \"kubernetes.io/projected/0b5f6846-8363-4956-b563-df34509912b0-kube-api-access-n5z6k\") pod \"network-check-target-bdcch\" (UID: \"0b5f6846-8363-4956-b563-df34509912b0\") " pod="openshift-network-diagnostics/network-check-target-bdcch"
Apr 16 22:14:04.634733 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:04.634297 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:14:04.634733 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:04.634326 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:14:04.634733 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:04.634338 2565 projected.go:194] Error preparing data for projected volume kube-api-access-n5z6k for pod openshift-network-diagnostics/network-check-target-bdcch: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:14:04.634733 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:04.634421 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0b5f6846-8363-4956-b563-df34509912b0-kube-api-access-n5z6k podName:0b5f6846-8363-4956-b563-df34509912b0 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:20.634385514 +0000 UTC m=+34.180126931 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-n5z6k" (UniqueName: "kubernetes.io/projected/0b5f6846-8363-4956-b563-df34509912b0-kube-api-access-n5z6k") pod "network-check-target-bdcch" (UID: "0b5f6846-8363-4956-b563-df34509912b0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:14:04.735210 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:04.735161 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e06e94b1-2063-48f2-b8a7-0d0e4193f064-metrics-certs\") pod \"network-metrics-daemon-qz5vc\" (UID: \"e06e94b1-2063-48f2-b8a7-0d0e4193f064\") " pod="openshift-multus/network-metrics-daemon-qz5vc"
Apr 16 22:14:04.735395 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:04.735327 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:14:04.735480 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:04.735426 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e06e94b1-2063-48f2-b8a7-0d0e4193f064-metrics-certs podName:e06e94b1-2063-48f2-b8a7-0d0e4193f064 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:20.735387944 +0000 UTC m=+34.281129349 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e06e94b1-2063-48f2-b8a7-0d0e4193f064-metrics-certs") pod "network-metrics-daemon-qz5vc" (UID: "e06e94b1-2063-48f2-b8a7-0d0e4193f064") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:14:05.013879 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:05.013842 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qz5vc"
Apr 16 22:14:05.014060 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:05.013846 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bdcch"
Apr 16 22:14:05.014060 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:05.013967 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qz5vc" podUID="e06e94b1-2063-48f2-b8a7-0d0e4193f064"
Apr 16 22:14:05.014060 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:05.014032 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bdcch" podUID="0b5f6846-8363-4956-b563-df34509912b0"
Apr 16 22:14:07.015128 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:07.014777 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qz5vc"
Apr 16 22:14:07.015937 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:07.014833 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bdcch"
Apr 16 22:14:07.015937 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:07.015243 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qz5vc" podUID="e06e94b1-2063-48f2-b8a7-0d0e4193f064"
Apr 16 22:14:07.015937 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:07.015321 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bdcch" podUID="0b5f6846-8363-4956-b563-df34509912b0"
Apr 16 22:14:07.131441 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:07.131382 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sgsmj" event={"ID":"33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9","Type":"ContainerStarted","Data":"5f23b56003cb1257c5f94fc946ad72c0a3c59e472386655ab4823de5a87b92fd"}
Apr 16 22:14:07.133024 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:07.132974 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tsbd9" event={"ID":"969daa2e-581d-4248-8104-2e50544de6b9","Type":"ContainerStarted","Data":"73330faeff5747adbe51b76503a708145ab2098024d637a6071802a058ae675b"}
Apr 16 22:14:07.134513 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:07.134489 2565 generic.go:358] "Generic (PLEG): container finished" podID="122992fa-4992-414a-8573-d77e9afd6b29" containerID="6f15573e606a57a6bf77b67b4ff3248d80c2b9437a5dcabb87d5958a4e56886b" exitCode=0
Apr 16 22:14:07.134644 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:07.134557 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8xzc6" event={"ID":"122992fa-4992-414a-8573-d77e9afd6b29","Type":"ContainerDied","Data":"6f15573e606a57a6bf77b67b4ff3248d80c2b9437a5dcabb87d5958a4e56886b"}
Apr 16 22:14:07.137589 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:07.137524 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl" event={"ID":"71427d26-ce68-4714-9505-292c288a5fdf","Type":"ContainerStarted","Data":"2311c6dcdde467e75f6ec448f39af7b610833d2b30454869ad3b1d319bee21ec"}
Apr 16 22:14:07.137589 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:07.137553 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl" event={"ID":"71427d26-ce68-4714-9505-292c288a5fdf","Type":"ContainerStarted","Data":"d517b8324e9a529417ea09507aaeb581ecf9ff1b3f681f7ceeb0139165d739fa"}
Apr 16 22:14:07.137589 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:07.137567 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl" event={"ID":"71427d26-ce68-4714-9505-292c288a5fdf","Type":"ContainerStarted","Data":"3bf60240899a37881e7d4dd16a4c6ad6150a3f651249998d2c015b767b4d687a"}
Apr 16 22:14:07.137589 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:07.137579 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl" event={"ID":"71427d26-ce68-4714-9505-292c288a5fdf","Type":"ContainerStarted","Data":"bf73ac1c823f6d2733f2ef0ff3359de0ee734c890bd4d3b6af359c1ee86d08bc"}
Apr 16 22:14:07.137589 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:07.137592 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl" event={"ID":"71427d26-ce68-4714-9505-292c288a5fdf","Type":"ContainerStarted","Data":"f2fdf1e10c9f321f5d94ab99045e2895cf2cfb0b369184415084ea9746b59a46"}
Apr 16 22:14:07.137903 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:07.137604 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl" event={"ID":"71427d26-ce68-4714-9505-292c288a5fdf","Type":"ContainerStarted","Data":"3eee4bbb4706103c7581796dd9cd4799fc37affc640e8348763ca41a686223fb"}
Apr 16 22:14:07.139696 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:07.139673 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wpfh2" event={"ID":"2327dcdf-f40d-43bf-905a-1404d6e339f7","Type":"ContainerStarted","Data":"4421ad712794884eb9c7264db3cffda6643a64d7c52518ffaed483c967ed809d"}
Apr 16 22:14:07.142110 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:07.142082 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zxkh8" event={"ID":"db688c76-0e69-4f78-bb5a-3e477f2575d3","Type":"ContainerStarted","Data":"4eb658f15792b34539d29643d663063145be490c4d547c3de4277559a477f202"}
Apr 16 22:14:07.143495 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:07.143462 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-dt42x" event={"ID":"db66a2b0-751f-4d43-8765-6ecfb4ef22eb","Type":"ContainerStarted","Data":"571c80c851a2eed73cff2d65b32de82a17c73b1a72340795e5762e648f282988"}
Apr 16 22:14:07.144655 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:07.144630 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dg4lb" event={"ID":"e1bdc782-5278-4724-89af-c7bf4325aea4","Type":"ContainerStarted","Data":"181f281ad5e204a161e4f1cf50d11f75685a42b36856e600a677e6e27c13b80c"}
Apr 16 22:14:07.151553 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:07.151508 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-169.ec2.internal" podStartSLOduration=19.151493761 podStartE2EDuration="19.151493761s" podCreationTimestamp="2026-04-16 22:13:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:13:52.1223875 +0000 UTC m=+5.668128928" watchObservedRunningTime="2026-04-16 22:14:07.151493761 +0000 UTC m=+20.697235189"
Apr 16 22:14:07.173130 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:07.173091 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-sgsmj" podStartSLOduration=3.6371727209999998 podStartE2EDuration="20.173077572s" podCreationTimestamp="2026-04-16 22:13:47 +0000 UTC" firstStartedPulling="2026-04-16 22:13:49.540452427 +0000 UTC m=+3.086193831" lastFinishedPulling="2026-04-16 22:14:06.076357272 +0000 UTC m=+19.622098682" observedRunningTime="2026-04-16 22:14:07.152793721 +0000 UTC m=+20.698535149" watchObservedRunningTime="2026-04-16 22:14:07.173077572 +0000 UTC m=+20.718818998"
Apr 16 22:14:07.187378 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:07.187342 2565
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-wpfh2" podStartSLOduration=3.688672472 podStartE2EDuration="20.187329758s" podCreationTimestamp="2026-04-16 22:13:47 +0000 UTC" firstStartedPulling="2026-04-16 22:13:49.540576669 +0000 UTC m=+3.086318074" lastFinishedPulling="2026-04-16 22:14:06.039233948 +0000 UTC m=+19.584975360" observedRunningTime="2026-04-16 22:14:07.187218419 +0000 UTC m=+20.732959846" watchObservedRunningTime="2026-04-16 22:14:07.187329758 +0000 UTC m=+20.733071184" Apr 16 22:14:07.187921 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:07.187892 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-dt42x" podStartSLOduration=3.694433849 podStartE2EDuration="20.187883095s" podCreationTimestamp="2026-04-16 22:13:47 +0000 UTC" firstStartedPulling="2026-04-16 22:13:49.546166825 +0000 UTC m=+3.091908236" lastFinishedPulling="2026-04-16 22:14:06.039616075 +0000 UTC m=+19.585357482" observedRunningTime="2026-04-16 22:14:07.172836861 +0000 UTC m=+20.718578291" watchObservedRunningTime="2026-04-16 22:14:07.187883095 +0000 UTC m=+20.733624523" Apr 16 22:14:07.203579 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:07.203554 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-dt42x" Apr 16 22:14:07.204344 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:07.204329 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-dt42x" Apr 16 22:14:07.204945 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:07.204924 2565 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 22:14:07.205228 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:07.205196 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-dg4lb" podStartSLOduration=3.706574161 podStartE2EDuration="20.205183916s" podCreationTimestamp="2026-04-16 22:13:47 +0000 UTC" firstStartedPulling="2026-04-16 22:13:49.543617837 +0000 UTC m=+3.089359246" lastFinishedPulling="2026-04-16 22:14:06.042227582 +0000 UTC m=+19.587969001" observedRunningTime="2026-04-16 22:14:07.204691847 +0000 UTC m=+20.750433274" watchObservedRunningTime="2026-04-16 22:14:07.205183916 +0000 UTC m=+20.750925344" Apr 16 22:14:07.249564 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:07.249516 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-tsbd9" podStartSLOduration=3.757276715 podStartE2EDuration="20.249503342s" podCreationTimestamp="2026-04-16 22:13:47 +0000 UTC" firstStartedPulling="2026-04-16 22:13:49.547484738 +0000 UTC m=+3.093226143" lastFinishedPulling="2026-04-16 22:14:06.039711358 +0000 UTC m=+19.585452770" observedRunningTime="2026-04-16 22:14:07.249479678 +0000 UTC m=+20.795221127" watchObservedRunningTime="2026-04-16 22:14:07.249503342 +0000 UTC m=+20.795244766" Apr 16 22:14:07.975321 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:07.975220 2565 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T22:14:07.204941056Z","UUID":"8c061cec-da4d-4339-859c-03a8127bbddb","Handler":null,"Name":"","Endpoint":""} Apr 16 22:14:07.977906 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:07.977884 
2565 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 22:14:07.978037 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:07.977915 2565 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 22:14:08.149111 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:08.149021 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zxkh8" event={"ID":"db688c76-0e69-4f78-bb5a-3e477f2575d3","Type":"ContainerStarted","Data":"589a078bbb5b738970b5e1e542f89a192eb66d8ac82f82636719d9cf570fdbdb"} Apr 16 22:14:08.149111 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:08.149066 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zxkh8" event={"ID":"db688c76-0e69-4f78-bb5a-3e477f2575d3","Type":"ContainerStarted","Data":"c489d19969caf4c02d12898eba73ffa8286b69d3947f653c940d391738ceee4d"} Apr 16 22:14:08.150645 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:08.150531 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-n5bvs" event={"ID":"3a701b0d-e8cf-4242-a773-67ba88c764b5","Type":"ContainerStarted","Data":"aa2a8e81735374ef78e8ee7d8c944df1f44a2f1bb55ef2bc4c6726b9ff2dc7ef"} Apr 16 22:14:08.151140 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:08.151103 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-dt42x" Apr 16 22:14:08.151473 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:08.151452 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-dt42x" Apr 16 22:14:08.179169 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:08.179119 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zxkh8" podStartSLOduration=2.827379712 podStartE2EDuration="21.179100431s" podCreationTimestamp="2026-04-16 22:13:47 +0000 UTC" firstStartedPulling="2026-04-16 22:13:49.537814144 +0000 UTC m=+3.083555563" lastFinishedPulling="2026-04-16 22:14:07.889534863 +0000 UTC m=+21.435276282" observedRunningTime="2026-04-16 22:14:08.165737523 +0000 UTC m=+21.711478950" watchObservedRunningTime="2026-04-16 22:14:08.179100431 +0000 UTC m=+21.724841872" Apr 16 22:14:08.193426 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:08.193363 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-n5bvs" podStartSLOduration=4.702010924 podStartE2EDuration="21.193351799s" podCreationTimestamp="2026-04-16 22:13:47 +0000 UTC" firstStartedPulling="2026-04-16 22:13:49.547897659 +0000 UTC m=+3.093639064" lastFinishedPulling="2026-04-16 22:14:06.039238517 +0000 UTC m=+19.584979939" observedRunningTime="2026-04-16 22:14:08.19306065 +0000 UTC m=+21.738802078" watchObservedRunningTime="2026-04-16 22:14:08.193351799 +0000 UTC m=+21.739093225" Apr 16 22:14:09.017599 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:09.017281 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bdcch" Apr 16 22:14:09.017599 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:09.017281 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qz5vc" Apr 16 22:14:09.017787 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:09.017661 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bdcch" podUID="0b5f6846-8363-4956-b563-df34509912b0" Apr 16 22:14:09.017833 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:09.017789 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qz5vc" podUID="e06e94b1-2063-48f2-b8a7-0d0e4193f064" Apr 16 22:14:09.155670 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:09.155636 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl" event={"ID":"71427d26-ce68-4714-9505-292c288a5fdf","Type":"ContainerStarted","Data":"1cefb0dc8589f915e848e4d6bf02de633fa048b78d39374940ee7c290eaa173e"} Apr 16 22:14:11.014262 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:11.014231 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bdcch" Apr 16 22:14:11.014825 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:11.014359 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bdcch" podUID="0b5f6846-8363-4956-b563-df34509912b0" Apr 16 22:14:11.014825 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:11.014402 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qz5vc" Apr 16 22:14:11.014825 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:11.014523 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qz5vc" podUID="e06e94b1-2063-48f2-b8a7-0d0e4193f064" Apr 16 22:14:11.163300 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:11.163103 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl" event={"ID":"71427d26-ce68-4714-9505-292c288a5fdf","Type":"ContainerStarted","Data":"b4ce60ee2da121b88be8b6feb415f07ecc0dbfac16cc4b88df3147bfc9ec09ac"} Apr 16 22:14:11.163633 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:11.163462 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl" Apr 16 22:14:11.163633 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:11.163492 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl" Apr 16 22:14:11.163633 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:11.163505 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl" Apr 16 22:14:11.180358 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:11.180330 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl" Apr 16 22:14:11.180955 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:11.180941 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl" Apr 16 22:14:11.188835 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:11.188640 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl" podStartSLOduration=7.435304486 podStartE2EDuration="24.188626397s" podCreationTimestamp="2026-04-16 22:13:47 +0000 UTC" firstStartedPulling="2026-04-16 22:13:49.544906211 +0000 UTC m=+3.090647617" lastFinishedPulling="2026-04-16 22:14:06.298228121 +0000 UTC m=+19.843969528" observedRunningTime="2026-04-16 22:14:11.188394217 +0000 UTC m=+24.734135656" watchObservedRunningTime="2026-04-16 22:14:11.188626397 +0000 UTC m=+24.734367823" Apr 16 22:14:12.167096 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:12.166895 2565 generic.go:358] "Generic (PLEG): container finished" podID="122992fa-4992-414a-8573-d77e9afd6b29" containerID="7ee5edeea8788321c8daa5bb73d05c16ab9fafb92853d341796633df9c9060f5" exitCode=0 Apr 16 22:14:12.167806 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:12.166983 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8xzc6" event={"ID":"122992fa-4992-414a-8573-d77e9afd6b29","Type":"ContainerDied","Data":"7ee5edeea8788321c8daa5bb73d05c16ab9fafb92853d341796633df9c9060f5"} Apr 16 22:14:13.013678 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:13.013649 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qz5vc" Apr 16 22:14:13.013678 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:13.013682 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bdcch" Apr 16 22:14:13.013923 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:13.013752 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qz5vc" podUID="e06e94b1-2063-48f2-b8a7-0d0e4193f064" Apr 16 22:14:13.013923 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:13.013853 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bdcch" podUID="0b5f6846-8363-4956-b563-df34509912b0" Apr 16 22:14:13.047609 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:13.047571 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qz5vc"] Apr 16 22:14:13.049871 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:13.049845 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bdcch"] Apr 16 22:14:13.168970 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:13.168942 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bdcch" Apr 16 22:14:13.168970 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:13.168966 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qz5vc" Apr 16 22:14:13.169455 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:13.169078 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bdcch" podUID="0b5f6846-8363-4956-b563-df34509912b0" Apr 16 22:14:13.169523 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:13.169502 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qz5vc" podUID="e06e94b1-2063-48f2-b8a7-0d0e4193f064" Apr 16 22:14:14.172495 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:14.172463 2565 generic.go:358] "Generic (PLEG): container finished" podID="122992fa-4992-414a-8573-d77e9afd6b29" containerID="ff8cf32624a7ccec6e768f4eef7a3f39762bdd03c0a896cdba7b7ff79d8969d0" exitCode=0 Apr 16 22:14:14.172858 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:14.172523 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8xzc6" event={"ID":"122992fa-4992-414a-8573-d77e9afd6b29","Type":"ContainerDied","Data":"ff8cf32624a7ccec6e768f4eef7a3f39762bdd03c0a896cdba7b7ff79d8969d0"} Apr 16 22:14:15.014317 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:15.014280 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bdcch" Apr 16 22:14:15.014506 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:15.014391 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bdcch" podUID="0b5f6846-8363-4956-b563-df34509912b0" Apr 16 22:14:15.014506 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:15.014478 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qz5vc" Apr 16 22:14:15.014623 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:15.014575 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qz5vc" podUID="e06e94b1-2063-48f2-b8a7-0d0e4193f064" Apr 16 22:14:16.178086 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:16.177900 2565 generic.go:358] "Generic (PLEG): container finished" podID="122992fa-4992-414a-8573-d77e9afd6b29" containerID="bcf7e5d9383e62920d24af7f4acb689f1464759058ec29e2191ea76f6ba78a27" exitCode=0 Apr 16 22:14:16.178086 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:16.177989 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8xzc6" event={"ID":"122992fa-4992-414a-8573-d77e9afd6b29","Type":"ContainerDied","Data":"bcf7e5d9383e62920d24af7f4acb689f1464759058ec29e2191ea76f6ba78a27"} Apr 16 22:14:17.014504 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:17.014478 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qz5vc" Apr 16 22:14:17.014699 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:17.014609 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qz5vc" podUID="e06e94b1-2063-48f2-b8a7-0d0e4193f064" Apr 16 22:14:17.014699 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:17.014686 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bdcch" Apr 16 22:14:17.014813 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:17.014780 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bdcch" podUID="0b5f6846-8363-4956-b563-df34509912b0" Apr 16 22:14:19.014100 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:19.014026 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qz5vc" Apr 16 22:14:19.014100 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:19.014071 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bdcch" Apr 16 22:14:19.014778 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:19.014172 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qz5vc" podUID="e06e94b1-2063-48f2-b8a7-0d0e4193f064" Apr 16 22:14:19.014778 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:19.014313 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bdcch" podUID="0b5f6846-8363-4956-b563-df34509912b0" Apr 16 22:14:19.309394 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:19.309316 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-169.ec2.internal" event="NodeReady" Apr 16 22:14:19.309580 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:19.309481 2565 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 22:14:19.355830 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:19.355799 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-86xzj"] Apr 16 22:14:19.382206 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:19.382173 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-gxxg4"] Apr 16 22:14:19.382366 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:19.382337 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-86xzj" Apr 16 22:14:19.385300 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:19.385273 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 22:14:19.385733 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:19.385705 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 22:14:19.385848 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:19.385751 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-66k94\"" Apr 16 22:14:19.398182 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:19.398162 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-86xzj"] Apr 16 22:14:19.398182 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:19.398184 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gxxg4"] Apr 16 22:14:19.398350 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:19.398274 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gxxg4" Apr 16 22:14:19.401677 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:19.401655 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 22:14:19.401774 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:19.401714 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 22:14:19.401774 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:19.401655 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-b4hjg\"" Apr 16 22:14:19.401896 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:19.401773 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 22:14:19.547703 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:19.547667 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4a447af-7f68-4189-bc97-af653fe8ba76-cert\") pod \"ingress-canary-gxxg4\" (UID: \"e4a447af-7f68-4189-bc97-af653fe8ba76\") " pod="openshift-ingress-canary/ingress-canary-gxxg4" Apr 16 22:14:19.547886 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:19.547746 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/94b66058-cae9-48ec-b576-71611c7b606e-tmp-dir\") pod \"dns-default-86xzj\" (UID: \"94b66058-cae9-48ec-b576-71611c7b606e\") " pod="openshift-dns/dns-default-86xzj" Apr 16 22:14:19.547886 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:19.547789 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbzhs\" (UniqueName: \"kubernetes.io/projected/e4a447af-7f68-4189-bc97-af653fe8ba76-kube-api-access-tbzhs\") pod \"ingress-canary-gxxg4\" (UID: \"e4a447af-7f68-4189-bc97-af653fe8ba76\") " pod="openshift-ingress-canary/ingress-canary-gxxg4" Apr 16 22:14:19.547886 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:19.547808 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b95t6\" (UniqueName: \"kubernetes.io/projected/94b66058-cae9-48ec-b576-71611c7b606e-kube-api-access-b95t6\") pod \"dns-default-86xzj\" (UID: \"94b66058-cae9-48ec-b576-71611c7b606e\") " pod="openshift-dns/dns-default-86xzj" Apr 16 22:14:19.547886 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:19.547826 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/94b66058-cae9-48ec-b576-71611c7b606e-metrics-tls\") pod \"dns-default-86xzj\" (UID: \"94b66058-cae9-48ec-b576-71611c7b606e\") " pod="openshift-dns/dns-default-86xzj" Apr 16 22:14:19.547886 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:19.547846 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/94b66058-cae9-48ec-b576-71611c7b606e-config-volume\") pod \"dns-default-86xzj\" (UID: \"94b66058-cae9-48ec-b576-71611c7b606e\") " pod="openshift-dns/dns-default-86xzj" Apr 16 22:14:19.649032 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:19.648937 2565 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/94b66058-cae9-48ec-b576-71611c7b606e-tmp-dir\") pod \"dns-default-86xzj\" (UID: \"94b66058-cae9-48ec-b576-71611c7b606e\") " pod="openshift-dns/dns-default-86xzj" Apr 16 22:14:19.649032 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:19.648979 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tbzhs\" (UniqueName: \"kubernetes.io/projected/e4a447af-7f68-4189-bc97-af653fe8ba76-kube-api-access-tbzhs\") pod \"ingress-canary-gxxg4\" (UID: \"e4a447af-7f68-4189-bc97-af653fe8ba76\") " pod="openshift-ingress-canary/ingress-canary-gxxg4" Apr 16 22:14:19.649032 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:19.649003 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b95t6\" (UniqueName: \"kubernetes.io/projected/94b66058-cae9-48ec-b576-71611c7b606e-kube-api-access-b95t6\") pod \"dns-default-86xzj\" (UID: \"94b66058-cae9-48ec-b576-71611c7b606e\") " pod="openshift-dns/dns-default-86xzj" Apr 16 22:14:19.649032 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:19.649029 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/94b66058-cae9-48ec-b576-71611c7b606e-metrics-tls\") pod \"dns-default-86xzj\" (UID: \"94b66058-cae9-48ec-b576-71611c7b606e\") " pod="openshift-dns/dns-default-86xzj" Apr 16 22:14:19.649283 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:19.649060 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/94b66058-cae9-48ec-b576-71611c7b606e-config-volume\") pod \"dns-default-86xzj\" (UID: \"94b66058-cae9-48ec-b576-71611c7b606e\") " pod="openshift-dns/dns-default-86xzj" Apr 16 22:14:19.649283 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:19.649093 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4a447af-7f68-4189-bc97-af653fe8ba76-cert\") pod \"ingress-canary-gxxg4\" (UID: \"e4a447af-7f68-4189-bc97-af653fe8ba76\") " pod="openshift-ingress-canary/ingress-canary-gxxg4" Apr 16 22:14:19.649283 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:19.649209 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:19.649283 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:19.649275 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94b66058-cae9-48ec-b576-71611c7b606e-metrics-tls podName:94b66058-cae9-48ec-b576-71611c7b606e nodeName:}" failed. No retries permitted until 2026-04-16 22:14:20.149255196 +0000 UTC m=+33.694996615 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/94b66058-cae9-48ec-b576-71611c7b606e-metrics-tls") pod "dns-default-86xzj" (UID: "94b66058-cae9-48ec-b576-71611c7b606e") : secret "dns-default-metrics-tls" not found Apr 16 22:14:19.649500 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:19.649213 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:19.649500 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:19.649330 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/94b66058-cae9-48ec-b576-71611c7b606e-tmp-dir\") pod \"dns-default-86xzj\" (UID: \"94b66058-cae9-48ec-b576-71611c7b606e\") " pod="openshift-dns/dns-default-86xzj" Apr 16 22:14:19.649500 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:19.649342 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4a447af-7f68-4189-bc97-af653fe8ba76-cert podName:e4a447af-7f68-4189-bc97-af653fe8ba76 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:20.149328488 +0000 UTC m=+33.695069909 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4a447af-7f68-4189-bc97-af653fe8ba76-cert") pod "ingress-canary-gxxg4" (UID: "e4a447af-7f68-4189-bc97-af653fe8ba76") : secret "canary-serving-cert" not found Apr 16 22:14:19.649681 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:19.649661 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/94b66058-cae9-48ec-b576-71611c7b606e-config-volume\") pod \"dns-default-86xzj\" (UID: \"94b66058-cae9-48ec-b576-71611c7b606e\") " pod="openshift-dns/dns-default-86xzj" Apr 16 22:14:19.660679 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:19.660650 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b95t6\" (UniqueName: \"kubernetes.io/projected/94b66058-cae9-48ec-b576-71611c7b606e-kube-api-access-b95t6\") pod \"dns-default-86xzj\" (UID: \"94b66058-cae9-48ec-b576-71611c7b606e\") " pod="openshift-dns/dns-default-86xzj" Apr 16 22:14:19.660819 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:19.660661 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbzhs\" (UniqueName: \"kubernetes.io/projected/e4a447af-7f68-4189-bc97-af653fe8ba76-kube-api-access-tbzhs\") pod \"ingress-canary-gxxg4\" (UID: \"e4a447af-7f68-4189-bc97-af653fe8ba76\") " pod="openshift-ingress-canary/ingress-canary-gxxg4" Apr 16 22:14:20.043567 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:20.043531 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7cf5c5998f-qdfxb"] Apr 16 22:14:20.070552 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:20.070518 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7cf5c5998f-qdfxb"] Apr 16 22:14:20.070731 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:20.070670 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7cf5c5998f-qdfxb" Apr 16 22:14:20.074226 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:20.074199 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 16 22:14:20.075046 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:20.074434 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 22:14:20.075182 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:20.075126 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 22:14:20.075341 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:20.075318 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 22:14:20.077198 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:20.075550 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-bncmf\"" Apr 16 22:14:20.153907 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:20.153859 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kds5l\" (UniqueName: \"kubernetes.io/projected/b3841cbb-cef3-4f92-8234-40baf97239eb-kube-api-access-kds5l\") pod \"managed-serviceaccount-addon-agent-7cf5c5998f-qdfxb\" (UID: \"b3841cbb-cef3-4f92-8234-40baf97239eb\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7cf5c5998f-qdfxb" Apr 16 22:14:20.154088 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:20.153926 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/94b66058-cae9-48ec-b576-71611c7b606e-metrics-tls\") pod \"dns-default-86xzj\" (UID: \"94b66058-cae9-48ec-b576-71611c7b606e\") " pod="openshift-dns/dns-default-86xzj" Apr 16 22:14:20.154088 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:20.153958 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b3841cbb-cef3-4f92-8234-40baf97239eb-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7cf5c5998f-qdfxb\" (UID: \"b3841cbb-cef3-4f92-8234-40baf97239eb\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7cf5c5998f-qdfxb" Apr 16 22:14:20.154088 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:20.153998 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4a447af-7f68-4189-bc97-af653fe8ba76-cert\") pod \"ingress-canary-gxxg4\" (UID: \"e4a447af-7f68-4189-bc97-af653fe8ba76\") " pod="openshift-ingress-canary/ingress-canary-gxxg4" Apr 16 22:14:20.154207 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:20.154107 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:20.154207 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:20.154166 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94b66058-cae9-48ec-b576-71611c7b606e-metrics-tls podName:94b66058-cae9-48ec-b576-71611c7b606e 
nodeName:}" failed. No retries permitted until 2026-04-16 22:14:21.154151535 +0000 UTC m=+34.699892940 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/94b66058-cae9-48ec-b576-71611c7b606e-metrics-tls") pod "dns-default-86xzj" (UID: "94b66058-cae9-48ec-b576-71611c7b606e") : secret "dns-default-metrics-tls" not found Apr 16 22:14:20.154305 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:20.154245 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:20.154345 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:20.154313 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4a447af-7f68-4189-bc97-af653fe8ba76-cert podName:e4a447af-7f68-4189-bc97-af653fe8ba76 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:21.154295337 +0000 UTC m=+34.700036757 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4a447af-7f68-4189-bc97-af653fe8ba76-cert") pod "ingress-canary-gxxg4" (UID: "e4a447af-7f68-4189-bc97-af653fe8ba76") : secret "canary-serving-cert" not found Apr 16 22:14:20.254453 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:20.254396 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kds5l\" (UniqueName: \"kubernetes.io/projected/b3841cbb-cef3-4f92-8234-40baf97239eb-kube-api-access-kds5l\") pod \"managed-serviceaccount-addon-agent-7cf5c5998f-qdfxb\" (UID: \"b3841cbb-cef3-4f92-8234-40baf97239eb\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7cf5c5998f-qdfxb" Apr 16 22:14:20.254631 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:20.254471 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b3841cbb-cef3-4f92-8234-40baf97239eb-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7cf5c5998f-qdfxb\" (UID: \"b3841cbb-cef3-4f92-8234-40baf97239eb\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7cf5c5998f-qdfxb" Apr 16 22:14:20.257293 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:20.257259 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b3841cbb-cef3-4f92-8234-40baf97239eb-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7cf5c5998f-qdfxb\" (UID: \"b3841cbb-cef3-4f92-8234-40baf97239eb\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7cf5c5998f-qdfxb" Apr 16 22:14:20.263639 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:20.263612 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kds5l\" (UniqueName: \"kubernetes.io/projected/b3841cbb-cef3-4f92-8234-40baf97239eb-kube-api-access-kds5l\") pod \"managed-serviceaccount-addon-agent-7cf5c5998f-qdfxb\" (UID: \"b3841cbb-cef3-4f92-8234-40baf97239eb\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7cf5c5998f-qdfxb" Apr 16 22:14:20.393418 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:20.393322 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7cf5c5998f-qdfxb" Apr 16 22:14:20.658361 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:20.658276 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n5z6k\" (UniqueName: \"kubernetes.io/projected/0b5f6846-8363-4956-b563-df34509912b0-kube-api-access-n5z6k\") pod \"network-check-target-bdcch\" (UID: \"0b5f6846-8363-4956-b563-df34509912b0\") " pod="openshift-network-diagnostics/network-check-target-bdcch" Apr 16 22:14:20.658537 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:20.658492 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:14:20.658537 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:20.658518 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:14:20.658537 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:20.658529 2565 projected.go:194] Error preparing data for projected volume kube-api-access-n5z6k for pod openshift-network-diagnostics/network-check-target-bdcch: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:14:20.658694 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:20.658599 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0b5f6846-8363-4956-b563-df34509912b0-kube-api-access-n5z6k podName:0b5f6846-8363-4956-b563-df34509912b0 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:52.658578333 +0000 UTC m=+66.204319751 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-n5z6k" (UniqueName: "kubernetes.io/projected/0b5f6846-8363-4956-b563-df34509912b0-kube-api-access-n5z6k") pod "network-check-target-bdcch" (UID: "0b5f6846-8363-4956-b563-df34509912b0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:14:20.759334 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:20.759275 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e06e94b1-2063-48f2-b8a7-0d0e4193f064-metrics-certs\") pod \"network-metrics-daemon-qz5vc\" (UID: \"e06e94b1-2063-48f2-b8a7-0d0e4193f064\") " pod="openshift-multus/network-metrics-daemon-qz5vc" Apr 16 22:14:20.759539 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:20.759447 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:14:20.759539 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:20.759521 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e06e94b1-2063-48f2-b8a7-0d0e4193f064-metrics-certs podName:e06e94b1-2063-48f2-b8a7-0d0e4193f064 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:52.759502615 +0000 UTC m=+66.305244033 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e06e94b1-2063-48f2-b8a7-0d0e4193f064-metrics-certs") pod "network-metrics-daemon-qz5vc" (UID: "e06e94b1-2063-48f2-b8a7-0d0e4193f064") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:14:21.014308 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:21.014270 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qz5vc" Apr 16 22:14:21.014505 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:21.014308 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bdcch" Apr 16 22:14:21.017079 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:21.017058 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 22:14:21.017227 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:21.017089 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zhbqt\"" Apr 16 22:14:21.018017 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:21.017992 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 22:14:21.018017 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:21.018013 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 22:14:21.018178 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:21.017992 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-vqjpn\"" Apr 16 22:14:21.163316 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:21.163282 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/94b66058-cae9-48ec-b576-71611c7b606e-metrics-tls\") pod \"dns-default-86xzj\" (UID: \"94b66058-cae9-48ec-b576-71611c7b606e\") " pod="openshift-dns/dns-default-86xzj" Apr 16 22:14:21.163786 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:21.163358 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4a447af-7f68-4189-bc97-af653fe8ba76-cert\") pod \"ingress-canary-gxxg4\" (UID: \"e4a447af-7f68-4189-bc97-af653fe8ba76\") " pod="openshift-ingress-canary/ingress-canary-gxxg4" Apr 16 22:14:21.163786 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:21.163467 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:21.163786 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:21.163495 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:21.163786 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:21.163531 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94b66058-cae9-48ec-b576-71611c7b606e-metrics-tls podName:94b66058-cae9-48ec-b576-71611c7b606e nodeName:}" failed. No retries permitted until 2026-04-16 22:14:23.163512167 +0000 UTC m=+36.709253581 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/94b66058-cae9-48ec-b576-71611c7b606e-metrics-tls") pod "dns-default-86xzj" (UID: "94b66058-cae9-48ec-b576-71611c7b606e") : secret "dns-default-metrics-tls" not found Apr 16 22:14:21.163786 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:21.163548 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4a447af-7f68-4189-bc97-af653fe8ba76-cert podName:e4a447af-7f68-4189-bc97-af653fe8ba76 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:23.16354099 +0000 UTC m=+36.709282397 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4a447af-7f68-4189-bc97-af653fe8ba76-cert") pod "ingress-canary-gxxg4" (UID: "e4a447af-7f68-4189-bc97-af653fe8ba76") : secret "canary-serving-cert" not found Apr 16 22:14:21.937751 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:21.937570 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7cf5c5998f-qdfxb"] Apr 16 22:14:22.019098 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:14:22.019008 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3841cbb_cef3_4f92_8234_40baf97239eb.slice/crio-5e73d983ec230b25db806f7341e647955178d05a9c586a32002dc7685b296f8d WatchSource:0}: Error finding container 5e73d983ec230b25db806f7341e647955178d05a9c586a32002dc7685b296f8d: Status 404 returned error can't find the container with id 5e73d983ec230b25db806f7341e647955178d05a9c586a32002dc7685b296f8d Apr 16 22:14:22.192319 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:22.192283 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8xzc6" event={"ID":"122992fa-4992-414a-8573-d77e9afd6b29","Type":"ContainerStarted","Data":"0a6eabde1d3f5fc7e4147a5e89e05e973fdadaaa5d5426c091d7f5f5712fba4c"} Apr 16 22:14:22.193315 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:22.193288 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7cf5c5998f-qdfxb" event={"ID":"b3841cbb-cef3-4f92-8234-40baf97239eb","Type":"ContainerStarted","Data":"5e73d983ec230b25db806f7341e647955178d05a9c586a32002dc7685b296f8d"} Apr 16 22:14:23.178348 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:23.178268 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4a447af-7f68-4189-bc97-af653fe8ba76-cert\") pod \"ingress-canary-gxxg4\" (UID: \"e4a447af-7f68-4189-bc97-af653fe8ba76\") " pod="openshift-ingress-canary/ingress-canary-gxxg4" Apr 16 22:14:23.178560 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:23.178482 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:23.178623 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:23.178565 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4a447af-7f68-4189-bc97-af653fe8ba76-cert podName:e4a447af-7f68-4189-bc97-af653fe8ba76 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:27.178542921 +0000 UTC m=+40.724284347 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4a447af-7f68-4189-bc97-af653fe8ba76-cert") pod "ingress-canary-gxxg4" (UID: "e4a447af-7f68-4189-bc97-af653fe8ba76") : secret "canary-serving-cert" not found Apr 16 22:14:23.178685 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:23.178638 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/94b66058-cae9-48ec-b576-71611c7b606e-metrics-tls\") pod \"dns-default-86xzj\" (UID: \"94b66058-cae9-48ec-b576-71611c7b606e\") " pod="openshift-dns/dns-default-86xzj" Apr 16 22:14:23.178777 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:23.178746 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:23.178831 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:23.178792 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94b66058-cae9-48ec-b576-71611c7b606e-metrics-tls podName:94b66058-cae9-48ec-b576-71611c7b606e nodeName:}" failed. No retries permitted until 2026-04-16 22:14:27.17878298 +0000 UTC m=+40.724524385 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/94b66058-cae9-48ec-b576-71611c7b606e-metrics-tls") pod "dns-default-86xzj" (UID: "94b66058-cae9-48ec-b576-71611c7b606e") : secret "dns-default-metrics-tls" not found Apr 16 22:14:23.199322 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:23.199288 2565 generic.go:358] "Generic (PLEG): container finished" podID="122992fa-4992-414a-8573-d77e9afd6b29" containerID="0a6eabde1d3f5fc7e4147a5e89e05e973fdadaaa5d5426c091d7f5f5712fba4c" exitCode=0 Apr 16 22:14:23.200982 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:23.199363 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8xzc6" event={"ID":"122992fa-4992-414a-8573-d77e9afd6b29","Type":"ContainerDied","Data":"0a6eabde1d3f5fc7e4147a5e89e05e973fdadaaa5d5426c091d7f5f5712fba4c"} Apr 16 22:14:24.204130 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:24.204091 2565 generic.go:358] "Generic (PLEG): container finished" podID="122992fa-4992-414a-8573-d77e9afd6b29" containerID="1d8ae10b97e0ec6a1e642f48cc363ee41f14dc940d9bf72ae1af6e20bc76f530" exitCode=0 Apr 16 22:14:24.204637 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:24.204152 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8xzc6" event={"ID":"122992fa-4992-414a-8573-d77e9afd6b29","Type":"ContainerDied","Data":"1d8ae10b97e0ec6a1e642f48cc363ee41f14dc940d9bf72ae1af6e20bc76f530"} Apr 16 22:14:26.209669 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:26.209627 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7cf5c5998f-qdfxb" event={"ID":"b3841cbb-cef3-4f92-8234-40baf97239eb","Type":"ContainerStarted","Data":"ae7110c4e651a570ae44cdf928c810e5c4e928477b7107ef3b5ee3b11998029f"} Apr 16 22:14:26.212512 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:26.212484 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8xzc6" event={"ID":"122992fa-4992-414a-8573-d77e9afd6b29","Type":"ContainerStarted","Data":"ce07cfad0d7b893c5dd83ca0afc317fe1c4e695be2f8123b974843f5a137fd39"} Apr 16 22:14:26.224794 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:26.224739 2565 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7cf5c5998f-qdfxb" podStartSLOduration=2.7040355209999998 podStartE2EDuration="6.224724414s" podCreationTimestamp="2026-04-16 22:14:20 +0000 UTC" firstStartedPulling="2026-04-16 22:14:22.02405366 +0000 UTC m=+35.569795065" lastFinishedPulling="2026-04-16 22:14:25.54474255 +0000 UTC m=+39.090483958" observedRunningTime="2026-04-16 22:14:26.224719603 +0000 UTC m=+39.770461031" watchObservedRunningTime="2026-04-16 22:14:26.224724414 +0000 UTC m=+39.770465842" Apr 16 22:14:26.245026 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:26.244987 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-8xzc6" podStartSLOduration=6.743724201 podStartE2EDuration="39.244973456s" podCreationTimestamp="2026-04-16 22:13:47 +0000 UTC" firstStartedPulling="2026-04-16 22:13:49.545218124 +0000 UTC m=+3.090959538" lastFinishedPulling="2026-04-16 22:14:22.046467382 +0000 UTC m=+35.592208793" observedRunningTime="2026-04-16 22:14:26.24339835 +0000 UTC m=+39.789139776" watchObservedRunningTime="2026-04-16 22:14:26.244973456 +0000 UTC m=+39.790714882" Apr 16 22:14:27.210594 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:27.210540 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4a447af-7f68-4189-bc97-af653fe8ba76-cert\") pod \"ingress-canary-gxxg4\" (UID: \"e4a447af-7f68-4189-bc97-af653fe8ba76\") " pod="openshift-ingress-canary/ingress-canary-gxxg4" Apr 16 22:14:27.211029 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:27.210632 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/94b66058-cae9-48ec-b576-71611c7b606e-metrics-tls\") pod \"dns-default-86xzj\" (UID: \"94b66058-cae9-48ec-b576-71611c7b606e\") " pod="openshift-dns/dns-default-86xzj" Apr 16 22:14:27.211029 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:27.210709 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:27.211029 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:27.210707 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:27.211029 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:27.210759 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94b66058-cae9-48ec-b576-71611c7b606e-metrics-tls podName:94b66058-cae9-48ec-b576-71611c7b606e nodeName:}" failed. No retries permitted until 2026-04-16 22:14:35.210746562 +0000 UTC m=+48.756487967 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/94b66058-cae9-48ec-b576-71611c7b606e-metrics-tls") pod "dns-default-86xzj" (UID: "94b66058-cae9-48ec-b576-71611c7b606e") : secret "dns-default-metrics-tls" not found Apr 16 22:14:27.211029 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:27.210773 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4a447af-7f68-4189-bc97-af653fe8ba76-cert podName:e4a447af-7f68-4189-bc97-af653fe8ba76 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:35.210766841 +0000 UTC m=+48.756508245 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4a447af-7f68-4189-bc97-af653fe8ba76-cert") pod "ingress-canary-gxxg4" (UID: "e4a447af-7f68-4189-bc97-af653fe8ba76") : secret "canary-serving-cert" not found Apr 16 22:14:35.269069 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:35.269018 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4a447af-7f68-4189-bc97-af653fe8ba76-cert\") pod \"ingress-canary-gxxg4\" (UID: \"e4a447af-7f68-4189-bc97-af653fe8ba76\") " pod="openshift-ingress-canary/ingress-canary-gxxg4" Apr 16 22:14:35.269628 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:35.269136 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/94b66058-cae9-48ec-b576-71611c7b606e-metrics-tls\") pod \"dns-default-86xzj\" (UID: \"94b66058-cae9-48ec-b576-71611c7b606e\") " pod="openshift-dns/dns-default-86xzj" Apr 16 22:14:35.269628 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:35.269179 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:35.269628 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:35.269232 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:35.269628 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:35.269250 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4a447af-7f68-4189-bc97-af653fe8ba76-cert podName:e4a447af-7f68-4189-bc97-af653fe8ba76 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:51.269230669 +0000 UTC m=+64.814972088 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4a447af-7f68-4189-bc97-af653fe8ba76-cert") pod "ingress-canary-gxxg4" (UID: "e4a447af-7f68-4189-bc97-af653fe8ba76") : secret "canary-serving-cert" not found Apr 16 22:14:35.269628 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:35.269266 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94b66058-cae9-48ec-b576-71611c7b606e-metrics-tls podName:94b66058-cae9-48ec-b576-71611c7b606e nodeName:}" failed. No retries permitted until 2026-04-16 22:14:51.269256959 +0000 UTC m=+64.814998383 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/94b66058-cae9-48ec-b576-71611c7b606e-metrics-tls") pod "dns-default-86xzj" (UID: "94b66058-cae9-48ec-b576-71611c7b606e") : secret "dns-default-metrics-tls" not found Apr 16 22:14:43.183605 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:43.183570 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tp6wl" Apr 16 22:14:51.281787 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:51.281750 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/94b66058-cae9-48ec-b576-71611c7b606e-metrics-tls\") pod \"dns-default-86xzj\" (UID: \"94b66058-cae9-48ec-b576-71611c7b606e\") " pod="openshift-dns/dns-default-86xzj" Apr 16 22:14:51.281787 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:51.281799 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4a447af-7f68-4189-bc97-af653fe8ba76-cert\") pod \"ingress-canary-gxxg4\" (UID: \"e4a447af-7f68-4189-bc97-af653fe8ba76\") " pod="openshift-ingress-canary/ingress-canary-gxxg4" Apr 16 22:14:51.282317 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:51.281918 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:51.282317 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:51.282016 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:51.282317 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:51.282019 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94b66058-cae9-48ec-b576-71611c7b606e-metrics-tls podName:94b66058-cae9-48ec-b576-71611c7b606e nodeName:}" failed. No retries permitted until 2026-04-16 22:15:23.281989937 +0000 UTC m=+96.827731343 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/94b66058-cae9-48ec-b576-71611c7b606e-metrics-tls") pod "dns-default-86xzj" (UID: "94b66058-cae9-48ec-b576-71611c7b606e") : secret "dns-default-metrics-tls" not found Apr 16 22:14:51.282317 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:51.282070 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4a447af-7f68-4189-bc97-af653fe8ba76-cert podName:e4a447af-7f68-4189-bc97-af653fe8ba76 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:23.282058353 +0000 UTC m=+96.827799759 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4a447af-7f68-4189-bc97-af653fe8ba76-cert") pod "ingress-canary-gxxg4" (UID: "e4a447af-7f68-4189-bc97-af653fe8ba76") : secret "canary-serving-cert" not found Apr 16 22:14:52.691336 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:52.691291 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n5z6k\" (UniqueName: \"kubernetes.io/projected/0b5f6846-8363-4956-b563-df34509912b0-kube-api-access-n5z6k\") pod \"network-check-target-bdcch\" (UID: \"0b5f6846-8363-4956-b563-df34509912b0\") " pod="openshift-network-diagnostics/network-check-target-bdcch" Apr 16 22:14:52.693853 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:52.693834 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 22:14:52.703663 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:52.703635 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 22:14:52.715927 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:52.715896 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5z6k\" (UniqueName: \"kubernetes.io/projected/0b5f6846-8363-4956-b563-df34509912b0-kube-api-access-n5z6k\") pod \"network-check-target-bdcch\" (UID: \"0b5f6846-8363-4956-b563-df34509912b0\") " pod="openshift-network-diagnostics/network-check-target-bdcch" Apr 16 22:14:52.792074 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:52.792036 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e06e94b1-2063-48f2-b8a7-0d0e4193f064-metrics-certs\") pod \"network-metrics-daemon-qz5vc\" (UID: \"e06e94b1-2063-48f2-b8a7-0d0e4193f064\") " pod="openshift-multus/network-metrics-daemon-qz5vc" Apr 16 22:14:52.794556 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:52.794536 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 22:14:52.802268 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:52.802244 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 22:14:52.802337 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:14:52.802306 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e06e94b1-2063-48f2-b8a7-0d0e4193f064-metrics-certs podName:e06e94b1-2063-48f2-b8a7-0d0e4193f064 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:56.802288363 +0000 UTC m=+130.348029767 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e06e94b1-2063-48f2-b8a7-0d0e4193f064-metrics-certs") pod "network-metrics-daemon-qz5vc" (UID: "e06e94b1-2063-48f2-b8a7-0d0e4193f064") : secret "metrics-daemon-secret" not found Apr 16 22:14:52.833709 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:52.833680 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-vqjpn\"" Apr 16 22:14:52.841539 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:52.841519 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bdcch" Apr 16 22:14:52.957144 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:52.957066 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bdcch"] Apr 16 22:14:52.960925 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:14:52.960897 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b5f6846_8363_4956_b563_df34509912b0.slice/crio-31edf53c936c8c6c8dae4120a73b738128031893df3fa88037a4ee6418328ecf WatchSource:0}: Error finding container 31edf53c936c8c6c8dae4120a73b738128031893df3fa88037a4ee6418328ecf: Status 404 returned error can't find the container with id 31edf53c936c8c6c8dae4120a73b738128031893df3fa88037a4ee6418328ecf Apr 16 22:14:53.267374 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:53.267284 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bdcch" event={"ID":"0b5f6846-8363-4956-b563-df34509912b0","Type":"ContainerStarted","Data":"31edf53c936c8c6c8dae4120a73b738128031893df3fa88037a4ee6418328ecf"} Apr 16 22:14:56.274086 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:56.274054 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bdcch" event={"ID":"0b5f6846-8363-4956-b563-df34509912b0","Type":"ContainerStarted","Data":"5c9344dc193ad48e5f20fa1250d86fc97708a3f81f25b13d45499fafb2453c6d"} Apr 16 22:14:56.274570 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:56.274195 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-bdcch" Apr 16 22:14:56.289380 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:14:56.289335 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-bdcch" podStartSLOduration=66.340024464 podStartE2EDuration="1m9.289322737s" podCreationTimestamp="2026-04-16 22:13:47 +0000 UTC" firstStartedPulling="2026-04-16 22:14:52.962857947 +0000 UTC m=+66.508599352" lastFinishedPulling="2026-04-16 22:14:55.912156219 +0000 UTC m=+69.457897625" observedRunningTime="2026-04-16 22:14:56.288068782 +0000 UTC m=+69.833810208" watchObservedRunningTime="2026-04-16 22:14:56.289322737 +0000 UTC m=+69.835064163" Apr 16 22:15:23.308374 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:23.308318 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/94b66058-cae9-48ec-b576-71611c7b606e-metrics-tls\") pod \"dns-default-86xzj\" (UID: \"94b66058-cae9-48ec-b576-71611c7b606e\") " pod="openshift-dns/dns-default-86xzj" Apr 16 22:15:23.308374 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:23.308379 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4a447af-7f68-4189-bc97-af653fe8ba76-cert\") pod \"ingress-canary-gxxg4\" (UID: \"e4a447af-7f68-4189-bc97-af653fe8ba76\") " pod="openshift-ingress-canary/ingress-canary-gxxg4" Apr 16 22:15:23.308842 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:15:23.308481 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:15:23.308842 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:15:23.308545 2565 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/94b66058-cae9-48ec-b576-71611c7b606e-metrics-tls podName:94b66058-cae9-48ec-b576-71611c7b606e nodeName:}" failed. No retries permitted until 2026-04-16 22:16:27.308530333 +0000 UTC m=+160.854271737 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/94b66058-cae9-48ec-b576-71611c7b606e-metrics-tls") pod "dns-default-86xzj" (UID: "94b66058-cae9-48ec-b576-71611c7b606e") : secret "dns-default-metrics-tls" not found Apr 16 22:15:23.308842 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:15:23.308489 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:15:23.308842 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:15:23.308639 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4a447af-7f68-4189-bc97-af653fe8ba76-cert podName:e4a447af-7f68-4189-bc97-af653fe8ba76 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:27.308628581 +0000 UTC m=+160.854369986 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4a447af-7f68-4189-bc97-af653fe8ba76-cert") pod "ingress-canary-gxxg4" (UID: "e4a447af-7f68-4189-bc97-af653fe8ba76") : secret "canary-serving-cert" not found Apr 16 22:15:27.278482 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:27.278452 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-bdcch" Apr 16 22:15:56.839660 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:56.839597 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e06e94b1-2063-48f2-b8a7-0d0e4193f064-metrics-certs\") pod \"network-metrics-daemon-qz5vc\" (UID: \"e06e94b1-2063-48f2-b8a7-0d0e4193f064\") " pod="openshift-multus/network-metrics-daemon-qz5vc" Apr 16 22:15:56.840250 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:15:56.839770 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 22:15:56.840250 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:15:56.839891 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e06e94b1-2063-48f2-b8a7-0d0e4193f064-metrics-certs podName:e06e94b1-2063-48f2-b8a7-0d0e4193f064 nodeName:}" failed. No retries permitted until 2026-04-16 22:17:58.839868354 +0000 UTC m=+252.385609760 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e06e94b1-2063-48f2-b8a7-0d0e4193f064-metrics-certs") pod "network-metrics-daemon-qz5vc" (UID: "e06e94b1-2063-48f2-b8a7-0d0e4193f064") : secret "metrics-daemon-secret" not found Apr 16 22:15:59.736254 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.736223 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pnp6h"] Apr 16 22:15:59.738690 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.738674 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pnp6h" Apr 16 22:15:59.741717 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.741688 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:15:59.741868 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.741693 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 16 22:15:59.741868 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.741736 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-4v2kq\"" Apr 16 22:15:59.742038 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.742009 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-p52zz"] Apr 16 22:15:59.742130 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.742110 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 16 22:15:59.742186 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.742157 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 16 22:15:59.744664 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.744644 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p52zz" Apr 16 22:15:59.746644 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.746618 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-68f7bcf4cd-4j647"] Apr 16 22:15:59.753446 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.753397 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7fb78ff8db-pl67h"] Apr 16 22:15:59.753589 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.753473 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 22:15:59.753589 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.753485 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 22:15:59.753589 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.753535 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-w84kv\"" Apr 16 22:15:59.753748 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.753474 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 22:15:59.753748 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.753716 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 22:15:59.753913 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.753896 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-68f7bcf4cd-4j647" Apr 16 22:15:59.756227 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.756210 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pnp6h"] Apr 16 22:15:59.756325 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.756298 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" Apr 16 22:15:59.756817 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.756798 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-6srfn\"" Apr 16 22:15:59.757238 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.757217 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 16 22:15:59.757328 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.757241 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 16 22:15:59.757328 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.757227 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 22:15:59.757328 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.757291 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 16 22:15:59.757820 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.757802 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 16 22:15:59.757820 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.757813 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 22:15:59.758509 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.758494 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-8tzbx\"" Apr 16 22:15:59.758601 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.758544 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 22:15:59.759340 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.759319 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 22:15:59.759555 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.759389 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 22:15:59.761158 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.761137 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-p52zz"] Apr 16 22:15:59.764997 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.764974 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 22:15:59.768670 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.768644 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7fb78ff8db-pl67h"] Apr 16 22:15:59.769914 ip-10-0-141-169 kubenswrapper[2565]: I0416 
22:15:59.769873 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-68f7bcf4cd-4j647"] Apr 16 22:15:59.858091 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.858056 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k5gl\" (UniqueName: \"kubernetes.io/projected/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-kube-api-access-9k5gl\") pod \"router-default-68f7bcf4cd-4j647\" (UID: \"f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07\") " pod="openshift-ingress/router-default-68f7bcf4cd-4j647" Apr 16 22:15:59.858091 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.858089 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-image-registry-private-configuration\") pod \"image-registry-7fb78ff8db-pl67h\" (UID: \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\") " pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" Apr 16 22:15:59.858315 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.858112 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-registry-tls\") pod \"image-registry-7fb78ff8db-pl67h\" (UID: \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\") " pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" Apr 16 22:15:59.858315 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.858152 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-service-ca-bundle\") pod \"router-default-68f7bcf4cd-4j647\" (UID: \"f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07\") " pod="openshift-ingress/router-default-68f7bcf4cd-4j647" Apr 16 22:15:59.858315 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.858187 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-registry-certificates\") pod \"image-registry-7fb78ff8db-pl67h\" (UID: \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\") " pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" Apr 16 22:15:59.858315 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.858209 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-trusted-ca\") pod \"image-registry-7fb78ff8db-pl67h\" (UID: \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\") " pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" Apr 16 22:15:59.858315 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.858242 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-installation-pull-secrets\") pod \"image-registry-7fb78ff8db-pl67h\" (UID: \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\") " pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" Apr 16 22:15:59.858315 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.858266 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: 
\"kubernetes.io/configmap/5a0b0ab0-57d0-4640-93b0-5859a8af3aa5-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-p52zz\" (UID: \"5a0b0ab0-57d0-4640-93b0-5859a8af3aa5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p52zz" Apr 16 22:15:59.858534 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.858353 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-stats-auth\") pod \"router-default-68f7bcf4cd-4j647\" (UID: \"f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07\") " pod="openshift-ingress/router-default-68f7bcf4cd-4j647" Apr 16 22:15:59.858534 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.858381 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-default-certificate\") pod \"router-default-68f7bcf4cd-4j647\" (UID: \"f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07\") " pod="openshift-ingress/router-default-68f7bcf4cd-4j647" Apr 16 22:15:59.858534 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.858398 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5450699d-c1e9-4234-a7b9-c9440f986830-serving-cert\") pod \"service-ca-operator-d6fc45fc5-pnp6h\" (UID: \"5450699d-c1e9-4234-a7b9-c9440f986830\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pnp6h" Apr 16 22:15:59.858534 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.858425 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntx9b\" (UniqueName: \"kubernetes.io/projected/5450699d-c1e9-4234-a7b9-c9440f986830-kube-api-access-ntx9b\") pod \"service-ca-operator-d6fc45fc5-pnp6h\" (UID: \"5450699d-c1e9-4234-a7b9-c9440f986830\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pnp6h" Apr 16 22:15:59.858534 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.858474 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-metrics-certs\") pod \"router-default-68f7bcf4cd-4j647\" (UID: \"f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07\") " pod="openshift-ingress/router-default-68f7bcf4cd-4j647" Apr 16 22:15:59.858534 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.858508 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5450699d-c1e9-4234-a7b9-c9440f986830-config\") pod \"service-ca-operator-d6fc45fc5-pnp6h\" (UID: \"5450699d-c1e9-4234-a7b9-c9440f986830\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pnp6h" Apr 16 22:15:59.858723 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.858545 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-ca-trust-extracted\") pod \"image-registry-7fb78ff8db-pl67h\" (UID: \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\") " pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" Apr 16 22:15:59.858723 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.858562 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5a0b0ab0-57d0-4640-93b0-5859a8af3aa5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-p52zz\" (UID: \"5a0b0ab0-57d0-4640-93b0-5859a8af3aa5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p52zz" Apr 16 22:15:59.858723 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.858625 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-bound-sa-token\") pod \"image-registry-7fb78ff8db-pl67h\" (UID: \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\") " pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" Apr 16 22:15:59.858723 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.858644 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shwpr\" (UniqueName: \"kubernetes.io/projected/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-kube-api-access-shwpr\") pod \"image-registry-7fb78ff8db-pl67h\" (UID: \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\") " pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" Apr 16 22:15:59.858723 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.858663 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx4cx\" (UniqueName: \"kubernetes.io/projected/5a0b0ab0-57d0-4640-93b0-5859a8af3aa5-kube-api-access-cx4cx\") pod \"cluster-monitoring-operator-75587bd455-p52zz\" (UID: \"5a0b0ab0-57d0-4640-93b0-5859a8af3aa5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p52zz" Apr 16 22:15:59.959585 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.959548 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-metrics-certs\") pod \"router-default-68f7bcf4cd-4j647\" (UID: \"f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07\") " pod="openshift-ingress/router-default-68f7bcf4cd-4j647" Apr 16 22:15:59.959585 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.959583 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5450699d-c1e9-4234-a7b9-c9440f986830-config\") pod \"service-ca-operator-d6fc45fc5-pnp6h\" (UID: \"5450699d-c1e9-4234-a7b9-c9440f986830\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pnp6h" Apr 16 22:15:59.959785 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:15:59.959691 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 22:15:59.959785 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:15:59.959755 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-metrics-certs podName:f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:00.45973931 +0000 UTC m=+134.005480715 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-metrics-certs") pod "router-default-68f7bcf4cd-4j647" (UID: "f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07") : secret "router-metrics-certs-default" not found Apr 16 22:15:59.959857 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.959831 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-ca-trust-extracted\") pod \"image-registry-7fb78ff8db-pl67h\" (UID: \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\") " pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" Apr 16 22:15:59.959893 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.959862 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5a0b0ab0-57d0-4640-93b0-5859a8af3aa5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-p52zz\" (UID: \"5a0b0ab0-57d0-4640-93b0-5859a8af3aa5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p52zz" Apr 16 22:15:59.959923 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.959904 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-bound-sa-token\") pod \"image-registry-7fb78ff8db-pl67h\" (UID: \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\") " pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" Apr 16 22:15:59.959963 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.959929 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-shwpr\" (UniqueName: \"kubernetes.io/projected/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-kube-api-access-shwpr\") pod \"image-registry-7fb78ff8db-pl67h\" (UID: \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\") " pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" Apr 16 22:15:59.959963 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.959956 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cx4cx\" (UniqueName: \"kubernetes.io/projected/5a0b0ab0-57d0-4640-93b0-5859a8af3aa5-kube-api-access-cx4cx\") pod \"cluster-monitoring-operator-75587bd455-p52zz\" (UID: \"5a0b0ab0-57d0-4640-93b0-5859a8af3aa5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p52zz" Apr 16 22:15:59.960046 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:15:59.959989 2565 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 22:15:59.960046 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.960005 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9k5gl\" (UniqueName: \"kubernetes.io/projected/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-kube-api-access-9k5gl\") pod \"router-default-68f7bcf4cd-4j647\" (UID: \"f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07\") " pod="openshift-ingress/router-default-68f7bcf4cd-4j647" Apr 16 22:15:59.960046 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.960033 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-image-registry-private-configuration\") pod 
\"image-registry-7fb78ff8db-pl67h\" (UID: \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\") " pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" Apr 16 22:15:59.960197 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:15:59.960055 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a0b0ab0-57d0-4640-93b0-5859a8af3aa5-cluster-monitoring-operator-tls podName:5a0b0ab0-57d0-4640-93b0-5859a8af3aa5 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:00.460039392 +0000 UTC m=+134.005780807 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/5a0b0ab0-57d0-4640-93b0-5859a8af3aa5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-p52zz" (UID: "5a0b0ab0-57d0-4640-93b0-5859a8af3aa5") : secret "cluster-monitoring-operator-tls" not found Apr 16 22:15:59.960197 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.960097 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-registry-tls\") pod \"image-registry-7fb78ff8db-pl67h\" (UID: \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\") " pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" Apr 16 22:15:59.960197 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.960129 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-service-ca-bundle\") pod \"router-default-68f7bcf4cd-4j647\" (UID: \"f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07\") " pod="openshift-ingress/router-default-68f7bcf4cd-4j647" Apr 16 22:15:59.960197 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.960153 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-registry-certificates\") pod \"image-registry-7fb78ff8db-pl67h\" (UID: \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\") " pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" Apr 16 22:15:59.960197 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.960178 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-trusted-ca\") pod \"image-registry-7fb78ff8db-pl67h\" (UID: \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\") " pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" Apr 16 22:15:59.960197 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.960185 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-ca-trust-extracted\") pod \"image-registry-7fb78ff8db-pl67h\" (UID: \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\") " pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" Apr 16 22:15:59.960197 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:15:59.960197 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:15:59.960632 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.960205 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-installation-pull-secrets\") pod 
\"image-registry-7fb78ff8db-pl67h\" (UID: \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\") " pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" Apr 16 22:15:59.960632 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.960235 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/5a0b0ab0-57d0-4640-93b0-5859a8af3aa5-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-p52zz\" (UID: \"5a0b0ab0-57d0-4640-93b0-5859a8af3aa5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p52zz" Apr 16 22:15:59.960632 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.960242 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5450699d-c1e9-4234-a7b9-c9440f986830-config\") pod \"service-ca-operator-d6fc45fc5-pnp6h\" (UID: \"5450699d-c1e9-4234-a7b9-c9440f986830\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pnp6h" Apr 16 22:15:59.960632 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:15:59.960211 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7fb78ff8db-pl67h: secret "image-registry-tls" not found Apr 16 22:15:59.960632 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:15:59.960254 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-service-ca-bundle podName:f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:00.460238786 +0000 UTC m=+134.005980191 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-service-ca-bundle") pod "router-default-68f7bcf4cd-4j647" (UID: "f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07") : configmap references non-existent config key: service-ca.crt Apr 16 22:15:59.960632 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:15:59.960378 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-registry-tls podName:d1c8a303-1d3d-4a9d-b1b2-9f6166516cad nodeName:}" failed. No retries permitted until 2026-04-16 22:16:00.460348937 +0000 UTC m=+134.006090343 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-registry-tls") pod "image-registry-7fb78ff8db-pl67h" (UID: "d1c8a303-1d3d-4a9d-b1b2-9f6166516cad") : secret "image-registry-tls" not found Apr 16 22:15:59.960632 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.960462 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-stats-auth\") pod \"router-default-68f7bcf4cd-4j647\" (UID: \"f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07\") " pod="openshift-ingress/router-default-68f7bcf4cd-4j647" Apr 16 22:15:59.960632 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.960494 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-default-certificate\") pod \"router-default-68f7bcf4cd-4j647\" (UID: \"f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07\") " pod="openshift-ingress/router-default-68f7bcf4cd-4j647" Apr 16 22:15:59.960632 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.960521 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5450699d-c1e9-4234-a7b9-c9440f986830-serving-cert\") pod \"service-ca-operator-d6fc45fc5-pnp6h\" (UID: \"5450699d-c1e9-4234-a7b9-c9440f986830\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pnp6h" Apr 16 22:15:59.960632 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.960549 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ntx9b\" (UniqueName: \"kubernetes.io/projected/5450699d-c1e9-4234-a7b9-c9440f986830-kube-api-access-ntx9b\") pod \"service-ca-operator-d6fc45fc5-pnp6h\" (UID: \"5450699d-c1e9-4234-a7b9-c9440f986830\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pnp6h" Apr 16 22:15:59.961120 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.960802 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-registry-certificates\") pod \"image-registry-7fb78ff8db-pl67h\" (UID: \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\") " pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" Apr 16 22:15:59.961120 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.961019 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/5a0b0ab0-57d0-4640-93b0-5859a8af3aa5-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-p52zz\" (UID: \"5a0b0ab0-57d0-4640-93b0-5859a8af3aa5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p52zz" Apr 16 22:15:59.961224 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.961126 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-trusted-ca\") pod \"image-registry-7fb78ff8db-pl67h\" (UID: \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\") " pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" Apr 16 22:15:59.962831 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.962806 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-image-registry-private-configuration\") pod \"image-registry-7fb78ff8db-pl67h\" (UID: \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\") " pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" Apr 16 22:15:59.962952 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.962936 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-stats-auth\") pod \"router-default-68f7bcf4cd-4j647\" (UID: \"f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07\") " pod="openshift-ingress/router-default-68f7bcf4cd-4j647" Apr 16 22:15:59.963059 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.963029 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-default-certificate\") pod \"router-default-68f7bcf4cd-4j647\" (UID: \"f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07\") " pod="openshift-ingress/router-default-68f7bcf4cd-4j647" Apr 16 22:15:59.963168 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.963151 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5450699d-c1e9-4234-a7b9-c9440f986830-serving-cert\") pod \"service-ca-operator-d6fc45fc5-pnp6h\" (UID: \"5450699d-c1e9-4234-a7b9-c9440f986830\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pnp6h" Apr 16 22:15:59.963307 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.963290 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-installation-pull-secrets\") pod \"image-registry-7fb78ff8db-pl67h\" (UID: \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\") " pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" Apr 16 22:15:59.972654 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.972630 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-bound-sa-token\") pod \"image-registry-7fb78ff8db-pl67h\" (UID: \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\") " pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" Apr 16 22:15:59.972893 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.972869 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx4cx\" (UniqueName: \"kubernetes.io/projected/5a0b0ab0-57d0-4640-93b0-5859a8af3aa5-kube-api-access-cx4cx\") pod \"cluster-monitoring-operator-75587bd455-p52zz\" (UID: \"5a0b0ab0-57d0-4640-93b0-5859a8af3aa5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p52zz" Apr 16 22:15:59.973078 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.973061 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-shwpr\" (UniqueName: \"kubernetes.io/projected/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-kube-api-access-shwpr\") pod \"image-registry-7fb78ff8db-pl67h\" (UID: \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\") " pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" Apr 16 22:15:59.974593 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.974574 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k5gl\" (UniqueName: \"kubernetes.io/projected/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-kube-api-access-9k5gl\") pod 
\"router-default-68f7bcf4cd-4j647\" (UID: \"f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07\") " pod="openshift-ingress/router-default-68f7bcf4cd-4j647" Apr 16 22:15:59.974786 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:15:59.974758 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntx9b\" (UniqueName: \"kubernetes.io/projected/5450699d-c1e9-4234-a7b9-c9440f986830-kube-api-access-ntx9b\") pod \"service-ca-operator-d6fc45fc5-pnp6h\" (UID: \"5450699d-c1e9-4234-a7b9-c9440f986830\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pnp6h" Apr 16 22:16:00.051464 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:00.051364 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pnp6h" Apr 16 22:16:00.167387 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:00.167356 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pnp6h"] Apr 16 22:16:00.170628 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:16:00.170599 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5450699d_c1e9_4234_a7b9_c9440f986830.slice/crio-550b95604839df8114258c0a7ea0f0463bdd7a44bac972c2e217db1bba9a15d5 WatchSource:0}: Error finding container 550b95604839df8114258c0a7ea0f0463bdd7a44bac972c2e217db1bba9a15d5: Status 404 returned error can't find the container with id 550b95604839df8114258c0a7ea0f0463bdd7a44bac972c2e217db1bba9a15d5 Apr 16 22:16:00.391463 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:00.391355 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pnp6h" event={"ID":"5450699d-c1e9-4234-a7b9-c9440f986830","Type":"ContainerStarted","Data":"550b95604839df8114258c0a7ea0f0463bdd7a44bac972c2e217db1bba9a15d5"} Apr 16 22:16:00.463658 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:00.463617 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-registry-tls\") pod \"image-registry-7fb78ff8db-pl67h\" (UID: \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\") " pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" Apr 16 22:16:00.463658 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:00.463660 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-service-ca-bundle\") pod \"router-default-68f7bcf4cd-4j647\" (UID: \"f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07\") " pod="openshift-ingress/router-default-68f7bcf4cd-4j647" Apr 16 22:16:00.463912 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:00.463699 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-metrics-certs\") pod \"router-default-68f7bcf4cd-4j647\" (UID: \"f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07\") " pod="openshift-ingress/router-default-68f7bcf4cd-4j647" Apr 16 22:16:00.463912 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:00.463731 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5a0b0ab0-57d0-4640-93b0-5859a8af3aa5-cluster-monitoring-operator-tls\") pod 
\"cluster-monitoring-operator-75587bd455-p52zz\" (UID: \"5a0b0ab0-57d0-4640-93b0-5859a8af3aa5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p52zz" Apr 16 22:16:00.463912 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:00.463771 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:16:00.463912 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:00.463791 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7fb78ff8db-pl67h: secret "image-registry-tls" not found Apr 16 22:16:00.463912 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:00.463824 2565 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 22:16:00.463912 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:00.463842 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-registry-tls podName:d1c8a303-1d3d-4a9d-b1b2-9f6166516cad nodeName:}" failed. No retries permitted until 2026-04-16 22:16:01.463825983 +0000 UTC m=+135.009567387 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-registry-tls") pod "image-registry-7fb78ff8db-pl67h" (UID: "d1c8a303-1d3d-4a9d-b1b2-9f6166516cad") : secret "image-registry-tls" not found Apr 16 22:16:00.463912 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:00.463851 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 22:16:00.463912 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:00.463876 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a0b0ab0-57d0-4640-93b0-5859a8af3aa5-cluster-monitoring-operator-tls podName:5a0b0ab0-57d0-4640-93b0-5859a8af3aa5 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:01.463863867 +0000 UTC m=+135.009605276 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/5a0b0ab0-57d0-4640-93b0-5859a8af3aa5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-p52zz" (UID: "5a0b0ab0-57d0-4640-93b0-5859a8af3aa5") : secret "cluster-monitoring-operator-tls" not found Apr 16 22:16:00.463912 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:00.463894 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-service-ca-bundle podName:f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:01.463881801 +0000 UTC m=+135.009623209 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-service-ca-bundle") pod "router-default-68f7bcf4cd-4j647" (UID: "f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07") : configmap references non-existent config key: service-ca.crt Apr 16 22:16:00.463912 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:00.463913 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-metrics-certs podName:f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07 nodeName:}" failed. 
No retries permitted until 2026-04-16 22:16:01.463906121 +0000 UTC m=+135.009647526 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-metrics-certs") pod "router-default-68f7bcf4cd-4j647" (UID: "f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07") : secret "router-metrics-certs-default" not found Apr 16 22:16:01.471590 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:01.471545 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-metrics-certs\") pod \"router-default-68f7bcf4cd-4j647\" (UID: \"f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07\") " pod="openshift-ingress/router-default-68f7bcf4cd-4j647" Apr 16 22:16:01.472036 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:01.471737 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5a0b0ab0-57d0-4640-93b0-5859a8af3aa5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-p52zz\" (UID: \"5a0b0ab0-57d0-4640-93b0-5859a8af3aa5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p52zz" Apr 16 22:16:01.472036 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:01.471845 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-registry-tls\") pod \"image-registry-7fb78ff8db-pl67h\" (UID: \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\") " pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" Apr 16 22:16:01.472036 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:01.471872 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-service-ca-bundle\") pod \"router-default-68f7bcf4cd-4j647\" (UID: \"f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07\") " pod="openshift-ingress/router-default-68f7bcf4cd-4j647" Apr 16 22:16:01.472036 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:01.471745 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 22:16:01.472036 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:01.471927 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:16:01.472036 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:01.471944 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7fb78ff8db-pl67h: secret "image-registry-tls" not found Apr 16 22:16:01.472036 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:01.471830 2565 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 22:16:01.472036 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:01.471970 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-metrics-certs podName:f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:03.471951872 +0000 UTC m=+137.017693297 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-metrics-certs") pod "router-default-68f7bcf4cd-4j647" (UID: "f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07") : secret "router-metrics-certs-default" not found Apr 16 22:16:01.472036 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:01.472008 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-registry-tls podName:d1c8a303-1d3d-4a9d-b1b2-9f6166516cad nodeName:}" failed. No retries permitted until 2026-04-16 22:16:03.471997492 +0000 UTC m=+137.017738901 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-registry-tls") pod "image-registry-7fb78ff8db-pl67h" (UID: "d1c8a303-1d3d-4a9d-b1b2-9f6166516cad") : secret "image-registry-tls" not found Apr 16 22:16:01.472036 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:01.472021 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a0b0ab0-57d0-4640-93b0-5859a8af3aa5-cluster-monitoring-operator-tls podName:5a0b0ab0-57d0-4640-93b0-5859a8af3aa5 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:03.47201348 +0000 UTC m=+137.017754887 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/5a0b0ab0-57d0-4640-93b0-5859a8af3aa5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-p52zz" (UID: "5a0b0ab0-57d0-4640-93b0-5859a8af3aa5") : secret "cluster-monitoring-operator-tls" not found Apr 16 22:16:01.472036 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:01.472034 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-service-ca-bundle podName:f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:03.47202731 +0000 UTC m=+137.017768715 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-service-ca-bundle") pod "router-default-68f7bcf4cd-4j647" (UID: "f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07") : configmap references non-existent config key: service-ca.crt Apr 16 22:16:02.399681 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:02.399645 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pnp6h" event={"ID":"5450699d-c1e9-4234-a7b9-c9440f986830","Type":"ContainerStarted","Data":"b24386b4468c56b9790deb998489abd5ddfe2ab9be3985d1262e2fe2dbe6b394"} Apr 16 22:16:02.416919 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:02.416870 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pnp6h" podStartSLOduration=1.755663091 podStartE2EDuration="3.416857646s" podCreationTimestamp="2026-04-16 22:15:59 +0000 UTC" firstStartedPulling="2026-04-16 22:16:00.172387211 +0000 UTC m=+133.718128617" lastFinishedPulling="2026-04-16 22:16:01.833581764 +0000 UTC m=+135.379323172" observedRunningTime="2026-04-16 22:16:02.415577949 +0000 UTC m=+135.961319379" watchObservedRunningTime="2026-04-16 22:16:02.416857646 +0000 UTC m=+135.962599069" Apr 16 22:16:03.490396 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:03.490357 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-registry-tls\") pod \"image-registry-7fb78ff8db-pl67h\" (UID: \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\") " pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" Apr 16 22:16:03.490396 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:03.490397 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-service-ca-bundle\") pod \"router-default-68f7bcf4cd-4j647\" (UID: \"f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07\") " pod="openshift-ingress/router-default-68f7bcf4cd-4j647" Apr 16 22:16:03.490938 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:03.490512 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-service-ca-bundle podName:f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:07.490499242 +0000 UTC m=+141.036240646 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-service-ca-bundle") pod "router-default-68f7bcf4cd-4j647" (UID: "f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07") : configmap references non-existent config key: service-ca.crt Apr 16 22:16:03.490938 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:03.490512 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:16:03.490938 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:03.490535 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7fb78ff8db-pl67h: secret "image-registry-tls" not found Apr 16 22:16:03.490938 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:03.490563 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-metrics-certs\") pod \"router-default-68f7bcf4cd-4j647\" (UID: \"f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07\") " pod="openshift-ingress/router-default-68f7bcf4cd-4j647" Apr 16 22:16:03.490938 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:03.490577 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-registry-tls podName:d1c8a303-1d3d-4a9d-b1b2-9f6166516cad nodeName:}" failed. No retries permitted until 2026-04-16 22:16:07.490563479 +0000 UTC m=+141.036304887 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-registry-tls") pod "image-registry-7fb78ff8db-pl67h" (UID: "d1c8a303-1d3d-4a9d-b1b2-9f6166516cad") : secret "image-registry-tls" not found Apr 16 22:16:03.490938 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:03.490623 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 22:16:03.490938 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:03.490634 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5a0b0ab0-57d0-4640-93b0-5859a8af3aa5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-p52zz\" (UID: \"5a0b0ab0-57d0-4640-93b0-5859a8af3aa5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p52zz" Apr 16 22:16:03.490938 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:03.490661 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-metrics-certs podName:f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:07.490649491 +0000 UTC m=+141.036390897 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-metrics-certs") pod "router-default-68f7bcf4cd-4j647" (UID: "f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07") : secret "router-metrics-certs-default" not found Apr 16 22:16:03.490938 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:03.490696 2565 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 22:16:03.490938 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:03.490726 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a0b0ab0-57d0-4640-93b0-5859a8af3aa5-cluster-monitoring-operator-tls podName:5a0b0ab0-57d0-4640-93b0-5859a8af3aa5 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:07.49071897 +0000 UTC m=+141.036460379 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/5a0b0ab0-57d0-4640-93b0-5859a8af3aa5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-p52zz" (UID: "5a0b0ab0-57d0-4640-93b0-5859a8af3aa5") : secret "cluster-monitoring-operator-tls" not found Apr 16 22:16:05.794357 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:05.794317 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-nx6zw"] Apr 16 22:16:05.797262 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:05.797242 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-nx6zw" Apr 16 22:16:05.803305 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:05.803242 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 22:16:05.803565 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:05.803548 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-mm7sd\"" Apr 16 22:16:05.804091 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:05.804069 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 22:16:05.804576 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:05.804559 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 22:16:05.804870 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:05.804853 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 22:16:05.815161 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:05.815136 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-nx6zw"] Apr 16 22:16:05.908055 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:05.908026 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx4np\" (UniqueName: \"kubernetes.io/projected/5c3fe8d1-2fcb-4ded-be72-b4b2611b8db3-kube-api-access-bx4np\") pod \"service-ca-865cb79987-nx6zw\" (UID: \"5c3fe8d1-2fcb-4ded-be72-b4b2611b8db3\") " pod="openshift-service-ca/service-ca-865cb79987-nx6zw" Apr 16 22:16:05.908055 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:05.908065 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5c3fe8d1-2fcb-4ded-be72-b4b2611b8db3-signing-cabundle\") pod \"service-ca-865cb79987-nx6zw\" (UID: \"5c3fe8d1-2fcb-4ded-be72-b4b2611b8db3\") " pod="openshift-service-ca/service-ca-865cb79987-nx6zw" Apr 16 22:16:05.908252 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:05.908092 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5c3fe8d1-2fcb-4ded-be72-b4b2611b8db3-signing-key\") pod \"service-ca-865cb79987-nx6zw\" (UID: \"5c3fe8d1-2fcb-4ded-be72-b4b2611b8db3\") " pod="openshift-service-ca/service-ca-865cb79987-nx6zw" Apr 16 22:16:06.009130 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:06.009092 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bx4np\" (UniqueName: \"kubernetes.io/projected/5c3fe8d1-2fcb-4ded-be72-b4b2611b8db3-kube-api-access-bx4np\") pod \"service-ca-865cb79987-nx6zw\" (UID: \"5c3fe8d1-2fcb-4ded-be72-b4b2611b8db3\") " pod="openshift-service-ca/service-ca-865cb79987-nx6zw" Apr 16 22:16:06.009130 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:06.009137 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5c3fe8d1-2fcb-4ded-be72-b4b2611b8db3-signing-cabundle\") pod \"service-ca-865cb79987-nx6zw\" (UID: \"5c3fe8d1-2fcb-4ded-be72-b4b2611b8db3\") " pod="openshift-service-ca/service-ca-865cb79987-nx6zw" Apr 16 22:16:06.009351 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:06.009170 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5c3fe8d1-2fcb-4ded-be72-b4b2611b8db3-signing-key\") pod \"service-ca-865cb79987-nx6zw\" (UID: \"5c3fe8d1-2fcb-4ded-be72-b4b2611b8db3\") " pod="openshift-service-ca/service-ca-865cb79987-nx6zw" Apr 16 22:16:06.009908 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:06.009881 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5c3fe8d1-2fcb-4ded-be72-b4b2611b8db3-signing-cabundle\") pod \"service-ca-865cb79987-nx6zw\" (UID: \"5c3fe8d1-2fcb-4ded-be72-b4b2611b8db3\") " pod="openshift-service-ca/service-ca-865cb79987-nx6zw" Apr 16 22:16:06.011678 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:06.011660 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5c3fe8d1-2fcb-4ded-be72-b4b2611b8db3-signing-key\") pod \"service-ca-865cb79987-nx6zw\" (UID: \"5c3fe8d1-2fcb-4ded-be72-b4b2611b8db3\") " pod="openshift-service-ca/service-ca-865cb79987-nx6zw" Apr 16 22:16:06.017556 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:06.017530 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx4np\" (UniqueName: \"kubernetes.io/projected/5c3fe8d1-2fcb-4ded-be72-b4b2611b8db3-kube-api-access-bx4np\") pod \"service-ca-865cb79987-nx6zw\" (UID: \"5c3fe8d1-2fcb-4ded-be72-b4b2611b8db3\") " pod="openshift-service-ca/service-ca-865cb79987-nx6zw" Apr 16 22:16:06.106286 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:06.106207 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-nx6zw" Apr 16 22:16:06.219181 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:06.219150 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-nx6zw"] Apr 16 22:16:06.222497 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:16:06.222468 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c3fe8d1_2fcb_4ded_be72_b4b2611b8db3.slice/crio-32cfd0526a905503247a2605767e91bdad3c49caa2d47f8788de23ba8dcb6bd3 WatchSource:0}: Error finding container 32cfd0526a905503247a2605767e91bdad3c49caa2d47f8788de23ba8dcb6bd3: Status 404 returned error can't find the container with id 32cfd0526a905503247a2605767e91bdad3c49caa2d47f8788de23ba8dcb6bd3 Apr 16 22:16:06.409128 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:06.409047 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-nx6zw" event={"ID":"5c3fe8d1-2fcb-4ded-be72-b4b2611b8db3","Type":"ContainerStarted","Data":"d9a23622a51817d9eb2f3a24c9296d95607c4eb08692aaac5f29eec7b6e58d81"} Apr 16 22:16:06.409128 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:06.409082 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-nx6zw" event={"ID":"5c3fe8d1-2fcb-4ded-be72-b4b2611b8db3","Type":"ContainerStarted","Data":"32cfd0526a905503247a2605767e91bdad3c49caa2d47f8788de23ba8dcb6bd3"} Apr 16 22:16:06.440732 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:06.440680 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-nx6zw" podStartSLOduration=1.4406643049999999 podStartE2EDuration="1.440664305s" podCreationTimestamp="2026-04-16 22:16:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:16:06.439128967 +0000 UTC m=+139.984870394" watchObservedRunningTime="2026-04-16 22:16:06.440664305 +0000 UTC m=+139.986405993" Apr 16 22:16:06.491318 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:06.491293 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-tsbd9_969daa2e-581d-4248-8104-2e50544de6b9/dns-node-resolver/0.log" Apr 16 22:16:07.286275 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:07.286242 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-wpfh2_2327dcdf-f40d-43bf-905a-1404d6e339f7/node-ca/0.log" Apr 16 22:16:07.524228 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:07.524185 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-registry-tls\") pod \"image-registry-7fb78ff8db-pl67h\" (UID: \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\") " pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" Apr 16 22:16:07.524444 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:07.524246 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-service-ca-bundle\") pod \"router-default-68f7bcf4cd-4j647\" (UID: \"f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07\") " pod="openshift-ingress/router-default-68f7bcf4cd-4j647" Apr 16 22:16:07.524444 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:07.524322 2565 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-metrics-certs\") pod \"router-default-68f7bcf4cd-4j647\" (UID: \"f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07\") " pod="openshift-ingress/router-default-68f7bcf4cd-4j647" Apr 16 22:16:07.524444 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:07.524352 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5a0b0ab0-57d0-4640-93b0-5859a8af3aa5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-p52zz\" (UID: \"5a0b0ab0-57d0-4640-93b0-5859a8af3aa5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p52zz" Apr 16 22:16:07.524444 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:07.524367 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:16:07.524444 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:07.524384 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7fb78ff8db-pl67h: secret "image-registry-tls" not found Apr 16 22:16:07.524689 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:07.524462 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-registry-tls podName:d1c8a303-1d3d-4a9d-b1b2-9f6166516cad nodeName:}" failed. No retries permitted until 2026-04-16 22:16:15.524439549 +0000 UTC m=+149.070180966 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-registry-tls") pod "image-registry-7fb78ff8db-pl67h" (UID: "d1c8a303-1d3d-4a9d-b1b2-9f6166516cad") : secret "image-registry-tls" not found Apr 16 22:16:07.524689 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:07.524503 2565 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 22:16:07.524689 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:07.524540 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a0b0ab0-57d0-4640-93b0-5859a8af3aa5-cluster-monitoring-operator-tls podName:5a0b0ab0-57d0-4640-93b0-5859a8af3aa5 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:15.524530515 +0000 UTC m=+149.070271921 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/5a0b0ab0-57d0-4640-93b0-5859a8af3aa5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-p52zz" (UID: "5a0b0ab0-57d0-4640-93b0-5859a8af3aa5") : secret "cluster-monitoring-operator-tls" not found Apr 16 22:16:07.524931 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:07.524911 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-service-ca-bundle podName:f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:15.52489443 +0000 UTC m=+149.070635837 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-service-ca-bundle") pod "router-default-68f7bcf4cd-4j647" (UID: "f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07") : configmap references non-existent config key: service-ca.crt Apr 16 22:16:07.524995 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:07.524986 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 22:16:07.525049 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:07.525023 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-metrics-certs podName:f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:15.525011322 +0000 UTC m=+149.070752729 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-metrics-certs") pod "router-default-68f7bcf4cd-4j647" (UID: "f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07") : secret "router-metrics-certs-default" not found Apr 16 22:16:15.590911 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:15.590873 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-registry-tls\") pod \"image-registry-7fb78ff8db-pl67h\" (UID: \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\") " pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" Apr 16 22:16:15.590911 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:15.590915 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-service-ca-bundle\") pod \"router-default-68f7bcf4cd-4j647\" (UID: \"f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07\") " pod="openshift-ingress/router-default-68f7bcf4cd-4j647" Apr 16 22:16:15.591433 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:15.590950 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-metrics-certs\") pod \"router-default-68f7bcf4cd-4j647\" (UID: \"f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07\") " pod="openshift-ingress/router-default-68f7bcf4cd-4j647" Apr 16 22:16:15.591433 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:15.590970 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5a0b0ab0-57d0-4640-93b0-5859a8af3aa5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-p52zz\" (UID: \"5a0b0ab0-57d0-4640-93b0-5859a8af3aa5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p52zz" Apr 16 22:16:15.591763 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:15.591733 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-service-ca-bundle\") pod \"router-default-68f7bcf4cd-4j647\" (UID: \"f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07\") " pod="openshift-ingress/router-default-68f7bcf4cd-4j647" Apr 16 22:16:15.593439 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:15.593397 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5a0b0ab0-57d0-4640-93b0-5859a8af3aa5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-p52zz\" (UID: \"5a0b0ab0-57d0-4640-93b0-5859a8af3aa5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p52zz" Apr 16 22:16:15.593525 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:15.593470 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-registry-tls\") pod \"image-registry-7fb78ff8db-pl67h\" (UID: \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\") " pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" Apr 16 22:16:15.593965 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:15.593941 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07-metrics-certs\") pod \"router-default-68f7bcf4cd-4j647\" (UID: \"f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07\") " pod="openshift-ingress/router-default-68f7bcf4cd-4j647" Apr 16 22:16:15.657765 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:15.657729 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p52zz" Apr 16 22:16:15.666513 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:15.666487 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-68f7bcf4cd-4j647" Apr 16 22:16:15.673358 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:15.673329 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" Apr 16 22:16:15.798769 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:15.798662 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-p52zz"] Apr 16 22:16:15.801334 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:16:15.801287 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a0b0ab0_57d0_4640_93b0_5859a8af3aa5.slice/crio-be7737507e8cec20e6114443b44e27cc0043094086587bd6c855f8e54458bc5a WatchSource:0}: Error finding container be7737507e8cec20e6114443b44e27cc0043094086587bd6c855f8e54458bc5a: Status 404 returned error can't find the container with id be7737507e8cec20e6114443b44e27cc0043094086587bd6c855f8e54458bc5a Apr 16 22:16:16.026236 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:16.026206 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7fb78ff8db-pl67h"] Apr 16 22:16:16.029597 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:16.029544 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-68f7bcf4cd-4j647"] Apr 16 22:16:16.030371 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:16:16.030343 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1c8a303_1d3d_4a9d_b1b2_9f6166516cad.slice/crio-cd8b56895985f067c46df81495394127186bb48cd37697cfe0a260961bfec331 WatchSource:0}: Error finding container cd8b56895985f067c46df81495394127186bb48cd37697cfe0a260961bfec331: Status 404 returned error can't find the container with id cd8b56895985f067c46df81495394127186bb48cd37697cfe0a260961bfec331 Apr 16 22:16:16.033113 
Apr 16 22:16:16.033113 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:16:16.033086 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1f1b4a3_4e97_42c1_8e7a_224c24e6fa07.slice/crio-5acb9d69a482758a6184c29d8cca1b2005a7e6cbbd7aa03895f7de161f204660 WatchSource:0}: Error finding container 5acb9d69a482758a6184c29d8cca1b2005a7e6cbbd7aa03895f7de161f204660: Status 404 returned error can't find the container with id 5acb9d69a482758a6184c29d8cca1b2005a7e6cbbd7aa03895f7de161f204660
Apr 16 22:16:16.432876 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:16.432809 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" event={"ID":"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad","Type":"ContainerStarted","Data":"db90dc69139879531a23599244a2fb1d751429caed06ae2623333b9aeb5e7b2e"}
Apr 16 22:16:16.432876 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:16.432851 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" event={"ID":"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad","Type":"ContainerStarted","Data":"cd8b56895985f067c46df81495394127186bb48cd37697cfe0a260961bfec331"}
Apr 16 22:16:16.433132 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:16.432950 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h"
Apr 16 22:16:16.434480 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:16.434395 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-68f7bcf4cd-4j647" event={"ID":"f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07","Type":"ContainerStarted","Data":"f50981c6c883521969e6857b6e484f6bdbb81fc22d3c78d3e29427ea754b1085"}
Apr 16 22:16:16.434480 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:16.434445 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-68f7bcf4cd-4j647" event={"ID":"f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07","Type":"ContainerStarted","Data":"5acb9d69a482758a6184c29d8cca1b2005a7e6cbbd7aa03895f7de161f204660"}
Apr 16 22:16:16.435605 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:16.435580 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p52zz" event={"ID":"5a0b0ab0-57d0-4640-93b0-5859a8af3aa5","Type":"ContainerStarted","Data":"be7737507e8cec20e6114443b44e27cc0043094086587bd6c855f8e54458bc5a"}
Apr 16 22:16:16.453142 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:16.453089 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" podStartSLOduration=17.453073 podStartE2EDuration="17.453073s" podCreationTimestamp="2026-04-16 22:15:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:16:16.453012479 +0000 UTC m=+149.998753907" watchObservedRunningTime="2026-04-16 22:16:16.453073 +0000 UTC m=+149.998814432"
Apr 16 22:16:16.473985 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:16.473937 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-68f7bcf4cd-4j647" podStartSLOduration=17.473919144 podStartE2EDuration="17.473919144s" podCreationTimestamp="2026-04-16 22:15:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC"
observedRunningTime="2026-04-16 22:16:16.473029386 +0000 UTC m=+150.018770814" watchObservedRunningTime="2026-04-16 22:16:16.473919144 +0000 UTC m=+150.019660568" Apr 16 22:16:16.667478 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:16.667435 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-68f7bcf4cd-4j647" Apr 16 22:16:16.670509 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:16.670482 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-68f7bcf4cd-4j647" Apr 16 22:16:17.438062 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:17.437968 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-68f7bcf4cd-4j647" Apr 16 22:16:17.439513 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:17.439492 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-68f7bcf4cd-4j647" Apr 16 22:16:18.441888 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:18.441846 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p52zz" event={"ID":"5a0b0ab0-57d0-4640-93b0-5859a8af3aa5","Type":"ContainerStarted","Data":"7e039b0756c147279ad7f8ae0bba32916076f212361812483dbbab2425168ed8"} Apr 16 22:16:18.460944 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:18.460879 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p52zz" podStartSLOduration=17.866359673 podStartE2EDuration="19.460859226s" podCreationTimestamp="2026-04-16 22:15:59 +0000 UTC" firstStartedPulling="2026-04-16 22:16:15.803639742 +0000 UTC m=+149.349381146" lastFinishedPulling="2026-04-16 22:16:17.39813928 +0000 UTC m=+150.943880699" observedRunningTime="2026-04-16 22:16:18.459567437 +0000 UTC m=+152.005308857" watchObservedRunningTime="2026-04-16 22:16:18.460859226 +0000 UTC m=+152.006600655" Apr 16 22:16:22.393638 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:22.393590 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-86xzj" podUID="94b66058-cae9-48ec-b576-71611c7b606e" Apr 16 22:16:22.407809 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:22.407768 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-gxxg4" podUID="e4a447af-7f68-4189-bc97-af653fe8ba76" Apr 16 22:16:22.450788 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:22.450762 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-86xzj" Apr 16 22:16:24.026065 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:24.026021 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-qz5vc" podUID="e06e94b1-2063-48f2-b8a7-0d0e4193f064" Apr 16 22:16:25.742573 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:25.742540 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-tdvzz"] Apr 16 22:16:25.750896 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:25.750873 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-tdvzz" Apr 16 22:16:25.756444 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:25.756397 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 22:16:25.756668 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:25.756462 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7fb78ff8db-pl67h"] Apr 16 22:16:25.757085 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:25.757046 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 22:16:25.757382 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:25.757359 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-hc5p5\"" Apr 16 22:16:25.757573 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:25.757372 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 22:16:25.757779 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:25.757761 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 22:16:25.765540 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:25.765517 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-tdvzz"] Apr 16 22:16:25.873591 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:25.873558 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0de9bb94-5ef7-41cb-8fa8-5ddbc1aafa5a-crio-socket\") pod \"insights-runtime-extractor-tdvzz\" (UID: \"0de9bb94-5ef7-41cb-8fa8-5ddbc1aafa5a\") " pod="openshift-insights/insights-runtime-extractor-tdvzz" Apr 16 22:16:25.873731 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:25.873632 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0de9bb94-5ef7-41cb-8fa8-5ddbc1aafa5a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tdvzz\" (UID: \"0de9bb94-5ef7-41cb-8fa8-5ddbc1aafa5a\") " pod="openshift-insights/insights-runtime-extractor-tdvzz" Apr 16 22:16:25.873731 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:25.873662 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ghld\" (UniqueName: \"kubernetes.io/projected/0de9bb94-5ef7-41cb-8fa8-5ddbc1aafa5a-kube-api-access-7ghld\") pod 
\"insights-runtime-extractor-tdvzz\" (UID: \"0de9bb94-5ef7-41cb-8fa8-5ddbc1aafa5a\") " pod="openshift-insights/insights-runtime-extractor-tdvzz" Apr 16 22:16:25.873818 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:25.873766 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0de9bb94-5ef7-41cb-8fa8-5ddbc1aafa5a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tdvzz\" (UID: \"0de9bb94-5ef7-41cb-8fa8-5ddbc1aafa5a\") " pod="openshift-insights/insights-runtime-extractor-tdvzz" Apr 16 22:16:25.873849 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:25.873828 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0de9bb94-5ef7-41cb-8fa8-5ddbc1aafa5a-data-volume\") pod \"insights-runtime-extractor-tdvzz\" (UID: \"0de9bb94-5ef7-41cb-8fa8-5ddbc1aafa5a\") " pod="openshift-insights/insights-runtime-extractor-tdvzz" Apr 16 22:16:25.974789 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:25.974756 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0de9bb94-5ef7-41cb-8fa8-5ddbc1aafa5a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tdvzz\" (UID: \"0de9bb94-5ef7-41cb-8fa8-5ddbc1aafa5a\") " pod="openshift-insights/insights-runtime-extractor-tdvzz" Apr 16 22:16:25.974946 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:25.974811 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0de9bb94-5ef7-41cb-8fa8-5ddbc1aafa5a-data-volume\") pod \"insights-runtime-extractor-tdvzz\" (UID: \"0de9bb94-5ef7-41cb-8fa8-5ddbc1aafa5a\") " pod="openshift-insights/insights-runtime-extractor-tdvzz" Apr 16 22:16:25.974946 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:25.974835 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0de9bb94-5ef7-41cb-8fa8-5ddbc1aafa5a-crio-socket\") pod \"insights-runtime-extractor-tdvzz\" (UID: \"0de9bb94-5ef7-41cb-8fa8-5ddbc1aafa5a\") " pod="openshift-insights/insights-runtime-extractor-tdvzz" Apr 16 22:16:25.974946 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:25.974875 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0de9bb94-5ef7-41cb-8fa8-5ddbc1aafa5a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tdvzz\" (UID: \"0de9bb94-5ef7-41cb-8fa8-5ddbc1aafa5a\") " pod="openshift-insights/insights-runtime-extractor-tdvzz" Apr 16 22:16:25.974946 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:25.974895 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7ghld\" (UniqueName: \"kubernetes.io/projected/0de9bb94-5ef7-41cb-8fa8-5ddbc1aafa5a-kube-api-access-7ghld\") pod \"insights-runtime-extractor-tdvzz\" (UID: \"0de9bb94-5ef7-41cb-8fa8-5ddbc1aafa5a\") " pod="openshift-insights/insights-runtime-extractor-tdvzz" Apr 16 22:16:25.975185 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:25.974979 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0de9bb94-5ef7-41cb-8fa8-5ddbc1aafa5a-crio-socket\") pod \"insights-runtime-extractor-tdvzz\" (UID: \"0de9bb94-5ef7-41cb-8fa8-5ddbc1aafa5a\") " 
pod="openshift-insights/insights-runtime-extractor-tdvzz" Apr 16 22:16:25.975249 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:25.975196 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0de9bb94-5ef7-41cb-8fa8-5ddbc1aafa5a-data-volume\") pod \"insights-runtime-extractor-tdvzz\" (UID: \"0de9bb94-5ef7-41cb-8fa8-5ddbc1aafa5a\") " pod="openshift-insights/insights-runtime-extractor-tdvzz" Apr 16 22:16:25.975335 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:25.975315 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0de9bb94-5ef7-41cb-8fa8-5ddbc1aafa5a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tdvzz\" (UID: \"0de9bb94-5ef7-41cb-8fa8-5ddbc1aafa5a\") " pod="openshift-insights/insights-runtime-extractor-tdvzz" Apr 16 22:16:25.977186 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:25.977164 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0de9bb94-5ef7-41cb-8fa8-5ddbc1aafa5a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tdvzz\" (UID: \"0de9bb94-5ef7-41cb-8fa8-5ddbc1aafa5a\") " pod="openshift-insights/insights-runtime-extractor-tdvzz" Apr 16 22:16:25.984048 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:25.984023 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ghld\" (UniqueName: \"kubernetes.io/projected/0de9bb94-5ef7-41cb-8fa8-5ddbc1aafa5a-kube-api-access-7ghld\") pod \"insights-runtime-extractor-tdvzz\" (UID: \"0de9bb94-5ef7-41cb-8fa8-5ddbc1aafa5a\") " pod="openshift-insights/insights-runtime-extractor-tdvzz" Apr 16 22:16:26.102603 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:26.102516 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-tdvzz" Apr 16 22:16:26.223600 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:26.223569 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-tdvzz"] Apr 16 22:16:26.226789 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:16:26.226762 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0de9bb94_5ef7_41cb_8fa8_5ddbc1aafa5a.slice/crio-249e386895430e24d736909fcbe11c8f81fe6e04e37c860fef2cd07b6ad5fd59 WatchSource:0}: Error finding container 249e386895430e24d736909fcbe11c8f81fe6e04e37c860fef2cd07b6ad5fd59: Status 404 returned error can't find the container with id 249e386895430e24d736909fcbe11c8f81fe6e04e37c860fef2cd07b6ad5fd59 Apr 16 22:16:26.461975 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:26.461940 2565 generic.go:358] "Generic (PLEG): container finished" podID="b3841cbb-cef3-4f92-8234-40baf97239eb" containerID="ae7110c4e651a570ae44cdf928c810e5c4e928477b7107ef3b5ee3b11998029f" exitCode=255 Apr 16 22:16:26.462177 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:26.462016 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7cf5c5998f-qdfxb" event={"ID":"b3841cbb-cef3-4f92-8234-40baf97239eb","Type":"ContainerDied","Data":"ae7110c4e651a570ae44cdf928c810e5c4e928477b7107ef3b5ee3b11998029f"} Apr 16 22:16:26.463303 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:26.463279 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tdvzz" event={"ID":"0de9bb94-5ef7-41cb-8fa8-5ddbc1aafa5a","Type":"ContainerStarted","Data":"fdd5caaa16bf5ad9ce8f6f3183f2204c37864729e58f9c5600a3691ee1c852dd"} Apr 16 22:16:26.463303 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:26.463305 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tdvzz" event={"ID":"0de9bb94-5ef7-41cb-8fa8-5ddbc1aafa5a","Type":"ContainerStarted","Data":"249e386895430e24d736909fcbe11c8f81fe6e04e37c860fef2cd07b6ad5fd59"} Apr 16 22:16:26.467714 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:26.467697 2565 scope.go:117] "RemoveContainer" containerID="ae7110c4e651a570ae44cdf928c810e5c4e928477b7107ef3b5ee3b11998029f" Apr 16 22:16:27.387265 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:27.387179 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4a447af-7f68-4189-bc97-af653fe8ba76-cert\") pod \"ingress-canary-gxxg4\" (UID: \"e4a447af-7f68-4189-bc97-af653fe8ba76\") " pod="openshift-ingress-canary/ingress-canary-gxxg4" Apr 16 22:16:27.387671 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:27.387268 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/94b66058-cae9-48ec-b576-71611c7b606e-metrics-tls\") pod \"dns-default-86xzj\" (UID: \"94b66058-cae9-48ec-b576-71611c7b606e\") " pod="openshift-dns/dns-default-86xzj" Apr 16 22:16:27.389940 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:27.389910 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4a447af-7f68-4189-bc97-af653fe8ba76-cert\") pod \"ingress-canary-gxxg4\" (UID: \"e4a447af-7f68-4189-bc97-af653fe8ba76\") " pod="openshift-ingress-canary/ingress-canary-gxxg4" Apr 16 22:16:27.390081 
ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:27.390061 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/94b66058-cae9-48ec-b576-71611c7b606e-metrics-tls\") pod \"dns-default-86xzj\" (UID: \"94b66058-cae9-48ec-b576-71611c7b606e\") " pod="openshift-dns/dns-default-86xzj"
Apr 16 22:16:27.467879 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:27.467836 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7cf5c5998f-qdfxb" event={"ID":"b3841cbb-cef3-4f92-8234-40baf97239eb","Type":"ContainerStarted","Data":"101c99b0c2398403afcee733f5fcb42519210ac27039ba2e7c5161eb0618131d"}
Apr 16 22:16:27.470358 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:27.470326 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tdvzz" event={"ID":"0de9bb94-5ef7-41cb-8fa8-5ddbc1aafa5a","Type":"ContainerStarted","Data":"715a9f53456a703d2bbcecb0b144b36117170ec7d75c8d70c49c2cb8e628e1ca"}
Apr 16 22:16:27.553345 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:27.553312 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-66k94\""
Apr 16 22:16:27.562257 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:27.562229 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-86xzj"
Apr 16 22:16:27.700165 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:27.700132 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-86xzj"]
Apr 16 22:16:27.703842 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:16:27.703814 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94b66058_cae9_48ec_b576_71611c7b606e.slice/crio-9aad409caff7ad0e24a9ba100e854ddfc21f16fd75334b0c43b8ead43e392604 WatchSource:0}: Error finding container 9aad409caff7ad0e24a9ba100e854ddfc21f16fd75334b0c43b8ead43e392604: Status 404 returned error can't find the container with id 9aad409caff7ad0e24a9ba100e854ddfc21f16fd75334b0c43b8ead43e392604
Apr 16 22:16:28.474270 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:28.474225 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-86xzj" event={"ID":"94b66058-cae9-48ec-b576-71611c7b606e","Type":"ContainerStarted","Data":"9aad409caff7ad0e24a9ba100e854ddfc21f16fd75334b0c43b8ead43e392604"}
Apr 16 22:16:28.476288 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:28.476255 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tdvzz" event={"ID":"0de9bb94-5ef7-41cb-8fa8-5ddbc1aafa5a","Type":"ContainerStarted","Data":"2d5548d7a01ec44f95013fa2e8ed868d6ae7a4a5df92cd7c1b5ae3c25aa1c023"}
Apr 16 22:16:28.493982 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:28.493880 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-tdvzz" podStartSLOduration=1.5143640870000001 podStartE2EDuration="3.493865591s" podCreationTimestamp="2026-04-16 22:16:25 +0000 UTC" firstStartedPulling="2026-04-16 22:16:26.286150703 +0000 UTC m=+159.831892108" lastFinishedPulling="2026-04-16 22:16:28.265652208 +0000 UTC m=+161.811393612" observedRunningTime="2026-04-16 22:16:28.493516987 +0000 UTC m=+162.039258417" watchObservedRunningTime="2026-04-16 22:16:28.493865591 +0000 UTC m=+162.039607018"
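The pod_startup_latency_tracker record just above carries enough fields to check its own arithmetic: podStartSLOduration equals podStartE2EDuration minus the image-pull window (lastFinishedPulling minus firstStartedPulling). A small Go check using the values from the insights-runtime-extractor-tdvzz record; the field names mirror the log, but the formula is inferred from these numbers rather than quoted from the tracker's source:

    package main

    import (
        "fmt"
        "time"
    )

    func mustParse(s string) time.Time {
        // Layout matches the "2026-04-16 22:16:26.286150703 +0000 UTC" form in the log.
        t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
        if err != nil {
            panic(err)
        }
        return t
    }

    func main() {
        firstStartedPulling := mustParse("2026-04-16 22:16:26.286150703 +0000 UTC")
        lastFinishedPulling := mustParse("2026-04-16 22:16:28.265652208 +0000 UTC")
        podStartE2EDuration := 3493865591 * time.Nanosecond // "3.493865591s"

        pullWindow := lastFinishedPulling.Sub(firstStartedPulling) // 1.979501505s
        slo := podStartE2EDuration - pullWindow
        // Prints 1.514364086s, matching the logged podStartSLOduration
        // of ~1.514364087 up to float rounding in the m=+ offsets.
        fmt.Println(slo)
    }

The same relation holds for the earlier records, e.g. service-ca-operator-d6fc45fc5-pnp6h at 22:16:02, where 3.416857646s minus the 1.661194553s pull window gives the logged 1.755663091 within a few nanoseconds of rounding; records with zero-value pull timestamps ("0001-01-01 00:00:00 +0000 UTC") show SLO and E2E durations that are simply equal.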
Apr 16 22:16:28.971373 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:28.971338 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-56clp"]
Apr 16 22:16:28.974847 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:28.974793 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-56clp"
Apr 16 22:16:28.978861 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:28.978836 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 16 22:16:28.978976 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:28.978872 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 16 22:16:28.979058 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:28.979029 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 22:16:28.979162 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:28.979148 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-7fcvt\""
Apr 16 22:16:28.989498 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:28.989466 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-56clp"]
Apr 16 22:16:29.102564 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:29.102530 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e7dc2dad-9fd4-4d53-a31b-c7979e4fd34a-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-56clp\" (UID: \"e7dc2dad-9fd4-4d53-a31b-c7979e4fd34a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-56clp"
Apr 16 22:16:29.102750 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:29.102602 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9276l\" (UniqueName: \"kubernetes.io/projected/e7dc2dad-9fd4-4d53-a31b-c7979e4fd34a-kube-api-access-9276l\") pod \"prometheus-operator-5676c8c784-56clp\" (UID: \"e7dc2dad-9fd4-4d53-a31b-c7979e4fd34a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-56clp"
Apr 16 22:16:29.102750 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:29.102644 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/e7dc2dad-9fd4-4d53-a31b-c7979e4fd34a-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-56clp\" (UID: \"e7dc2dad-9fd4-4d53-a31b-c7979e4fd34a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-56clp"
Apr 16 22:16:29.102750 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:29.102672 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e7dc2dad-9fd4-4d53-a31b-c7979e4fd34a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-56clp\" (UID: \"e7dc2dad-9fd4-4d53-a31b-c7979e4fd34a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-56clp"
Apr 16 22:16:29.203714 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:29.203673 2565 reconciler_common.go:224] "operationExecutor.MountVolume
started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e7dc2dad-9fd4-4d53-a31b-c7979e4fd34a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-56clp\" (UID: \"e7dc2dad-9fd4-4d53-a31b-c7979e4fd34a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-56clp" Apr 16 22:16:29.203917 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:29.203763 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e7dc2dad-9fd4-4d53-a31b-c7979e4fd34a-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-56clp\" (UID: \"e7dc2dad-9fd4-4d53-a31b-c7979e4fd34a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-56clp" Apr 16 22:16:29.203917 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:29.203824 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9276l\" (UniqueName: \"kubernetes.io/projected/e7dc2dad-9fd4-4d53-a31b-c7979e4fd34a-kube-api-access-9276l\") pod \"prometheus-operator-5676c8c784-56clp\" (UID: \"e7dc2dad-9fd4-4d53-a31b-c7979e4fd34a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-56clp" Apr 16 22:16:29.203917 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:29.203863 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/e7dc2dad-9fd4-4d53-a31b-c7979e4fd34a-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-56clp\" (UID: \"e7dc2dad-9fd4-4d53-a31b-c7979e4fd34a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-56clp" Apr 16 22:16:29.204807 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:29.204785 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e7dc2dad-9fd4-4d53-a31b-c7979e4fd34a-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-56clp\" (UID: \"e7dc2dad-9fd4-4d53-a31b-c7979e4fd34a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-56clp" Apr 16 22:16:29.206886 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:29.206856 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e7dc2dad-9fd4-4d53-a31b-c7979e4fd34a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-56clp\" (UID: \"e7dc2dad-9fd4-4d53-a31b-c7979e4fd34a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-56clp" Apr 16 22:16:29.207175 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:29.207153 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/e7dc2dad-9fd4-4d53-a31b-c7979e4fd34a-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-56clp\" (UID: \"e7dc2dad-9fd4-4d53-a31b-c7979e4fd34a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-56clp" Apr 16 22:16:29.216186 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:29.216156 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9276l\" (UniqueName: \"kubernetes.io/projected/e7dc2dad-9fd4-4d53-a31b-c7979e4fd34a-kube-api-access-9276l\") pod \"prometheus-operator-5676c8c784-56clp\" (UID: \"e7dc2dad-9fd4-4d53-a31b-c7979e4fd34a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-56clp" Apr 16 22:16:29.286794 ip-10-0-141-169 kubenswrapper[2565]: I0416 
22:16:29.286765 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-56clp" Apr 16 22:16:29.440473 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:29.440450 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-56clp"] Apr 16 22:16:29.443049 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:16:29.443021 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7dc2dad_9fd4_4d53_a31b_c7979e4fd34a.slice/crio-f477535fc33b7f817305ce465c7d1e566e93c0023cb6f3d63af1d69d82600fae WatchSource:0}: Error finding container f477535fc33b7f817305ce465c7d1e566e93c0023cb6f3d63af1d69d82600fae: Status 404 returned error can't find the container with id f477535fc33b7f817305ce465c7d1e566e93c0023cb6f3d63af1d69d82600fae Apr 16 22:16:29.479820 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:29.479785 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-56clp" event={"ID":"e7dc2dad-9fd4-4d53-a31b-c7979e4fd34a","Type":"ContainerStarted","Data":"f477535fc33b7f817305ce465c7d1e566e93c0023cb6f3d63af1d69d82600fae"} Apr 16 22:16:29.481390 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:29.481359 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-86xzj" event={"ID":"94b66058-cae9-48ec-b576-71611c7b606e","Type":"ContainerStarted","Data":"779ffc3ac9e455c83208561fe548318a2af64dd4de4e21f210fba4e8a2bbc673"} Apr 16 22:16:29.481390 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:29.481392 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-86xzj" event={"ID":"94b66058-cae9-48ec-b576-71611c7b606e","Type":"ContainerStarted","Data":"5db719cc9f738a828b345e4a590247fc3c0e3fdd0e6bf95ec7d6615fcfe6e1cf"} Apr 16 22:16:29.481651 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:29.481632 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-86xzj" Apr 16 22:16:29.498767 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:29.498691 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-86xzj" podStartSLOduration=129.04467009 podStartE2EDuration="2m10.498677725s" podCreationTimestamp="2026-04-16 22:14:19 +0000 UTC" firstStartedPulling="2026-04-16 22:16:27.706052537 +0000 UTC m=+161.251793949" lastFinishedPulling="2026-04-16 22:16:29.160060179 +0000 UTC m=+162.705801584" observedRunningTime="2026-04-16 22:16:29.498167934 +0000 UTC m=+163.043909362" watchObservedRunningTime="2026-04-16 22:16:29.498677725 +0000 UTC m=+163.044419190" Apr 16 22:16:31.488725 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:31.488692 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-56clp" event={"ID":"e7dc2dad-9fd4-4d53-a31b-c7979e4fd34a","Type":"ContainerStarted","Data":"82d414fddced51f024ef19ac81392416d897f97d3450e6f711bb9effb7f0bd1b"} Apr 16 22:16:31.488725 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:31.488726 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-56clp" event={"ID":"e7dc2dad-9fd4-4d53-a31b-c7979e4fd34a","Type":"ContainerStarted","Data":"e10d94fa9fb07ace373d6861739633d3321133b6d5673af5257520626ae7a191"} Apr 16 22:16:31.504652 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:31.504611 2565 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-56clp" podStartSLOduration=2.245309401 podStartE2EDuration="3.504596728s" podCreationTimestamp="2026-04-16 22:16:28 +0000 UTC" firstStartedPulling="2026-04-16 22:16:29.444868057 +0000 UTC m=+162.990609465" lastFinishedPulling="2026-04-16 22:16:30.704155376 +0000 UTC m=+164.249896792" observedRunningTime="2026-04-16 22:16:31.504023724 +0000 UTC m=+165.049765152" watchObservedRunningTime="2026-04-16 22:16:31.504596728 +0000 UTC m=+165.050338173" Apr 16 22:16:33.370428 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.370379 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-xvw5s"] Apr 16 22:16:33.374026 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.373994 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-xvw5s" Apr 16 22:16:33.376259 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.376237 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 22:16:33.376375 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.376321 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-ntpqd"] Apr 16 22:16:33.376627 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.376611 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 22:16:33.377048 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.377014 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 22:16:33.377762 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.377740 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-lwjsn\"" Apr 16 22:16:33.379759 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.379740 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-ntpqd" Apr 16 22:16:33.382141 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.382121 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 22:16:33.382319 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.382301 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 22:16:33.382458 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.382434 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-mfcw8\"" Apr 16 22:16:33.382571 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.382445 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 22:16:33.397454 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.397400 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-xvw5s"] Apr 16 22:16:33.439749 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.439721 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/062d7229-a013-4380-ab02-82ecd7a903da-sys\") pod \"node-exporter-ntpqd\" (UID: \"062d7229-a013-4380-ab02-82ecd7a903da\") " pod="openshift-monitoring/node-exporter-ntpqd" Apr 16 22:16:33.439749 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.439753 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/062d7229-a013-4380-ab02-82ecd7a903da-node-exporter-textfile\") pod \"node-exporter-ntpqd\" (UID: \"062d7229-a013-4380-ab02-82ecd7a903da\") " pod="openshift-monitoring/node-exporter-ntpqd" Apr 16 22:16:33.440088 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.439784 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/062d7229-a013-4380-ab02-82ecd7a903da-node-exporter-accelerators-collector-config\") pod \"node-exporter-ntpqd\" (UID: \"062d7229-a013-4380-ab02-82ecd7a903da\") " pod="openshift-monitoring/node-exporter-ntpqd" Apr 16 22:16:33.440088 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.439826 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/36aa52bd-69d4-437f-9e73-2f05a7ae660e-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-xvw5s\" (UID: \"36aa52bd-69d4-437f-9e73-2f05a7ae660e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xvw5s" Apr 16 22:16:33.440088 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.439884 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/062d7229-a013-4380-ab02-82ecd7a903da-root\") pod \"node-exporter-ntpqd\" (UID: \"062d7229-a013-4380-ab02-82ecd7a903da\") " pod="openshift-monitoring/node-exporter-ntpqd" Apr 16 22:16:33.440088 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.439919 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtpxm\" 
(UniqueName: \"kubernetes.io/projected/062d7229-a013-4380-ab02-82ecd7a903da-kube-api-access-gtpxm\") pod \"node-exporter-ntpqd\" (UID: \"062d7229-a013-4380-ab02-82ecd7a903da\") " pod="openshift-monitoring/node-exporter-ntpqd" Apr 16 22:16:33.440088 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.439937 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/062d7229-a013-4380-ab02-82ecd7a903da-node-exporter-wtmp\") pod \"node-exporter-ntpqd\" (UID: \"062d7229-a013-4380-ab02-82ecd7a903da\") " pod="openshift-monitoring/node-exporter-ntpqd" Apr 16 22:16:33.440088 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.439957 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbt55\" (UniqueName: \"kubernetes.io/projected/36aa52bd-69d4-437f-9e73-2f05a7ae660e-kube-api-access-cbt55\") pod \"kube-state-metrics-69db897b98-xvw5s\" (UID: \"36aa52bd-69d4-437f-9e73-2f05a7ae660e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xvw5s" Apr 16 22:16:33.440088 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.439982 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/062d7229-a013-4380-ab02-82ecd7a903da-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ntpqd\" (UID: \"062d7229-a013-4380-ab02-82ecd7a903da\") " pod="openshift-monitoring/node-exporter-ntpqd" Apr 16 22:16:33.440088 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.440046 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/36aa52bd-69d4-437f-9e73-2f05a7ae660e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-xvw5s\" (UID: \"36aa52bd-69d4-437f-9e73-2f05a7ae660e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xvw5s" Apr 16 22:16:33.440088 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.440094 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/062d7229-a013-4380-ab02-82ecd7a903da-metrics-client-ca\") pod \"node-exporter-ntpqd\" (UID: \"062d7229-a013-4380-ab02-82ecd7a903da\") " pod="openshift-monitoring/node-exporter-ntpqd" Apr 16 22:16:33.440458 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.440137 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/36aa52bd-69d4-437f-9e73-2f05a7ae660e-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-xvw5s\" (UID: \"36aa52bd-69d4-437f-9e73-2f05a7ae660e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xvw5s" Apr 16 22:16:33.440458 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.440154 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/062d7229-a013-4380-ab02-82ecd7a903da-node-exporter-tls\") pod \"node-exporter-ntpqd\" (UID: \"062d7229-a013-4380-ab02-82ecd7a903da\") " pod="openshift-monitoring/node-exporter-ntpqd" Apr 16 22:16:33.440458 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.440177 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/36aa52bd-69d4-437f-9e73-2f05a7ae660e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-xvw5s\" (UID: \"36aa52bd-69d4-437f-9e73-2f05a7ae660e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xvw5s" Apr 16 22:16:33.440458 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.440216 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/36aa52bd-69d4-437f-9e73-2f05a7ae660e-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-xvw5s\" (UID: \"36aa52bd-69d4-437f-9e73-2f05a7ae660e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xvw5s" Apr 16 22:16:33.541396 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.541359 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/062d7229-a013-4380-ab02-82ecd7a903da-node-exporter-tls\") pod \"node-exporter-ntpqd\" (UID: \"062d7229-a013-4380-ab02-82ecd7a903da\") " pod="openshift-monitoring/node-exporter-ntpqd" Apr 16 22:16:33.541396 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.541400 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/36aa52bd-69d4-437f-9e73-2f05a7ae660e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-xvw5s\" (UID: \"36aa52bd-69d4-437f-9e73-2f05a7ae660e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xvw5s" Apr 16 22:16:33.541744 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.541445 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/36aa52bd-69d4-437f-9e73-2f05a7ae660e-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-xvw5s\" (UID: \"36aa52bd-69d4-437f-9e73-2f05a7ae660e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xvw5s" Apr 16 22:16:33.541744 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.541477 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/062d7229-a013-4380-ab02-82ecd7a903da-sys\") pod \"node-exporter-ntpqd\" (UID: \"062d7229-a013-4380-ab02-82ecd7a903da\") " pod="openshift-monitoring/node-exporter-ntpqd" Apr 16 22:16:33.541744 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.541500 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/062d7229-a013-4380-ab02-82ecd7a903da-node-exporter-textfile\") pod \"node-exporter-ntpqd\" (UID: \"062d7229-a013-4380-ab02-82ecd7a903da\") " pod="openshift-monitoring/node-exporter-ntpqd" Apr 16 22:16:33.541744 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.541528 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/062d7229-a013-4380-ab02-82ecd7a903da-node-exporter-accelerators-collector-config\") pod \"node-exporter-ntpqd\" (UID: \"062d7229-a013-4380-ab02-82ecd7a903da\") " pod="openshift-monitoring/node-exporter-ntpqd" Apr 16 22:16:33.541744 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.541560 2565 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/062d7229-a013-4380-ab02-82ecd7a903da-sys\") pod \"node-exporter-ntpqd\" (UID: \"062d7229-a013-4380-ab02-82ecd7a903da\") " pod="openshift-monitoring/node-exporter-ntpqd" Apr 16 22:16:33.541744 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.541663 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/36aa52bd-69d4-437f-9e73-2f05a7ae660e-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-xvw5s\" (UID: \"36aa52bd-69d4-437f-9e73-2f05a7ae660e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xvw5s" Apr 16 22:16:33.541744 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.541705 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/062d7229-a013-4380-ab02-82ecd7a903da-root\") pod \"node-exporter-ntpqd\" (UID: \"062d7229-a013-4380-ab02-82ecd7a903da\") " pod="openshift-monitoring/node-exporter-ntpqd" Apr 16 22:16:33.541744 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.541741 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gtpxm\" (UniqueName: \"kubernetes.io/projected/062d7229-a013-4380-ab02-82ecd7a903da-kube-api-access-gtpxm\") pod \"node-exporter-ntpqd\" (UID: \"062d7229-a013-4380-ab02-82ecd7a903da\") " pod="openshift-monitoring/node-exporter-ntpqd" Apr 16 22:16:33.542144 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.541769 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/062d7229-a013-4380-ab02-82ecd7a903da-node-exporter-wtmp\") pod \"node-exporter-ntpqd\" (UID: \"062d7229-a013-4380-ab02-82ecd7a903da\") " pod="openshift-monitoring/node-exporter-ntpqd" Apr 16 22:16:33.542144 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.541801 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbt55\" (UniqueName: \"kubernetes.io/projected/36aa52bd-69d4-437f-9e73-2f05a7ae660e-kube-api-access-cbt55\") pod \"kube-state-metrics-69db897b98-xvw5s\" (UID: \"36aa52bd-69d4-437f-9e73-2f05a7ae660e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xvw5s" Apr 16 22:16:33.542144 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:33.541811 2565 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 16 22:16:33.542144 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.541826 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/062d7229-a013-4380-ab02-82ecd7a903da-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ntpqd\" (UID: \"062d7229-a013-4380-ab02-82ecd7a903da\") " pod="openshift-monitoring/node-exporter-ntpqd" Apr 16 22:16:33.542144 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.541856 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/36aa52bd-69d4-437f-9e73-2f05a7ae660e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-xvw5s\" (UID: \"36aa52bd-69d4-437f-9e73-2f05a7ae660e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xvw5s" Apr 16 
22:16:33.542144 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:33.541869 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36aa52bd-69d4-437f-9e73-2f05a7ae660e-kube-state-metrics-tls podName:36aa52bd-69d4-437f-9e73-2f05a7ae660e nodeName:}" failed. No retries permitted until 2026-04-16 22:16:34.041850981 +0000 UTC m=+167.587592402 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/36aa52bd-69d4-437f-9e73-2f05a7ae660e-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-xvw5s" (UID: "36aa52bd-69d4-437f-9e73-2f05a7ae660e") : secret "kube-state-metrics-tls" not found Apr 16 22:16:33.542144 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.541898 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/062d7229-a013-4380-ab02-82ecd7a903da-metrics-client-ca\") pod \"node-exporter-ntpqd\" (UID: \"062d7229-a013-4380-ab02-82ecd7a903da\") " pod="openshift-monitoring/node-exporter-ntpqd" Apr 16 22:16:33.542144 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.541964 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/36aa52bd-69d4-437f-9e73-2f05a7ae660e-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-xvw5s\" (UID: \"36aa52bd-69d4-437f-9e73-2f05a7ae660e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xvw5s" Apr 16 22:16:33.542144 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.542004 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/062d7229-a013-4380-ab02-82ecd7a903da-node-exporter-textfile\") pod \"node-exporter-ntpqd\" (UID: \"062d7229-a013-4380-ab02-82ecd7a903da\") " pod="openshift-monitoring/node-exporter-ntpqd" Apr 16 22:16:33.542144 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.542031 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/062d7229-a013-4380-ab02-82ecd7a903da-node-exporter-wtmp\") pod \"node-exporter-ntpqd\" (UID: \"062d7229-a013-4380-ab02-82ecd7a903da\") " pod="openshift-monitoring/node-exporter-ntpqd" Apr 16 22:16:33.542652 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.542633 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/062d7229-a013-4380-ab02-82ecd7a903da-root\") pod \"node-exporter-ntpqd\" (UID: \"062d7229-a013-4380-ab02-82ecd7a903da\") " pod="openshift-monitoring/node-exporter-ntpqd" Apr 16 22:16:33.542652 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.542642 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/062d7229-a013-4380-ab02-82ecd7a903da-metrics-client-ca\") pod \"node-exporter-ntpqd\" (UID: \"062d7229-a013-4380-ab02-82ecd7a903da\") " pod="openshift-monitoring/node-exporter-ntpqd" Apr 16 22:16:33.542751 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.541895 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/36aa52bd-69d4-437f-9e73-2f05a7ae660e-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-xvw5s\" (UID: \"36aa52bd-69d4-437f-9e73-2f05a7ae660e\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-xvw5s" Apr 16 22:16:33.542751 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.542669 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/36aa52bd-69d4-437f-9e73-2f05a7ae660e-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-xvw5s\" (UID: \"36aa52bd-69d4-437f-9e73-2f05a7ae660e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xvw5s" Apr 16 22:16:33.542907 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.542872 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/36aa52bd-69d4-437f-9e73-2f05a7ae660e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-xvw5s\" (UID: \"36aa52bd-69d4-437f-9e73-2f05a7ae660e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xvw5s" Apr 16 22:16:33.543275 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.543249 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/062d7229-a013-4380-ab02-82ecd7a903da-node-exporter-accelerators-collector-config\") pod \"node-exporter-ntpqd\" (UID: \"062d7229-a013-4380-ab02-82ecd7a903da\") " pod="openshift-monitoring/node-exporter-ntpqd" Apr 16 22:16:33.544266 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.544243 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/062d7229-a013-4380-ab02-82ecd7a903da-node-exporter-tls\") pod \"node-exporter-ntpqd\" (UID: \"062d7229-a013-4380-ab02-82ecd7a903da\") " pod="openshift-monitoring/node-exporter-ntpqd" Apr 16 22:16:33.544379 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.544361 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/36aa52bd-69d4-437f-9e73-2f05a7ae660e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-xvw5s\" (UID: \"36aa52bd-69d4-437f-9e73-2f05a7ae660e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xvw5s" Apr 16 22:16:33.544955 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.544936 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/062d7229-a013-4380-ab02-82ecd7a903da-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ntpqd\" (UID: \"062d7229-a013-4380-ab02-82ecd7a903da\") " pod="openshift-monitoring/node-exporter-ntpqd" Apr 16 22:16:33.551479 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.551458 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbt55\" (UniqueName: \"kubernetes.io/projected/36aa52bd-69d4-437f-9e73-2f05a7ae660e-kube-api-access-cbt55\") pod \"kube-state-metrics-69db897b98-xvw5s\" (UID: \"36aa52bd-69d4-437f-9e73-2f05a7ae660e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xvw5s" Apr 16 22:16:33.552239 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.552220 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtpxm\" (UniqueName: \"kubernetes.io/projected/062d7229-a013-4380-ab02-82ecd7a903da-kube-api-access-gtpxm\") pod \"node-exporter-ntpqd\" (UID: \"062d7229-a013-4380-ab02-82ecd7a903da\") " 
pod="openshift-monitoring/node-exporter-ntpqd" Apr 16 22:16:33.691454 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:33.691423 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-ntpqd" Apr 16 22:16:33.699632 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:16:33.699603 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod062d7229_a013_4380_ab02_82ecd7a903da.slice/crio-e937f6a1c88b7b8e6836371c031ba58646c722af3ee352ee4df6ccd4fbb576b4 WatchSource:0}: Error finding container e937f6a1c88b7b8e6836371c031ba58646c722af3ee352ee4df6ccd4fbb576b4: Status 404 returned error can't find the container with id e937f6a1c88b7b8e6836371c031ba58646c722af3ee352ee4df6ccd4fbb576b4 Apr 16 22:16:34.013856 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.013744 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gxxg4" Apr 16 22:16:34.016464 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.016438 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-b4hjg\"" Apr 16 22:16:34.024960 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.024935 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gxxg4" Apr 16 22:16:34.046096 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.046068 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/36aa52bd-69d4-437f-9e73-2f05a7ae660e-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-xvw5s\" (UID: \"36aa52bd-69d4-437f-9e73-2f05a7ae660e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xvw5s" Apr 16 22:16:34.048393 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.048368 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/36aa52bd-69d4-437f-9e73-2f05a7ae660e-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-xvw5s\" (UID: \"36aa52bd-69d4-437f-9e73-2f05a7ae660e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xvw5s" Apr 16 22:16:34.160524 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.160477 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gxxg4"] Apr 16 22:16:34.163636 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:16:34.163605 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4a447af_7f68_4189_bc97_af653fe8ba76.slice/crio-1c6d90926736d97582c81736b677604f791bb63b20b3d465bb564a41c60ea947 WatchSource:0}: Error finding container 1c6d90926736d97582c81736b677604f791bb63b20b3d465bb564a41c60ea947: Status 404 returned error can't find the container with id 1c6d90926736d97582c81736b677604f791bb63b20b3d465bb564a41c60ea947 Apr 16 22:16:34.286326 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.285450 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-xvw5s" Apr 16 22:16:34.450391 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.449696 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 22:16:34.453962 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.453893 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:34.458526 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.456982 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 22:16:34.458526 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.457245 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-89h67\"" Apr 16 22:16:34.458526 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.457304 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 22:16:34.458526 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.457629 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 22:16:34.458526 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.457640 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 22:16:34.458526 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.457865 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 22:16:34.458526 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.457874 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 22:16:34.458526 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.458077 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 22:16:34.458526 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.458252 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 22:16:34.458526 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.458347 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 22:16:34.468519 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.468492 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 22:16:34.509039 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.509004 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ntpqd" event={"ID":"062d7229-a013-4380-ab02-82ecd7a903da","Type":"ContainerStarted","Data":"e937f6a1c88b7b8e6836371c031ba58646c722af3ee352ee4df6ccd4fbb576b4"} Apr 16 22:16:34.513609 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.513575 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gxxg4" event={"ID":"e4a447af-7f68-4189-bc97-af653fe8ba76","Type":"ContainerStarted","Data":"1c6d90926736d97582c81736b677604f791bb63b20b3d465bb564a41c60ea947"} Apr 
16 22:16:34.541376 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.541347 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-xvw5s"] Apr 16 22:16:34.547788 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:16:34.547752 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36aa52bd_69d4_437f_9e73_2f05a7ae660e.slice/crio-c8f3374fbd499675b244b5ed4040af8ac98a5f10f31b48b35d7d998c200bfffa WatchSource:0}: Error finding container c8f3374fbd499675b244b5ed4040af8ac98a5f10f31b48b35d7d998c200bfffa: Status 404 returned error can't find the container with id c8f3374fbd499675b244b5ed4040af8ac98a5f10f31b48b35d7d998c200bfffa Apr 16 22:16:34.550511 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.550485 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:34.550594 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.550544 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:34.550594 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.550571 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:34.550686 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.550599 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:34.550686 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.550658 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:34.550759 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.550708 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-config-volume\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:34.550759 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.550733 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-tls-assets\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:34.550851 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.550776 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-web-config\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:34.550851 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.550809 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk7gl\" (UniqueName: \"kubernetes.io/projected/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-kube-api-access-vk7gl\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:34.550949 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.550887 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-config-out\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:34.550949 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.550917 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:34.551044 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.551020 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:34.551106 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.551070 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:34.652960 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.652589 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-web-config\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:34.652960 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.652636 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vk7gl\" (UniqueName: \"kubernetes.io/projected/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-kube-api-access-vk7gl\") pod 
\"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:34.652960 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.652681 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-config-out\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:34.652960 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.652710 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:34.652960 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.652764 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:34.652960 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.652801 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:34.652960 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.652845 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:34.652960 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.652872 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:34.652960 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.652902 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:34.652960 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.652932 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 
22:16:34.652960 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.652957 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:34.654014 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.652985 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-config-volume\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:34.654014 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.653018 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-tls-assets\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:34.654014 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:34.653171 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-alertmanager-trusted-ca-bundle podName:4e2b154a-ff32-4fb3-b16d-9f66a5c14404 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:35.153148128 +0000 UTC m=+168.698889540 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "4e2b154a-ff32-4fb3-b16d-9f66a5c14404") : configmap references non-existent config key: ca-bundle.crt Apr 16 22:16:34.654170 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.654139 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:34.654726 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.654429 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:34.656169 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.656144 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:34.656714 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.656688 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-web-config\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:34.657817 ip-10-0-141-169 
kubenswrapper[2565]: I0416 22:16:34.657473 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:34.657943 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.657920 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-config-out\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:34.658635 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.658530 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:34.659245 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.659197 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-tls-assets\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:34.659373 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.659354 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:34.659925 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.659889 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-config-volume\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:34.659995 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.659973 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:34.661505 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:34.661459 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk7gl\" (UniqueName: \"kubernetes.io/projected/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-kube-api-access-vk7gl\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:35.158792 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:35.158752 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:35.159975 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:35.159915 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:35.367958 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:35.367883 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:16:35.517740 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:35.517690 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xvw5s" event={"ID":"36aa52bd-69d4-437f-9e73-2f05a7ae660e","Type":"ContainerStarted","Data":"c8f3374fbd499675b244b5ed4040af8ac98a5f10f31b48b35d7d998c200bfffa"} Apr 16 22:16:35.519301 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:35.519267 2565 generic.go:358] "Generic (PLEG): container finished" podID="062d7229-a013-4380-ab02-82ecd7a903da" containerID="f796214a9be101b16d49d9ea123ff0083ad6026efe28ab5dabf560bdeb0aff1e" exitCode=0 Apr 16 22:16:35.519441 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:35.519309 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ntpqd" event={"ID":"062d7229-a013-4380-ab02-82ecd7a903da","Type":"ContainerDied","Data":"f796214a9be101b16d49d9ea123ff0083ad6026efe28ab5dabf560bdeb0aff1e"} Apr 16 22:16:35.762206 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:35.762166 2565 patch_prober.go:28] interesting pod/image-registry-7fb78ff8db-pl67h container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 22:16:35.762360 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:35.762249 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" podUID="d1c8a303-1d3d-4a9d-b1b2-9f6166516cad" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:16:36.312645 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.309229 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 22:16:36.317882 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:16:36.317851 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e2b154a_ff32_4fb3_b16d_9f66a5c14404.slice/crio-722c1ae03e127877b5b343e0f85ae25dcaa965a6e4d0ab3cf91cca8b8505e000 WatchSource:0}: Error finding container 722c1ae03e127877b5b343e0f85ae25dcaa965a6e4d0ab3cf91cca8b8505e000: Status 404 returned error can't find the container with id 722c1ae03e127877b5b343e0f85ae25dcaa965a6e4d0ab3cf91cca8b8505e000 Apr 16 22:16:36.441759 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.440656 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-85f799bc7d-kk5vv"] Apr 16 22:16:36.444825 
ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.444790 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-85f799bc7d-kk5vv" Apr 16 22:16:36.447653 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.447618 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 16 22:16:36.447865 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.447847 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 16 22:16:36.448063 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.448048 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-6gpmjidkg0u4\"" Apr 16 22:16:36.449851 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.449114 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 16 22:16:36.449851 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.449385 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-nvm9r\"" Apr 16 22:16:36.449851 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.449607 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 16 22:16:36.450294 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.450125 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 16 22:16:36.457284 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.457259 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-85f799bc7d-kk5vv"] Apr 16 22:16:36.525390 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.524541 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xvw5s" event={"ID":"36aa52bd-69d4-437f-9e73-2f05a7ae660e","Type":"ContainerStarted","Data":"3e88eae11a6a1849c8677e9a678dde83d345549618c887770730f9abbed87eef"} Apr 16 22:16:36.525390 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.524580 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xvw5s" event={"ID":"36aa52bd-69d4-437f-9e73-2f05a7ae660e","Type":"ContainerStarted","Data":"a2fe07e8479d94942351447f3797c41987633f766cc28d87c87a8ef0db205260"} Apr 16 22:16:36.525390 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.524597 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xvw5s" event={"ID":"36aa52bd-69d4-437f-9e73-2f05a7ae660e","Type":"ContainerStarted","Data":"0b89bbdfab0466831f1fd39fb89b3ae6c9ec8705537621db594499cff278d8b9"} Apr 16 22:16:36.528607 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.527894 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ntpqd" event={"ID":"062d7229-a013-4380-ab02-82ecd7a903da","Type":"ContainerStarted","Data":"aef89591c2225de08ecc4c4f851099779ae3a51bb40aab56c4c9a377f574bd65"} Apr 16 22:16:36.528607 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.527924 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ntpqd" 
event={"ID":"062d7229-a013-4380-ab02-82ecd7a903da","Type":"ContainerStarted","Data":"35c653b869b1013004b939925f6d86244b6117ea5729f4df9b9e6957bbe4ef44"} Apr 16 22:16:36.530540 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.530516 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gxxg4" event={"ID":"e4a447af-7f68-4189-bc97-af653fe8ba76","Type":"ContainerStarted","Data":"6b297ddca71134e5f2bbbeedf1623655ca53dd4e66abd5a1bfaae0b81a2d8e1e"} Apr 16 22:16:36.531932 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.531913 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4e2b154a-ff32-4fb3-b16d-9f66a5c14404","Type":"ContainerStarted","Data":"722c1ae03e127877b5b343e0f85ae25dcaa965a6e4d0ab3cf91cca8b8505e000"} Apr 16 22:16:36.544116 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.543893 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-xvw5s" podStartSLOduration=1.937972501 podStartE2EDuration="3.543875314s" podCreationTimestamp="2026-04-16 22:16:33 +0000 UTC" firstStartedPulling="2026-04-16 22:16:34.550068171 +0000 UTC m=+168.095809581" lastFinishedPulling="2026-04-16 22:16:36.155970988 +0000 UTC m=+169.701712394" observedRunningTime="2026-04-16 22:16:36.543476525 +0000 UTC m=+170.089217950" watchObservedRunningTime="2026-04-16 22:16:36.543875314 +0000 UTC m=+170.089616742" Apr 16 22:16:36.561565 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.561509 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-ntpqd" podStartSLOduration=2.812893312 podStartE2EDuration="3.56149234s" podCreationTimestamp="2026-04-16 22:16:33 +0000 UTC" firstStartedPulling="2026-04-16 22:16:33.701261379 +0000 UTC m=+167.247002785" lastFinishedPulling="2026-04-16 22:16:34.449860394 +0000 UTC m=+167.995601813" observedRunningTime="2026-04-16 22:16:36.55986468 +0000 UTC m=+170.105606106" watchObservedRunningTime="2026-04-16 22:16:36.56149234 +0000 UTC m=+170.107233767" Apr 16 22:16:36.571435 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.571379 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3f43fe85-9d27-4e45-b966-2c0ce4388b28-secret-thanos-querier-tls\") pod \"thanos-querier-85f799bc7d-kk5vv\" (UID: \"3f43fe85-9d27-4e45-b966-2c0ce4388b28\") " pod="openshift-monitoring/thanos-querier-85f799bc7d-kk5vv" Apr 16 22:16:36.571587 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.571473 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3f43fe85-9d27-4e45-b966-2c0ce4388b28-metrics-client-ca\") pod \"thanos-querier-85f799bc7d-kk5vv\" (UID: \"3f43fe85-9d27-4e45-b966-2c0ce4388b28\") " pod="openshift-monitoring/thanos-querier-85f799bc7d-kk5vv" Apr 16 22:16:36.571587 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.571517 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3f43fe85-9d27-4e45-b966-2c0ce4388b28-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-85f799bc7d-kk5vv\" (UID: \"3f43fe85-9d27-4e45-b966-2c0ce4388b28\") " pod="openshift-monitoring/thanos-querier-85f799bc7d-kk5vv" Apr 16 
22:16:36.571587 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.571569 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljj79\" (UniqueName: \"kubernetes.io/projected/3f43fe85-9d27-4e45-b966-2c0ce4388b28-kube-api-access-ljj79\") pod \"thanos-querier-85f799bc7d-kk5vv\" (UID: \"3f43fe85-9d27-4e45-b966-2c0ce4388b28\") " pod="openshift-monitoring/thanos-querier-85f799bc7d-kk5vv" Apr 16 22:16:36.571730 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.571624 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3f43fe85-9d27-4e45-b966-2c0ce4388b28-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-85f799bc7d-kk5vv\" (UID: \"3f43fe85-9d27-4e45-b966-2c0ce4388b28\") " pod="openshift-monitoring/thanos-querier-85f799bc7d-kk5vv" Apr 16 22:16:36.571730 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.571653 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3f43fe85-9d27-4e45-b966-2c0ce4388b28-secret-grpc-tls\") pod \"thanos-querier-85f799bc7d-kk5vv\" (UID: \"3f43fe85-9d27-4e45-b966-2c0ce4388b28\") " pod="openshift-monitoring/thanos-querier-85f799bc7d-kk5vv" Apr 16 22:16:36.571730 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.571678 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3f43fe85-9d27-4e45-b966-2c0ce4388b28-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-85f799bc7d-kk5vv\" (UID: \"3f43fe85-9d27-4e45-b966-2c0ce4388b28\") " pod="openshift-monitoring/thanos-querier-85f799bc7d-kk5vv" Apr 16 22:16:36.571730 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.571698 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3f43fe85-9d27-4e45-b966-2c0ce4388b28-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-85f799bc7d-kk5vv\" (UID: \"3f43fe85-9d27-4e45-b966-2c0ce4388b28\") " pod="openshift-monitoring/thanos-querier-85f799bc7d-kk5vv" Apr 16 22:16:36.577507 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.577447 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-gxxg4" podStartSLOduration=135.588859252 podStartE2EDuration="2m17.577431336s" podCreationTimestamp="2026-04-16 22:14:19 +0000 UTC" firstStartedPulling="2026-04-16 22:16:34.165739102 +0000 UTC m=+167.711480513" lastFinishedPulling="2026-04-16 22:16:36.154311172 +0000 UTC m=+169.700052597" observedRunningTime="2026-04-16 22:16:36.576617037 +0000 UTC m=+170.122358460" watchObservedRunningTime="2026-04-16 22:16:36.577431336 +0000 UTC m=+170.123172763" Apr 16 22:16:36.673546 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.672909 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3f43fe85-9d27-4e45-b966-2c0ce4388b28-secret-thanos-querier-tls\") pod \"thanos-querier-85f799bc7d-kk5vv\" (UID: \"3f43fe85-9d27-4e45-b966-2c0ce4388b28\") " pod="openshift-monitoring/thanos-querier-85f799bc7d-kk5vv" Apr 16 22:16:36.673546 ip-10-0-141-169 kubenswrapper[2565]: I0416 
22:16:36.673034 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3f43fe85-9d27-4e45-b966-2c0ce4388b28-metrics-client-ca\") pod \"thanos-querier-85f799bc7d-kk5vv\" (UID: \"3f43fe85-9d27-4e45-b966-2c0ce4388b28\") " pod="openshift-monitoring/thanos-querier-85f799bc7d-kk5vv" Apr 16 22:16:36.673546 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.673077 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3f43fe85-9d27-4e45-b966-2c0ce4388b28-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-85f799bc7d-kk5vv\" (UID: \"3f43fe85-9d27-4e45-b966-2c0ce4388b28\") " pod="openshift-monitoring/thanos-querier-85f799bc7d-kk5vv" Apr 16 22:16:36.673546 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.673169 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljj79\" (UniqueName: \"kubernetes.io/projected/3f43fe85-9d27-4e45-b966-2c0ce4388b28-kube-api-access-ljj79\") pod \"thanos-querier-85f799bc7d-kk5vv\" (UID: \"3f43fe85-9d27-4e45-b966-2c0ce4388b28\") " pod="openshift-monitoring/thanos-querier-85f799bc7d-kk5vv" Apr 16 22:16:36.673546 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.673270 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3f43fe85-9d27-4e45-b966-2c0ce4388b28-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-85f799bc7d-kk5vv\" (UID: \"3f43fe85-9d27-4e45-b966-2c0ce4388b28\") " pod="openshift-monitoring/thanos-querier-85f799bc7d-kk5vv" Apr 16 22:16:36.673546 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.673336 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3f43fe85-9d27-4e45-b966-2c0ce4388b28-secret-grpc-tls\") pod \"thanos-querier-85f799bc7d-kk5vv\" (UID: \"3f43fe85-9d27-4e45-b966-2c0ce4388b28\") " pod="openshift-monitoring/thanos-querier-85f799bc7d-kk5vv" Apr 16 22:16:36.673546 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.673369 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3f43fe85-9d27-4e45-b966-2c0ce4388b28-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-85f799bc7d-kk5vv\" (UID: \"3f43fe85-9d27-4e45-b966-2c0ce4388b28\") " pod="openshift-monitoring/thanos-querier-85f799bc7d-kk5vv" Apr 16 22:16:36.673546 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.673472 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3f43fe85-9d27-4e45-b966-2c0ce4388b28-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-85f799bc7d-kk5vv\" (UID: \"3f43fe85-9d27-4e45-b966-2c0ce4388b28\") " pod="openshift-monitoring/thanos-querier-85f799bc7d-kk5vv" Apr 16 22:16:36.675027 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.674995 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3f43fe85-9d27-4e45-b966-2c0ce4388b28-metrics-client-ca\") pod \"thanos-querier-85f799bc7d-kk5vv\" (UID: \"3f43fe85-9d27-4e45-b966-2c0ce4388b28\") " 
pod="openshift-monitoring/thanos-querier-85f799bc7d-kk5vv" Apr 16 22:16:36.676973 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.676945 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3f43fe85-9d27-4e45-b966-2c0ce4388b28-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-85f799bc7d-kk5vv\" (UID: \"3f43fe85-9d27-4e45-b966-2c0ce4388b28\") " pod="openshift-monitoring/thanos-querier-85f799bc7d-kk5vv" Apr 16 22:16:36.677385 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.677355 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3f43fe85-9d27-4e45-b966-2c0ce4388b28-secret-thanos-querier-tls\") pod \"thanos-querier-85f799bc7d-kk5vv\" (UID: \"3f43fe85-9d27-4e45-b966-2c0ce4388b28\") " pod="openshift-monitoring/thanos-querier-85f799bc7d-kk5vv" Apr 16 22:16:36.677515 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.677443 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3f43fe85-9d27-4e45-b966-2c0ce4388b28-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-85f799bc7d-kk5vv\" (UID: \"3f43fe85-9d27-4e45-b966-2c0ce4388b28\") " pod="openshift-monitoring/thanos-querier-85f799bc7d-kk5vv" Apr 16 22:16:36.677754 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.677734 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3f43fe85-9d27-4e45-b966-2c0ce4388b28-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-85f799bc7d-kk5vv\" (UID: \"3f43fe85-9d27-4e45-b966-2c0ce4388b28\") " pod="openshift-monitoring/thanos-querier-85f799bc7d-kk5vv" Apr 16 22:16:36.678155 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.678128 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3f43fe85-9d27-4e45-b966-2c0ce4388b28-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-85f799bc7d-kk5vv\" (UID: \"3f43fe85-9d27-4e45-b966-2c0ce4388b28\") " pod="openshift-monitoring/thanos-querier-85f799bc7d-kk5vv" Apr 16 22:16:36.678657 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.678636 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3f43fe85-9d27-4e45-b966-2c0ce4388b28-secret-grpc-tls\") pod \"thanos-querier-85f799bc7d-kk5vv\" (UID: \"3f43fe85-9d27-4e45-b966-2c0ce4388b28\") " pod="openshift-monitoring/thanos-querier-85f799bc7d-kk5vv" Apr 16 22:16:36.681561 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.681536 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljj79\" (UniqueName: \"kubernetes.io/projected/3f43fe85-9d27-4e45-b966-2c0ce4388b28-kube-api-access-ljj79\") pod \"thanos-querier-85f799bc7d-kk5vv\" (UID: \"3f43fe85-9d27-4e45-b966-2c0ce4388b28\") " pod="openshift-monitoring/thanos-querier-85f799bc7d-kk5vv" Apr 16 22:16:36.764472 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.764435 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-85f799bc7d-kk5vv" Apr 16 22:16:36.911773 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:36.911742 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-85f799bc7d-kk5vv"] Apr 16 22:16:36.914462 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:16:36.914390 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f43fe85_9d27_4e45_b966_2c0ce4388b28.slice/crio-b9af7aab6d62dfba8c2d10719c0d11b1dbcc3d5c80fb19b378eab0ea38cd66a5 WatchSource:0}: Error finding container b9af7aab6d62dfba8c2d10719c0d11b1dbcc3d5c80fb19b378eab0ea38cd66a5: Status 404 returned error can't find the container with id b9af7aab6d62dfba8c2d10719c0d11b1dbcc3d5c80fb19b378eab0ea38cd66a5 Apr 16 22:16:37.536389 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:37.536350 2565 generic.go:358] "Generic (PLEG): container finished" podID="4e2b154a-ff32-4fb3-b16d-9f66a5c14404" containerID="051b7cfaceb1c174d34489bc7d4aaa87661a539b0f6d9e85c1d724744fcf3848" exitCode=0 Apr 16 22:16:37.536971 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:37.536444 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4e2b154a-ff32-4fb3-b16d-9f66a5c14404","Type":"ContainerDied","Data":"051b7cfaceb1c174d34489bc7d4aaa87661a539b0f6d9e85c1d724744fcf3848"} Apr 16 22:16:37.538312 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:37.538269 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-85f799bc7d-kk5vv" event={"ID":"3f43fe85-9d27-4e45-b966-2c0ce4388b28","Type":"ContainerStarted","Data":"b9af7aab6d62dfba8c2d10719c0d11b1dbcc3d5c80fb19b378eab0ea38cd66a5"} Apr 16 22:16:38.114528 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:38.114498 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-kf5c5"] Apr 16 22:16:38.117774 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:38.117754 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kf5c5" Apr 16 22:16:38.120029 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:38.120004 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-brskw\"" Apr 16 22:16:38.120227 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:38.120214 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 16 22:16:38.126610 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:38.126590 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-kf5c5"] Apr 16 22:16:38.195040 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:38.195000 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d72fa396-b4b8-4dc3-995e-8f06c575edb7-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-kf5c5\" (UID: \"d72fa396-b4b8-4dc3-995e-8f06c575edb7\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kf5c5" Apr 16 22:16:38.296417 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:38.296372 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d72fa396-b4b8-4dc3-995e-8f06c575edb7-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-kf5c5\" (UID: \"d72fa396-b4b8-4dc3-995e-8f06c575edb7\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kf5c5" Apr 16 22:16:38.296613 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:38.296540 2565 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 16 22:16:38.296721 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:38.296631 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d72fa396-b4b8-4dc3-995e-8f06c575edb7-monitoring-plugin-cert podName:d72fa396-b4b8-4dc3-995e-8f06c575edb7 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:38.796610376 +0000 UTC m=+172.342351783 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/d72fa396-b4b8-4dc3-995e-8f06c575edb7-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-kf5c5" (UID: "d72fa396-b4b8-4dc3-995e-8f06c575edb7") : secret "monitoring-plugin-cert" not found Apr 16 22:16:38.801248 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:38.801206 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d72fa396-b4b8-4dc3-995e-8f06c575edb7-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-kf5c5\" (UID: \"d72fa396-b4b8-4dc3-995e-8f06c575edb7\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kf5c5" Apr 16 22:16:38.804260 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:38.804232 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d72fa396-b4b8-4dc3-995e-8f06c575edb7-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-kf5c5\" (UID: \"d72fa396-b4b8-4dc3-995e-8f06c575edb7\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kf5c5" Apr 16 22:16:39.013867 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:39.013829 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qz5vc" Apr 16 22:16:39.027314 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:39.027282 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kf5c5" Apr 16 22:16:39.353723 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:39.353643 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-kf5c5"] Apr 16 22:16:39.357108 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:16:39.356091 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd72fa396_b4b8_4dc3_995e_8f06c575edb7.slice/crio-7359bab2650acd55212e53e5a3229ca34084d0493de98a1c9053280c86240024 WatchSource:0}: Error finding container 7359bab2650acd55212e53e5a3229ca34084d0493de98a1c9053280c86240024: Status 404 returned error can't find the container with id 7359bab2650acd55212e53e5a3229ca34084d0493de98a1c9053280c86240024 Apr 16 22:16:39.487092 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:39.487025 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-86xzj" Apr 16 22:16:39.553295 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:39.553255 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-85f799bc7d-kk5vv" event={"ID":"3f43fe85-9d27-4e45-b966-2c0ce4388b28","Type":"ContainerStarted","Data":"80a0446acb7a4005d6ef2332a12dc700ae8f02c8102e6eff0ab6404c6a44af33"} Apr 16 22:16:39.553475 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:39.553302 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-85f799bc7d-kk5vv" event={"ID":"3f43fe85-9d27-4e45-b966-2c0ce4388b28","Type":"ContainerStarted","Data":"ca7151ff7fb5025551fa3e410fb47465a7d3b90e77c7e124505e7c3cf76a0d0d"} Apr 16 22:16:39.553475 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:39.553317 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-85f799bc7d-kk5vv" event={"ID":"3f43fe85-9d27-4e45-b966-2c0ce4388b28","Type":"ContainerStarted","Data":"81387ecb752b03a73f34a93c7a7a956ec15c8892c351c30104decc5c7e186669"} Apr 16 22:16:39.554873 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:39.554820 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kf5c5" event={"ID":"d72fa396-b4b8-4dc3-995e-8f06c575edb7","Type":"ContainerStarted","Data":"7359bab2650acd55212e53e5a3229ca34084d0493de98a1c9053280c86240024"} Apr 16 22:16:39.557360 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:39.557340 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4e2b154a-ff32-4fb3-b16d-9f66a5c14404","Type":"ContainerStarted","Data":"21c614769eb99ff29c8c4c13001d653717275c57a72e0be5cf532451e90370ec"} Apr 16 22:16:39.557448 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:39.557364 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4e2b154a-ff32-4fb3-b16d-9f66a5c14404","Type":"ContainerStarted","Data":"ee645b4856087aa7703798fcc90abaa7fa55bf1e04d7d6644451948307485572"} Apr 16 22:16:39.557448 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:39.557372 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"4e2b154a-ff32-4fb3-b16d-9f66a5c14404","Type":"ContainerStarted","Data":"e9c1e48554932d71a5811ef520be722d4e4bc2bc19cf68fff1e44010aec3ded7"} Apr 16 22:16:39.557448 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:39.557381 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4e2b154a-ff32-4fb3-b16d-9f66a5c14404","Type":"ContainerStarted","Data":"c068af563d34cd05792730a1b4156eff49364b6d00e4898117b069a5cbdec034"} Apr 16 22:16:40.564063 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:40.564022 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4e2b154a-ff32-4fb3-b16d-9f66a5c14404","Type":"ContainerStarted","Data":"65952c16b2cdaf4b1b141f714278db677db295bc49ebd141fcce4ef8c9e19edd"} Apr 16 22:16:40.564063 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:40.564063 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4e2b154a-ff32-4fb3-b16d-9f66a5c14404","Type":"ContainerStarted","Data":"00175334ec34717063f07ba64ff5a4a3573e39db88e65e77809f9ba603320ab9"} Apr 16 22:16:40.567146 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:40.567068 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-85f799bc7d-kk5vv" event={"ID":"3f43fe85-9d27-4e45-b966-2c0ce4388b28","Type":"ContainerStarted","Data":"659391a7b22a348705a5418d8ed035aff48bfd98bbf813fd263279b13a700ede"} Apr 16 22:16:40.567146 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:40.567112 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-85f799bc7d-kk5vv" event={"ID":"3f43fe85-9d27-4e45-b966-2c0ce4388b28","Type":"ContainerStarted","Data":"bf1295d6dd8348c0c45d41d5c3c6c397d07c595f6a7f722d1ce987c4470ac50c"} Apr 16 22:16:40.567146 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:40.567126 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-85f799bc7d-kk5vv" event={"ID":"3f43fe85-9d27-4e45-b966-2c0ce4388b28","Type":"ContainerStarted","Data":"f379b9e92eae1c008da347f4cbe3ff05ec8fa26a2b70bc942cf5e89f3852981a"} Apr 16 22:16:40.567345 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:40.567304 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-85f799bc7d-kk5vv" Apr 16 22:16:40.619617 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:40.619516 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.816413809 podStartE2EDuration="6.61950071s" podCreationTimestamp="2026-04-16 22:16:34 +0000 UTC" firstStartedPulling="2026-04-16 22:16:36.321879718 +0000 UTC m=+169.867621128" lastFinishedPulling="2026-04-16 22:16:40.12496661 +0000 UTC m=+173.670708029" observedRunningTime="2026-04-16 22:16:40.616977364 +0000 UTC m=+174.162718794" watchObservedRunningTime="2026-04-16 22:16:40.61950071 +0000 UTC m=+174.165242137" Apr 16 22:16:40.653108 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:40.653050 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-85f799bc7d-kk5vv" podStartSLOduration=1.471042972 podStartE2EDuration="4.653031936s" podCreationTimestamp="2026-04-16 22:16:36 +0000 UTC" firstStartedPulling="2026-04-16 22:16:36.916654233 +0000 UTC m=+170.462395642" lastFinishedPulling="2026-04-16 22:16:40.098643187 +0000 UTC m=+173.644384606" 
observedRunningTime="2026-04-16 22:16:40.652088363 +0000 UTC m=+174.197829791" watchObservedRunningTime="2026-04-16 22:16:40.653031936 +0000 UTC m=+174.198773365" Apr 16 22:16:41.571009 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:41.570972 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kf5c5" event={"ID":"d72fa396-b4b8-4dc3-995e-8f06c575edb7","Type":"ContainerStarted","Data":"09e98de0efc8f0ff61fec60b90f10a91fc399d29abb753a7710c180e2ebf5016"} Apr 16 22:16:41.571713 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:41.571693 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kf5c5" Apr 16 22:16:41.575928 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:41.575907 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kf5c5" Apr 16 22:16:41.588123 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:41.588078 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-kf5c5" podStartSLOduration=1.975907867 podStartE2EDuration="3.588065756s" podCreationTimestamp="2026-04-16 22:16:38 +0000 UTC" firstStartedPulling="2026-04-16 22:16:39.360317302 +0000 UTC m=+172.906058714" lastFinishedPulling="2026-04-16 22:16:40.972475183 +0000 UTC m=+174.518216603" observedRunningTime="2026-04-16 22:16:41.587542684 +0000 UTC m=+175.133284112" watchObservedRunningTime="2026-04-16 22:16:41.588065756 +0000 UTC m=+175.133807183" Apr 16 22:16:45.761426 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:45.761374 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" Apr 16 22:16:46.578168 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:46.578139 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-85f799bc7d-kk5vv" Apr 16 22:16:50.777890 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:50.777830 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" podUID="d1c8a303-1d3d-4a9d-b1b2-9f6166516cad" containerName="registry" containerID="cri-o://db90dc69139879531a23599244a2fb1d751429caed06ae2623333b9aeb5e7b2e" gracePeriod=30 Apr 16 22:16:51.014131 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:51.014108 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" Apr 16 22:16:51.111189 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:51.111106 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-image-registry-private-configuration\") pod \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\" (UID: \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\") " Apr 16 22:16:51.111189 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:51.111146 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-registry-tls\") pod \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\" (UID: \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\") " Apr 16 22:16:51.111189 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:51.111185 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shwpr\" (UniqueName: \"kubernetes.io/projected/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-kube-api-access-shwpr\") pod \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\" (UID: \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\") " Apr 16 22:16:51.111475 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:51.111210 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-trusted-ca\") pod \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\" (UID: \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\") " Apr 16 22:16:51.111475 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:51.111239 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-registry-certificates\") pod \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\" (UID: \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\") " Apr 16 22:16:51.111475 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:51.111282 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-installation-pull-secrets\") pod \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\" (UID: \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\") " Apr 16 22:16:51.111475 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:51.111341 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-bound-sa-token\") pod \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\" (UID: \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\") " Apr 16 22:16:51.111475 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:51.111369 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-ca-trust-extracted\") pod \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\" (UID: \"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad\") " Apr 16 22:16:51.111796 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:51.111765 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "d1c8a303-1d3d-4a9d-b1b2-9f6166516cad" (UID: "d1c8a303-1d3d-4a9d-b1b2-9f6166516cad"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:16:51.112044 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:51.112020 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "d1c8a303-1d3d-4a9d-b1b2-9f6166516cad" (UID: "d1c8a303-1d3d-4a9d-b1b2-9f6166516cad"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:16:51.113705 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:51.113678 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "d1c8a303-1d3d-4a9d-b1b2-9f6166516cad" (UID: "d1c8a303-1d3d-4a9d-b1b2-9f6166516cad"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:16:51.113847 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:51.113819 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "d1c8a303-1d3d-4a9d-b1b2-9f6166516cad" (UID: "d1c8a303-1d3d-4a9d-b1b2-9f6166516cad"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:16:51.113939 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:51.113923 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-kube-api-access-shwpr" (OuterVolumeSpecName: "kube-api-access-shwpr") pod "d1c8a303-1d3d-4a9d-b1b2-9f6166516cad" (UID: "d1c8a303-1d3d-4a9d-b1b2-9f6166516cad"). InnerVolumeSpecName "kube-api-access-shwpr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:16:51.113992 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:51.113938 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "d1c8a303-1d3d-4a9d-b1b2-9f6166516cad" (UID: "d1c8a303-1d3d-4a9d-b1b2-9f6166516cad"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:16:51.113992 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:51.113954 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "d1c8a303-1d3d-4a9d-b1b2-9f6166516cad" (UID: "d1c8a303-1d3d-4a9d-b1b2-9f6166516cad"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:16:51.119803 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:51.119775 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "d1c8a303-1d3d-4a9d-b1b2-9f6166516cad" (UID: "d1c8a303-1d3d-4a9d-b1b2-9f6166516cad"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:16:51.212066 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:51.212025 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-shwpr\" (UniqueName: \"kubernetes.io/projected/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-kube-api-access-shwpr\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:16:51.212066 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:51.212052 2565 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-trusted-ca\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:16:51.212066 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:51.212065 2565 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-registry-certificates\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:16:51.212283 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:51.212079 2565 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-installation-pull-secrets\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:16:51.212283 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:51.212091 2565 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-bound-sa-token\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:16:51.212283 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:51.212102 2565 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-ca-trust-extracted\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:16:51.212283 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:51.212114 2565 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-image-registry-private-configuration\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:16:51.212283 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:51.212127 2565 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad-registry-tls\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:16:51.601882 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:51.601838 2565 generic.go:358] "Generic (PLEG): container finished" podID="d1c8a303-1d3d-4a9d-b1b2-9f6166516cad" containerID="db90dc69139879531a23599244a2fb1d751429caed06ae2623333b9aeb5e7b2e" exitCode=0 Apr 16 22:16:51.602074 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:51.601928 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" Apr 16 22:16:51.602074 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:51.601927 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" event={"ID":"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad","Type":"ContainerDied","Data":"db90dc69139879531a23599244a2fb1d751429caed06ae2623333b9aeb5e7b2e"} Apr 16 22:16:51.602074 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:51.601975 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7fb78ff8db-pl67h" event={"ID":"d1c8a303-1d3d-4a9d-b1b2-9f6166516cad","Type":"ContainerDied","Data":"cd8b56895985f067c46df81495394127186bb48cd37697cfe0a260961bfec331"} Apr 16 22:16:51.602074 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:51.601995 2565 scope.go:117] "RemoveContainer" containerID="db90dc69139879531a23599244a2fb1d751429caed06ae2623333b9aeb5e7b2e" Apr 16 22:16:51.610229 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:51.610207 2565 scope.go:117] "RemoveContainer" containerID="db90dc69139879531a23599244a2fb1d751429caed06ae2623333b9aeb5e7b2e" Apr 16 22:16:51.610516 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:16:51.610493 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db90dc69139879531a23599244a2fb1d751429caed06ae2623333b9aeb5e7b2e\": container with ID starting with db90dc69139879531a23599244a2fb1d751429caed06ae2623333b9aeb5e7b2e not found: ID does not exist" containerID="db90dc69139879531a23599244a2fb1d751429caed06ae2623333b9aeb5e7b2e" Apr 16 22:16:51.610575 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:51.610523 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db90dc69139879531a23599244a2fb1d751429caed06ae2623333b9aeb5e7b2e"} err="failed to get container status \"db90dc69139879531a23599244a2fb1d751429caed06ae2623333b9aeb5e7b2e\": rpc error: code = NotFound desc = could not find container \"db90dc69139879531a23599244a2fb1d751429caed06ae2623333b9aeb5e7b2e\": container with ID starting with db90dc69139879531a23599244a2fb1d751429caed06ae2623333b9aeb5e7b2e not found: ID does not exist" Apr 16 22:16:51.622511 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:51.622488 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7fb78ff8db-pl67h"] Apr 16 22:16:51.628662 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:51.628642 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7fb78ff8db-pl67h"] Apr 16 22:16:53.018123 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:16:53.018088 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1c8a303-1d3d-4a9d-b1b2-9f6166516cad" path="/var/lib/kubelet/pods/d1c8a303-1d3d-4a9d-b1b2-9f6166516cad/volumes" Apr 16 22:17:33.734585 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:33.734537 2565 generic.go:358] "Generic (PLEG): container finished" podID="5450699d-c1e9-4234-a7b9-c9440f986830" containerID="b24386b4468c56b9790deb998489abd5ddfe2ab9be3985d1262e2fe2dbe6b394" exitCode=0 Apr 16 22:17:33.735112 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:33.734604 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pnp6h" 
event={"ID":"5450699d-c1e9-4234-a7b9-c9440f986830","Type":"ContainerDied","Data":"b24386b4468c56b9790deb998489abd5ddfe2ab9be3985d1262e2fe2dbe6b394"} Apr 16 22:17:33.735112 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:33.734980 2565 scope.go:117] "RemoveContainer" containerID="b24386b4468c56b9790deb998489abd5ddfe2ab9be3985d1262e2fe2dbe6b394" Apr 16 22:17:34.738877 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:34.738843 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pnp6h" event={"ID":"5450699d-c1e9-4234-a7b9-c9440f986830","Type":"ContainerStarted","Data":"85dc9225ef67aae7507d46c9e72de727565194068fb9245bada0adfd479390ac"} Apr 16 22:17:53.748027 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:53.747988 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 22:17:53.748508 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:53.748423 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="4e2b154a-ff32-4fb3-b16d-9f66a5c14404" containerName="alertmanager" containerID="cri-o://c068af563d34cd05792730a1b4156eff49364b6d00e4898117b069a5cbdec034" gracePeriod=120 Apr 16 22:17:53.748562 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:53.748479 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="4e2b154a-ff32-4fb3-b16d-9f66a5c14404" containerName="kube-rbac-proxy-metric" containerID="cri-o://00175334ec34717063f07ba64ff5a4a3573e39db88e65e77809f9ba603320ab9" gracePeriod=120 Apr 16 22:17:53.748562 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:53.748504 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="4e2b154a-ff32-4fb3-b16d-9f66a5c14404" containerName="kube-rbac-proxy-web" containerID="cri-o://ee645b4856087aa7703798fcc90abaa7fa55bf1e04d7d6644451948307485572" gracePeriod=120 Apr 16 22:17:53.748643 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:53.748584 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="4e2b154a-ff32-4fb3-b16d-9f66a5c14404" containerName="kube-rbac-proxy" containerID="cri-o://21c614769eb99ff29c8c4c13001d653717275c57a72e0be5cf532451e90370ec" gracePeriod=120 Apr 16 22:17:53.748643 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:53.748590 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="4e2b154a-ff32-4fb3-b16d-9f66a5c14404" containerName="prom-label-proxy" containerID="cri-o://65952c16b2cdaf4b1b141f714278db677db295bc49ebd141fcce4ef8c9e19edd" gracePeriod=120 Apr 16 22:17:53.748643 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:53.748568 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="4e2b154a-ff32-4fb3-b16d-9f66a5c14404" containerName="config-reloader" containerID="cri-o://e9c1e48554932d71a5811ef520be722d4e4bc2bc19cf68fff1e44010aec3ded7" gracePeriod=120 Apr 16 22:17:54.804537 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:54.804503 2565 generic.go:358] "Generic (PLEG): container finished" podID="4e2b154a-ff32-4fb3-b16d-9f66a5c14404" containerID="65952c16b2cdaf4b1b141f714278db677db295bc49ebd141fcce4ef8c9e19edd" exitCode=0 Apr 16 22:17:54.804537 ip-10-0-141-169 kubenswrapper[2565]: I0416 
22:17:54.804535 2565 generic.go:358] "Generic (PLEG): container finished" podID="4e2b154a-ff32-4fb3-b16d-9f66a5c14404" containerID="21c614769eb99ff29c8c4c13001d653717275c57a72e0be5cf532451e90370ec" exitCode=0 Apr 16 22:17:54.804883 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:54.804546 2565 generic.go:358] "Generic (PLEG): container finished" podID="4e2b154a-ff32-4fb3-b16d-9f66a5c14404" containerID="e9c1e48554932d71a5811ef520be722d4e4bc2bc19cf68fff1e44010aec3ded7" exitCode=0 Apr 16 22:17:54.804883 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:54.804554 2565 generic.go:358] "Generic (PLEG): container finished" podID="4e2b154a-ff32-4fb3-b16d-9f66a5c14404" containerID="c068af563d34cd05792730a1b4156eff49364b6d00e4898117b069a5cbdec034" exitCode=0 Apr 16 22:17:54.804883 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:54.804576 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4e2b154a-ff32-4fb3-b16d-9f66a5c14404","Type":"ContainerDied","Data":"65952c16b2cdaf4b1b141f714278db677db295bc49ebd141fcce4ef8c9e19edd"} Apr 16 22:17:54.804883 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:54.804614 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4e2b154a-ff32-4fb3-b16d-9f66a5c14404","Type":"ContainerDied","Data":"21c614769eb99ff29c8c4c13001d653717275c57a72e0be5cf532451e90370ec"} Apr 16 22:17:54.804883 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:54.804625 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4e2b154a-ff32-4fb3-b16d-9f66a5c14404","Type":"ContainerDied","Data":"e9c1e48554932d71a5811ef520be722d4e4bc2bc19cf68fff1e44010aec3ded7"} Apr 16 22:17:54.804883 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:54.804634 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4e2b154a-ff32-4fb3-b16d-9f66a5c14404","Type":"ContainerDied","Data":"c068af563d34cd05792730a1b4156eff49364b6d00e4898117b069a5cbdec034"} Apr 16 22:17:54.987111 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:54.987088 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:55.060647 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.060565 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-secret-alertmanager-kube-rbac-proxy-web\") pod \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " Apr 16 22:17:55.060647 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.060614 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-tls-assets\") pod \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " Apr 16 22:17:55.060861 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.060658 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-alertmanager-trusted-ca-bundle\") pod \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " Apr 16 22:17:55.060861 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.060687 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-metrics-client-ca\") pod \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " Apr 16 22:17:55.060861 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.060750 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-secret-alertmanager-kube-rbac-proxy-metric\") pod \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " Apr 16 22:17:55.060861 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.060778 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-cluster-tls-config\") pod \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " Apr 16 22:17:55.060861 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.060804 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vk7gl\" (UniqueName: \"kubernetes.io/projected/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-kube-api-access-vk7gl\") pod \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " Apr 16 22:17:55.060861 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.060839 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-config-out\") pod \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " Apr 16 22:17:55.061145 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.060913 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-secret-alertmanager-main-tls\") pod \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\" (UID: 
\"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " Apr 16 22:17:55.061145 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.060942 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-config-volume\") pod \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " Apr 16 22:17:55.061145 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.060970 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-secret-alertmanager-kube-rbac-proxy\") pod \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " Apr 16 22:17:55.061145 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.061002 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-alertmanager-main-db\") pod \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " Apr 16 22:17:55.061145 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.061038 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-web-config\") pod \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\" (UID: \"4e2b154a-ff32-4fb3-b16d-9f66a5c14404\") " Apr 16 22:17:55.061145 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.061111 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "4e2b154a-ff32-4fb3-b16d-9f66a5c14404" (UID: "4e2b154a-ff32-4fb3-b16d-9f66a5c14404"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:55.061475 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.061141 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "4e2b154a-ff32-4fb3-b16d-9f66a5c14404" (UID: "4e2b154a-ff32-4fb3-b16d-9f66a5c14404"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:55.061475 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.061351 2565 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:17:55.061475 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.061376 2565 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-metrics-client-ca\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:17:55.062226 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.061958 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "4e2b154a-ff32-4fb3-b16d-9f66a5c14404" (UID: "4e2b154a-ff32-4fb3-b16d-9f66a5c14404"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:17:55.063763 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.063724 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "4e2b154a-ff32-4fb3-b16d-9f66a5c14404" (UID: "4e2b154a-ff32-4fb3-b16d-9f66a5c14404"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:55.063763 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.063745 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "4e2b154a-ff32-4fb3-b16d-9f66a5c14404" (UID: "4e2b154a-ff32-4fb3-b16d-9f66a5c14404"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:17:55.063931 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.063762 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "4e2b154a-ff32-4fb3-b16d-9f66a5c14404" (UID: "4e2b154a-ff32-4fb3-b16d-9f66a5c14404"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:55.064289 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.064263 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-config-out" (OuterVolumeSpecName: "config-out") pod "4e2b154a-ff32-4fb3-b16d-9f66a5c14404" (UID: "4e2b154a-ff32-4fb3-b16d-9f66a5c14404"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:17:55.064515 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.064491 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "4e2b154a-ff32-4fb3-b16d-9f66a5c14404" (UID: "4e2b154a-ff32-4fb3-b16d-9f66a5c14404"). 
InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:55.064872 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.064846 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-config-volume" (OuterVolumeSpecName: "config-volume") pod "4e2b154a-ff32-4fb3-b16d-9f66a5c14404" (UID: "4e2b154a-ff32-4fb3-b16d-9f66a5c14404"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:55.065238 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.065212 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-kube-api-access-vk7gl" (OuterVolumeSpecName: "kube-api-access-vk7gl") pod "4e2b154a-ff32-4fb3-b16d-9f66a5c14404" (UID: "4e2b154a-ff32-4fb3-b16d-9f66a5c14404"). InnerVolumeSpecName "kube-api-access-vk7gl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:17:55.065315 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.065260 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "4e2b154a-ff32-4fb3-b16d-9f66a5c14404" (UID: "4e2b154a-ff32-4fb3-b16d-9f66a5c14404"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:55.068008 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.067913 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "4e2b154a-ff32-4fb3-b16d-9f66a5c14404" (UID: "4e2b154a-ff32-4fb3-b16d-9f66a5c14404"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:55.074401 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.074378 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-web-config" (OuterVolumeSpecName: "web-config") pod "4e2b154a-ff32-4fb3-b16d-9f66a5c14404" (UID: "4e2b154a-ff32-4fb3-b16d-9f66a5c14404"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:55.162680 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.162642 2565 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:17:55.162680 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.162670 2565 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-cluster-tls-config\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:17:55.162680 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.162683 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vk7gl\" (UniqueName: \"kubernetes.io/projected/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-kube-api-access-vk7gl\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:17:55.162891 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.162693 2565 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-config-out\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:17:55.162891 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.162702 2565 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-secret-alertmanager-main-tls\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:17:55.162891 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.162713 2565 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-config-volume\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:17:55.162891 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.162722 2565 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:17:55.162891 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.162731 2565 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-alertmanager-main-db\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:17:55.162891 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.162740 2565 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-web-config\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:17:55.162891 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.162748 2565 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:17:55.162891 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.162756 2565 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/4e2b154a-ff32-4fb3-b16d-9f66a5c14404-tls-assets\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:17:55.809791 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.809761 2565 generic.go:358] "Generic (PLEG): container finished" podID="4e2b154a-ff32-4fb3-b16d-9f66a5c14404" containerID="00175334ec34717063f07ba64ff5a4a3573e39db88e65e77809f9ba603320ab9" exitCode=0 Apr 16 22:17:55.809791 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.809787 2565 generic.go:358] "Generic (PLEG): container finished" podID="4e2b154a-ff32-4fb3-b16d-9f66a5c14404" containerID="ee645b4856087aa7703798fcc90abaa7fa55bf1e04d7d6644451948307485572" exitCode=0 Apr 16 22:17:55.810194 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.809834 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4e2b154a-ff32-4fb3-b16d-9f66a5c14404","Type":"ContainerDied","Data":"00175334ec34717063f07ba64ff5a4a3573e39db88e65e77809f9ba603320ab9"} Apr 16 22:17:55.810194 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.809861 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4e2b154a-ff32-4fb3-b16d-9f66a5c14404","Type":"ContainerDied","Data":"ee645b4856087aa7703798fcc90abaa7fa55bf1e04d7d6644451948307485572"} Apr 16 22:17:55.810194 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.809874 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4e2b154a-ff32-4fb3-b16d-9f66a5c14404","Type":"ContainerDied","Data":"722c1ae03e127877b5b343e0f85ae25dcaa965a6e4d0ab3cf91cca8b8505e000"} Apr 16 22:17:55.810194 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.809874 2565 scope.go:117] "RemoveContainer" containerID="65952c16b2cdaf4b1b141f714278db677db295bc49ebd141fcce4ef8c9e19edd" Apr 16 22:17:55.810194 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.809861 2565 util.go:48] "No ready sandbox for pod can be found. 
Apr 16 22:17:55.820492 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.819270 2565 scope.go:117] "RemoveContainer" containerID="00175334ec34717063f07ba64ff5a4a3573e39db88e65e77809f9ba603320ab9"
Apr 16 22:17:55.826554 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.826531 2565 scope.go:117] "RemoveContainer" containerID="21c614769eb99ff29c8c4c13001d653717275c57a72e0be5cf532451e90370ec"
Apr 16 22:17:55.832997 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.832979 2565 scope.go:117] "RemoveContainer" containerID="ee645b4856087aa7703798fcc90abaa7fa55bf1e04d7d6644451948307485572"
Apr 16 22:17:55.837685 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.837663 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 22:17:55.840306 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.840286 2565 scope.go:117] "RemoveContainer" containerID="e9c1e48554932d71a5811ef520be722d4e4bc2bc19cf68fff1e44010aec3ded7"
Apr 16 22:17:55.841894 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.841874 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 22:17:55.846682 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.846662 2565 scope.go:117] "RemoveContainer" containerID="c068af563d34cd05792730a1b4156eff49364b6d00e4898117b069a5cbdec034"
Apr 16 22:17:55.852895 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.852878 2565 scope.go:117] "RemoveContainer" containerID="051b7cfaceb1c174d34489bc7d4aaa87661a539b0f6d9e85c1d724744fcf3848"
Apr 16 22:17:55.859090 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.859069 2565 scope.go:117] "RemoveContainer" containerID="65952c16b2cdaf4b1b141f714278db677db295bc49ebd141fcce4ef8c9e19edd"
Apr 16 22:17:55.859349 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:17:55.859330 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65952c16b2cdaf4b1b141f714278db677db295bc49ebd141fcce4ef8c9e19edd\": container with ID starting with 65952c16b2cdaf4b1b141f714278db677db295bc49ebd141fcce4ef8c9e19edd not found: ID does not exist" containerID="65952c16b2cdaf4b1b141f714278db677db295bc49ebd141fcce4ef8c9e19edd"
Apr 16 22:17:55.859398 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.859364 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65952c16b2cdaf4b1b141f714278db677db295bc49ebd141fcce4ef8c9e19edd"} err="failed to get container status \"65952c16b2cdaf4b1b141f714278db677db295bc49ebd141fcce4ef8c9e19edd\": rpc error: code = NotFound desc = could not find container \"65952c16b2cdaf4b1b141f714278db677db295bc49ebd141fcce4ef8c9e19edd\": container with ID starting with 65952c16b2cdaf4b1b141f714278db677db295bc49ebd141fcce4ef8c9e19edd not found: ID does not exist"
Apr 16 22:17:55.859398 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.859383 2565 scope.go:117] "RemoveContainer" containerID="00175334ec34717063f07ba64ff5a4a3573e39db88e65e77809f9ba603320ab9"
Apr 16 22:17:55.859657 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:17:55.859637 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00175334ec34717063f07ba64ff5a4a3573e39db88e65e77809f9ba603320ab9\": container with ID starting with 00175334ec34717063f07ba64ff5a4a3573e39db88e65e77809f9ba603320ab9 not found: ID does not exist" containerID="00175334ec34717063f07ba64ff5a4a3573e39db88e65e77809f9ba603320ab9"
Apr 16 22:17:55.859699 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.859664 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00175334ec34717063f07ba64ff5a4a3573e39db88e65e77809f9ba603320ab9"} err="failed to get container status \"00175334ec34717063f07ba64ff5a4a3573e39db88e65e77809f9ba603320ab9\": rpc error: code = NotFound desc = could not find container \"00175334ec34717063f07ba64ff5a4a3573e39db88e65e77809f9ba603320ab9\": container with ID starting with 00175334ec34717063f07ba64ff5a4a3573e39db88e65e77809f9ba603320ab9 not found: ID does not exist"
Apr 16 22:17:55.859699 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.859681 2565 scope.go:117] "RemoveContainer" containerID="21c614769eb99ff29c8c4c13001d653717275c57a72e0be5cf532451e90370ec"
Apr 16 22:17:55.859912 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:17:55.859894 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21c614769eb99ff29c8c4c13001d653717275c57a72e0be5cf532451e90370ec\": container with ID starting with 21c614769eb99ff29c8c4c13001d653717275c57a72e0be5cf532451e90370ec not found: ID does not exist" containerID="21c614769eb99ff29c8c4c13001d653717275c57a72e0be5cf532451e90370ec"
Apr 16 22:17:55.859965 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.859915 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21c614769eb99ff29c8c4c13001d653717275c57a72e0be5cf532451e90370ec"} err="failed to get container status \"21c614769eb99ff29c8c4c13001d653717275c57a72e0be5cf532451e90370ec\": rpc error: code = NotFound desc = could not find container \"21c614769eb99ff29c8c4c13001d653717275c57a72e0be5cf532451e90370ec\": container with ID starting with 21c614769eb99ff29c8c4c13001d653717275c57a72e0be5cf532451e90370ec not found: ID does not exist"
Apr 16 22:17:55.859965 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.859931 2565 scope.go:117] "RemoveContainer" containerID="ee645b4856087aa7703798fcc90abaa7fa55bf1e04d7d6644451948307485572"
Apr 16 22:17:55.860142 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:17:55.860125 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee645b4856087aa7703798fcc90abaa7fa55bf1e04d7d6644451948307485572\": container with ID starting with ee645b4856087aa7703798fcc90abaa7fa55bf1e04d7d6644451948307485572 not found: ID does not exist" containerID="ee645b4856087aa7703798fcc90abaa7fa55bf1e04d7d6644451948307485572"
Apr 16 22:17:55.860181 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.860148 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee645b4856087aa7703798fcc90abaa7fa55bf1e04d7d6644451948307485572"} err="failed to get container status \"ee645b4856087aa7703798fcc90abaa7fa55bf1e04d7d6644451948307485572\": rpc error: code = NotFound desc = could not find container \"ee645b4856087aa7703798fcc90abaa7fa55bf1e04d7d6644451948307485572\": container with ID starting with ee645b4856087aa7703798fcc90abaa7fa55bf1e04d7d6644451948307485572 not found: ID does not exist"
Apr 16 22:17:55.860181 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.860163 2565 scope.go:117] "RemoveContainer" containerID="e9c1e48554932d71a5811ef520be722d4e4bc2bc19cf68fff1e44010aec3ded7"
Apr 16 22:17:55.860351 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:17:55.860336 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9c1e48554932d71a5811ef520be722d4e4bc2bc19cf68fff1e44010aec3ded7\": container with ID starting with e9c1e48554932d71a5811ef520be722d4e4bc2bc19cf68fff1e44010aec3ded7 not found: ID does not exist" containerID="e9c1e48554932d71a5811ef520be722d4e4bc2bc19cf68fff1e44010aec3ded7"
Apr 16 22:17:55.860388 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.860353 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9c1e48554932d71a5811ef520be722d4e4bc2bc19cf68fff1e44010aec3ded7"} err="failed to get container status \"e9c1e48554932d71a5811ef520be722d4e4bc2bc19cf68fff1e44010aec3ded7\": rpc error: code = NotFound desc = could not find container \"e9c1e48554932d71a5811ef520be722d4e4bc2bc19cf68fff1e44010aec3ded7\": container with ID starting with e9c1e48554932d71a5811ef520be722d4e4bc2bc19cf68fff1e44010aec3ded7 not found: ID does not exist"
Apr 16 22:17:55.860388 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.860364 2565 scope.go:117] "RemoveContainer" containerID="c068af563d34cd05792730a1b4156eff49364b6d00e4898117b069a5cbdec034"
Apr 16 22:17:55.860599 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:17:55.860585 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c068af563d34cd05792730a1b4156eff49364b6d00e4898117b069a5cbdec034\": container with ID starting with c068af563d34cd05792730a1b4156eff49364b6d00e4898117b069a5cbdec034 not found: ID does not exist" containerID="c068af563d34cd05792730a1b4156eff49364b6d00e4898117b069a5cbdec034"
Apr 16 22:17:55.860639 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.860602 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c068af563d34cd05792730a1b4156eff49364b6d00e4898117b069a5cbdec034"} err="failed to get container status \"c068af563d34cd05792730a1b4156eff49364b6d00e4898117b069a5cbdec034\": rpc error: code = NotFound desc = could not find container \"c068af563d34cd05792730a1b4156eff49364b6d00e4898117b069a5cbdec034\": container with ID starting with c068af563d34cd05792730a1b4156eff49364b6d00e4898117b069a5cbdec034 not found: ID does not exist"
Apr 16 22:17:55.860639 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.860614 2565 scope.go:117] "RemoveContainer" containerID="051b7cfaceb1c174d34489bc7d4aaa87661a539b0f6d9e85c1d724744fcf3848"
Apr 16 22:17:55.860814 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:17:55.860799 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"051b7cfaceb1c174d34489bc7d4aaa87661a539b0f6d9e85c1d724744fcf3848\": container with ID starting with 051b7cfaceb1c174d34489bc7d4aaa87661a539b0f6d9e85c1d724744fcf3848 not found: ID does not exist" containerID="051b7cfaceb1c174d34489bc7d4aaa87661a539b0f6d9e85c1d724744fcf3848"
Apr 16 22:17:55.860850 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.860817 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"051b7cfaceb1c174d34489bc7d4aaa87661a539b0f6d9e85c1d724744fcf3848"} err="failed to get container status \"051b7cfaceb1c174d34489bc7d4aaa87661a539b0f6d9e85c1d724744fcf3848\": rpc error: code = NotFound desc = could not find container \"051b7cfaceb1c174d34489bc7d4aaa87661a539b0f6d9e85c1d724744fcf3848\": container with ID starting with 051b7cfaceb1c174d34489bc7d4aaa87661a539b0f6d9e85c1d724744fcf3848 not found: ID does not exist"
Apr 16 22:17:55.860850 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.860828 2565 scope.go:117] "RemoveContainer" containerID="65952c16b2cdaf4b1b141f714278db677db295bc49ebd141fcce4ef8c9e19edd"
Apr 16 22:17:55.860996 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.860982 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65952c16b2cdaf4b1b141f714278db677db295bc49ebd141fcce4ef8c9e19edd"} err="failed to get container status \"65952c16b2cdaf4b1b141f714278db677db295bc49ebd141fcce4ef8c9e19edd\": rpc error: code = NotFound desc = could not find container \"65952c16b2cdaf4b1b141f714278db677db295bc49ebd141fcce4ef8c9e19edd\": container with ID starting with 65952c16b2cdaf4b1b141f714278db677db295bc49ebd141fcce4ef8c9e19edd not found: ID does not exist"
Apr 16 22:17:55.861040 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.860996 2565 scope.go:117] "RemoveContainer" containerID="00175334ec34717063f07ba64ff5a4a3573e39db88e65e77809f9ba603320ab9"
Apr 16 22:17:55.861173 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.861159 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00175334ec34717063f07ba64ff5a4a3573e39db88e65e77809f9ba603320ab9"} err="failed to get container status \"00175334ec34717063f07ba64ff5a4a3573e39db88e65e77809f9ba603320ab9\": rpc error: code = NotFound desc = could not find container \"00175334ec34717063f07ba64ff5a4a3573e39db88e65e77809f9ba603320ab9\": container with ID starting with 00175334ec34717063f07ba64ff5a4a3573e39db88e65e77809f9ba603320ab9 not found: ID does not exist"
Apr 16 22:17:55.861220 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.861173 2565 scope.go:117] "RemoveContainer" containerID="21c614769eb99ff29c8c4c13001d653717275c57a72e0be5cf532451e90370ec"
Apr 16 22:17:55.861381 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.861362 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21c614769eb99ff29c8c4c13001d653717275c57a72e0be5cf532451e90370ec"} err="failed to get container status \"21c614769eb99ff29c8c4c13001d653717275c57a72e0be5cf532451e90370ec\": rpc error: code = NotFound desc = could not find container \"21c614769eb99ff29c8c4c13001d653717275c57a72e0be5cf532451e90370ec\": container with ID starting with 21c614769eb99ff29c8c4c13001d653717275c57a72e0be5cf532451e90370ec not found: ID does not exist"
Apr 16 22:17:55.861446 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.861382 2565 scope.go:117] "RemoveContainer" containerID="ee645b4856087aa7703798fcc90abaa7fa55bf1e04d7d6644451948307485572"
Apr 16 22:17:55.861634 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.861616 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee645b4856087aa7703798fcc90abaa7fa55bf1e04d7d6644451948307485572"} err="failed to get container status \"ee645b4856087aa7703798fcc90abaa7fa55bf1e04d7d6644451948307485572\": rpc error: code = NotFound desc = could not find container \"ee645b4856087aa7703798fcc90abaa7fa55bf1e04d7d6644451948307485572\": container with ID starting with ee645b4856087aa7703798fcc90abaa7fa55bf1e04d7d6644451948307485572 not found: ID does not exist"
Apr 16 22:17:55.861680 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.861634 2565 scope.go:117] "RemoveContainer" containerID="e9c1e48554932d71a5811ef520be722d4e4bc2bc19cf68fff1e44010aec3ded7"
Apr 16 22:17:55.861816 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.861802 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9c1e48554932d71a5811ef520be722d4e4bc2bc19cf68fff1e44010aec3ded7"} err="failed to get container status \"e9c1e48554932d71a5811ef520be722d4e4bc2bc19cf68fff1e44010aec3ded7\": rpc error: code = NotFound desc = could not find container \"e9c1e48554932d71a5811ef520be722d4e4bc2bc19cf68fff1e44010aec3ded7\": container with ID starting with e9c1e48554932d71a5811ef520be722d4e4bc2bc19cf68fff1e44010aec3ded7 not found: ID does not exist"
Apr 16 22:17:55.861853 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.861816 2565 scope.go:117] "RemoveContainer" containerID="c068af563d34cd05792730a1b4156eff49364b6d00e4898117b069a5cbdec034"
Apr 16 22:17:55.862000 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.861983 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c068af563d34cd05792730a1b4156eff49364b6d00e4898117b069a5cbdec034"} err="failed to get container status \"c068af563d34cd05792730a1b4156eff49364b6d00e4898117b069a5cbdec034\": rpc error: code = NotFound desc = could not find container \"c068af563d34cd05792730a1b4156eff49364b6d00e4898117b069a5cbdec034\": container with ID starting with c068af563d34cd05792730a1b4156eff49364b6d00e4898117b069a5cbdec034 not found: ID does not exist"
Apr 16 22:17:55.862039 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.862001 2565 scope.go:117] "RemoveContainer" containerID="051b7cfaceb1c174d34489bc7d4aaa87661a539b0f6d9e85c1d724744fcf3848"
Apr 16 22:17:55.862209 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.862193 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"051b7cfaceb1c174d34489bc7d4aaa87661a539b0f6d9e85c1d724744fcf3848"} err="failed to get container status \"051b7cfaceb1c174d34489bc7d4aaa87661a539b0f6d9e85c1d724744fcf3848\": rpc error: code = NotFound desc = could not find container \"051b7cfaceb1c174d34489bc7d4aaa87661a539b0f6d9e85c1d724744fcf3848\": container with ID starting with 051b7cfaceb1c174d34489bc7d4aaa87661a539b0f6d9e85c1d724744fcf3848 not found: ID does not exist"
Apr 16 22:17:55.866738 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.866717 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 22:17:55.867005 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.866993 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1c8a303-1d3d-4a9d-b1b2-9f6166516cad" containerName="registry"
Apr 16 22:17:55.867056 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.867007 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c8a303-1d3d-4a9d-b1b2-9f6166516cad" containerName="registry"
Apr 16 22:17:55.867056 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.867021 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e2b154a-ff32-4fb3-b16d-9f66a5c14404" containerName="init-config-reloader"
Apr 16 22:17:55.867056 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.867026 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e2b154a-ff32-4fb3-b16d-9f66a5c14404" containerName="init-config-reloader"
Apr 16 22:17:55.867056 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.867033 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e2b154a-ff32-4fb3-b16d-9f66a5c14404" containerName="kube-rbac-proxy-metric"
Apr 16 22:17:55.867056 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.867039 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e2b154a-ff32-4fb3-b16d-9f66a5c14404" containerName="kube-rbac-proxy-metric"
Apr 16 22:17:55.867056 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.867048 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e2b154a-ff32-4fb3-b16d-9f66a5c14404" containerName="alertmanager"
Apr 16 22:17:55.867056 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.867053 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e2b154a-ff32-4fb3-b16d-9f66a5c14404" containerName="alertmanager"
Apr 16 22:17:55.867305 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.867062 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e2b154a-ff32-4fb3-b16d-9f66a5c14404" containerName="kube-rbac-proxy-web"
Apr 16 22:17:55.867305 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.867068 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e2b154a-ff32-4fb3-b16d-9f66a5c14404" containerName="kube-rbac-proxy-web"
Apr 16 22:17:55.867305 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.867073 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e2b154a-ff32-4fb3-b16d-9f66a5c14404" containerName="kube-rbac-proxy"
Apr 16 22:17:55.867305 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.867078 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e2b154a-ff32-4fb3-b16d-9f66a5c14404" containerName="kube-rbac-proxy"
Apr 16 22:17:55.867305 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.867084 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e2b154a-ff32-4fb3-b16d-9f66a5c14404" containerName="config-reloader"
Apr 16 22:17:55.867305 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.867090 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e2b154a-ff32-4fb3-b16d-9f66a5c14404" containerName="config-reloader"
Apr 16 22:17:55.867305 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.867096 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e2b154a-ff32-4fb3-b16d-9f66a5c14404" containerName="prom-label-proxy"
Apr 16 22:17:55.867305 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.867100 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e2b154a-ff32-4fb3-b16d-9f66a5c14404" containerName="prom-label-proxy"
Apr 16 22:17:55.867305 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.867145 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e2b154a-ff32-4fb3-b16d-9f66a5c14404" containerName="prom-label-proxy"
Apr 16 22:17:55.867305 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.867154 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e2b154a-ff32-4fb3-b16d-9f66a5c14404" containerName="alertmanager"
Apr 16 22:17:55.867305 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.867160 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1c8a303-1d3d-4a9d-b1b2-9f6166516cad" containerName="registry"
Apr 16 22:17:55.867305 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.867166 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e2b154a-ff32-4fb3-b16d-9f66a5c14404" containerName="config-reloader"
Apr 16 22:17:55.867305 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.867172 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e2b154a-ff32-4fb3-b16d-9f66a5c14404" containerName="kube-rbac-proxy-web"
containerName="kube-rbac-proxy-web" Apr 16 22:17:55.867305 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.867178 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e2b154a-ff32-4fb3-b16d-9f66a5c14404" containerName="kube-rbac-proxy" Apr 16 22:17:55.867305 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.867184 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e2b154a-ff32-4fb3-b16d-9f66a5c14404" containerName="kube-rbac-proxy-metric" Apr 16 22:17:55.872699 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.872681 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:55.875460 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.875439 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 22:17:55.875576 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.875558 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 22:17:55.875656 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.875596 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-89h67\"" Apr 16 22:17:55.875713 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.875686 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 22:17:55.875764 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.875715 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 22:17:55.875818 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.875800 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 22:17:55.875871 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.875840 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 22:17:55.875871 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.875855 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 22:17:55.876064 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.876046 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 22:17:55.880470 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.880448 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 22:17:55.882288 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.882240 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 22:17:55.968936 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.968900 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-config-out\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:55.968936 ip-10-0-141-169 
kubenswrapper[2565]: I0416 22:17:55.968938 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-web-config\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:55.969174 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.968955 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:55.969174 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.968992 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:55.969174 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.969035 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-config-volume\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:55.969174 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.969061 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:55.969174 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.969115 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:55.969174 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.969159 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:55.969493 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.969197 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:55.969493 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.969227 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:55.969493 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.969252 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:55.969493 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.969288 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr84f\" (UniqueName: \"kubernetes.io/projected/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-kube-api-access-dr84f\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:55.969493 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:55.969329 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:56.070198 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:56.070110 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:56.070198 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:56.070150 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-config-volume\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:56.070198 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:56.070169 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:56.070198 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:56.070190 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:56.070565 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:56.070323 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:56.070565 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:56.070447 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:56.070565 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:56.070478 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:56.070565 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:56.070509 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:56.070565 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:56.070537 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dr84f\" (UniqueName: \"kubernetes.io/projected/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-kube-api-access-dr84f\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:56.070796 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:56.070602 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:56.070796 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:56.070699 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-config-out\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:56.070796 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:56.070733 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-web-config\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:56.070796 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:56.070757 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:56.070796 
Apr 16 22:17:56.070796 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:56.070762 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:17:56.071380 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:56.071358 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:17:56.072141 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:56.071844 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:17:56.073322 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:56.073299 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:17:56.073977 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:56.073852 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:17:56.073977 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:56.073898 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:17:56.074245 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:56.074207 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-config-volume\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:17:56.074245 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:56.074217 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:17:56.074335 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:56.074278 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:17:56.074335 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:56.074316 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:17:56.074398 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:56.074357 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-config-out\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:17:56.075562 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:56.075547 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-web-config\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:17:56.080096 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:56.080071 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr84f\" (UniqueName: \"kubernetes.io/projected/5563b85d-55cd-4297-b5e3-6cdfeaed90f0-kube-api-access-dr84f\") pod \"alertmanager-main-0\" (UID: \"5563b85d-55cd-4297-b5e3-6cdfeaed90f0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:17:56.185477 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:56.185443 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:17:56.316975 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:56.316939 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 22:17:56.319985 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:17:56.319954 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5563b85d_55cd_4297_b5e3_6cdfeaed90f0.slice/crio-c266d8913753ae68417fdff94224052bdd08904f5bb5bb750780e08ae97b9185 WatchSource:0}: Error finding container c266d8913753ae68417fdff94224052bdd08904f5bb5bb750780e08ae97b9185: Status 404 returned error can't find the container with id c266d8913753ae68417fdff94224052bdd08904f5bb5bb750780e08ae97b9185
Apr 16 22:17:56.814303 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:56.814272 2565 generic.go:358] "Generic (PLEG): container finished" podID="5563b85d-55cd-4297-b5e3-6cdfeaed90f0" containerID="f84831ff40354fb4a0ab7791b2e833da091df96a9905a607d7d3cf44545ca106" exitCode=0
Apr 16 22:17:56.814752 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:56.814370 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5563b85d-55cd-4297-b5e3-6cdfeaed90f0","Type":"ContainerDied","Data":"f84831ff40354fb4a0ab7791b2e833da091df96a9905a607d7d3cf44545ca106"}
Apr 16 22:17:56.814752 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:56.814433 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5563b85d-55cd-4297-b5e3-6cdfeaed90f0","Type":"ContainerStarted","Data":"c266d8913753ae68417fdff94224052bdd08904f5bb5bb750780e08ae97b9185"}
Apr 16 22:17:57.019788 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.019636 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e2b154a-ff32-4fb3-b16d-9f66a5c14404" path="/var/lib/kubelet/pods/4e2b154a-ff32-4fb3-b16d-9f66a5c14404/volumes"
Apr 16 22:17:57.773539 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.773499 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-794b697c49-nqs59"]
Apr 16 22:17:57.777318 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.777291 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-794b697c49-nqs59"
Apr 16 22:17:57.779819 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.779765 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 16 22:17:57.779819 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.779795 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-df48k\""
Apr 16 22:17:57.780012 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.779795 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 16 22:17:57.780012 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.779848 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 16 22:17:57.780012 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.779799 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 16 22:17:57.780012 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.779954 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 16 22:17:57.787143 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.787119 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 16 22:17:57.791404 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.791381 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-794b697c49-nqs59"]
Apr 16 22:17:57.823217 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.822455 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5563b85d-55cd-4297-b5e3-6cdfeaed90f0","Type":"ContainerStarted","Data":"3fead92de05c5344e82469257f4627488a7e2b96a86f5f48037b6b33ac0fea75"}
Apr 16 22:17:57.823682 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.823256 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5563b85d-55cd-4297-b5e3-6cdfeaed90f0","Type":"ContainerStarted","Data":"aae3101d3765eee799e8f851d9411833a4959f9bb60fd643427fbb241850299d"}
Apr 16 22:17:57.823682 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.823276 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5563b85d-55cd-4297-b5e3-6cdfeaed90f0","Type":"ContainerStarted","Data":"d045965099ac3d60b9b00c6463382285540f2bee867ddb6f5a3fbba4bad9eaa4"}
Apr 16 22:17:57.823682 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.823287 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5563b85d-55cd-4297-b5e3-6cdfeaed90f0","Type":"ContainerStarted","Data":"eaba61916d8ab4f59917d10569e3cba2a124e37cba1594aac3c762ce22e59f13"}
Apr 16 22:17:57.823682 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.823299 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5563b85d-55cd-4297-b5e3-6cdfeaed90f0","Type":"ContainerStarted","Data":"4e5f68bd6b3ddfc50929383218354a8e1832acce1e15178bc932fe648030706b"}
ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.823314 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5563b85d-55cd-4297-b5e3-6cdfeaed90f0","Type":"ContainerStarted","Data":"b92ce623d8ce2a4d586d33ac8eff4e71ac6b85e8d695d0f20daa5d485cd3cc0b"} Apr 16 22:17:57.850303 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.850249 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.850234109 podStartE2EDuration="2.850234109s" podCreationTimestamp="2026-04-16 22:17:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:17:57.849340835 +0000 UTC m=+251.395082262" watchObservedRunningTime="2026-04-16 22:17:57.850234109 +0000 UTC m=+251.395975529" Apr 16 22:17:57.887433 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.887382 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/680e304d-24fc-4033-a171-3051d17c3af2-secret-telemeter-client\") pod \"telemeter-client-794b697c49-nqs59\" (UID: \"680e304d-24fc-4033-a171-3051d17c3af2\") " pod="openshift-monitoring/telemeter-client-794b697c49-nqs59" Apr 16 22:17:57.887603 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.887457 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/680e304d-24fc-4033-a171-3051d17c3af2-metrics-client-ca\") pod \"telemeter-client-794b697c49-nqs59\" (UID: \"680e304d-24fc-4033-a171-3051d17c3af2\") " pod="openshift-monitoring/telemeter-client-794b697c49-nqs59" Apr 16 22:17:57.887683 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.887637 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/680e304d-24fc-4033-a171-3051d17c3af2-federate-client-tls\") pod \"telemeter-client-794b697c49-nqs59\" (UID: \"680e304d-24fc-4033-a171-3051d17c3af2\") " pod="openshift-monitoring/telemeter-client-794b697c49-nqs59" Apr 16 22:17:57.887778 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.887755 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/680e304d-24fc-4033-a171-3051d17c3af2-telemeter-client-tls\") pod \"telemeter-client-794b697c49-nqs59\" (UID: \"680e304d-24fc-4033-a171-3051d17c3af2\") " pod="openshift-monitoring/telemeter-client-794b697c49-nqs59" Apr 16 22:17:57.887869 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.887796 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/680e304d-24fc-4033-a171-3051d17c3af2-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-794b697c49-nqs59\" (UID: \"680e304d-24fc-4033-a171-3051d17c3af2\") " pod="openshift-monitoring/telemeter-client-794b697c49-nqs59" Apr 16 22:17:57.887954 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.887927 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/680e304d-24fc-4033-a171-3051d17c3af2-telemeter-trusted-ca-bundle\") pod \"telemeter-client-794b697c49-nqs59\" (UID: \"680e304d-24fc-4033-a171-3051d17c3af2\") " pod="openshift-monitoring/telemeter-client-794b697c49-nqs59" Apr 16 22:17:57.888021 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.887969 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/680e304d-24fc-4033-a171-3051d17c3af2-serving-certs-ca-bundle\") pod \"telemeter-client-794b697c49-nqs59\" (UID: \"680e304d-24fc-4033-a171-3051d17c3af2\") " pod="openshift-monitoring/telemeter-client-794b697c49-nqs59" Apr 16 22:17:57.888021 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.887998 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwr2m\" (UniqueName: \"kubernetes.io/projected/680e304d-24fc-4033-a171-3051d17c3af2-kube-api-access-cwr2m\") pod \"telemeter-client-794b697c49-nqs59\" (UID: \"680e304d-24fc-4033-a171-3051d17c3af2\") " pod="openshift-monitoring/telemeter-client-794b697c49-nqs59" Apr 16 22:17:57.988825 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.988781 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/680e304d-24fc-4033-a171-3051d17c3af2-secret-telemeter-client\") pod \"telemeter-client-794b697c49-nqs59\" (UID: \"680e304d-24fc-4033-a171-3051d17c3af2\") " pod="openshift-monitoring/telemeter-client-794b697c49-nqs59" Apr 16 22:17:57.988825 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.988826 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/680e304d-24fc-4033-a171-3051d17c3af2-metrics-client-ca\") pod \"telemeter-client-794b697c49-nqs59\" (UID: \"680e304d-24fc-4033-a171-3051d17c3af2\") " pod="openshift-monitoring/telemeter-client-794b697c49-nqs59" Apr 16 22:17:57.989090 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.988858 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/680e304d-24fc-4033-a171-3051d17c3af2-federate-client-tls\") pod \"telemeter-client-794b697c49-nqs59\" (UID: \"680e304d-24fc-4033-a171-3051d17c3af2\") " pod="openshift-monitoring/telemeter-client-794b697c49-nqs59" Apr 16 22:17:57.989090 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.988888 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/680e304d-24fc-4033-a171-3051d17c3af2-telemeter-client-tls\") pod \"telemeter-client-794b697c49-nqs59\" (UID: \"680e304d-24fc-4033-a171-3051d17c3af2\") " pod="openshift-monitoring/telemeter-client-794b697c49-nqs59" Apr 16 22:17:57.989090 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.988905 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/680e304d-24fc-4033-a171-3051d17c3af2-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-794b697c49-nqs59\" (UID: \"680e304d-24fc-4033-a171-3051d17c3af2\") " pod="openshift-monitoring/telemeter-client-794b697c49-nqs59" Apr 16 22:17:57.989090 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.988940 2565 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/680e304d-24fc-4033-a171-3051d17c3af2-telemeter-trusted-ca-bundle\") pod \"telemeter-client-794b697c49-nqs59\" (UID: \"680e304d-24fc-4033-a171-3051d17c3af2\") " pod="openshift-monitoring/telemeter-client-794b697c49-nqs59" Apr 16 22:17:57.989090 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.988967 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/680e304d-24fc-4033-a171-3051d17c3af2-serving-certs-ca-bundle\") pod \"telemeter-client-794b697c49-nqs59\" (UID: \"680e304d-24fc-4033-a171-3051d17c3af2\") " pod="openshift-monitoring/telemeter-client-794b697c49-nqs59" Apr 16 22:17:57.989342 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.989182 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cwr2m\" (UniqueName: \"kubernetes.io/projected/680e304d-24fc-4033-a171-3051d17c3af2-kube-api-access-cwr2m\") pod \"telemeter-client-794b697c49-nqs59\" (UID: \"680e304d-24fc-4033-a171-3051d17c3af2\") " pod="openshift-monitoring/telemeter-client-794b697c49-nqs59" Apr 16 22:17:57.989933 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.989899 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/680e304d-24fc-4033-a171-3051d17c3af2-telemeter-trusted-ca-bundle\") pod \"telemeter-client-794b697c49-nqs59\" (UID: \"680e304d-24fc-4033-a171-3051d17c3af2\") " pod="openshift-monitoring/telemeter-client-794b697c49-nqs59" Apr 16 22:17:57.990130 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.990102 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/680e304d-24fc-4033-a171-3051d17c3af2-serving-certs-ca-bundle\") pod \"telemeter-client-794b697c49-nqs59\" (UID: \"680e304d-24fc-4033-a171-3051d17c3af2\") " pod="openshift-monitoring/telemeter-client-794b697c49-nqs59" Apr 16 22:17:57.990282 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.990267 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/680e304d-24fc-4033-a171-3051d17c3af2-metrics-client-ca\") pod \"telemeter-client-794b697c49-nqs59\" (UID: \"680e304d-24fc-4033-a171-3051d17c3af2\") " pod="openshift-monitoring/telemeter-client-794b697c49-nqs59" Apr 16 22:17:57.991622 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.991593 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/680e304d-24fc-4033-a171-3051d17c3af2-secret-telemeter-client\") pod \"telemeter-client-794b697c49-nqs59\" (UID: \"680e304d-24fc-4033-a171-3051d17c3af2\") " pod="openshift-monitoring/telemeter-client-794b697c49-nqs59" Apr 16 22:17:57.991753 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.991736 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/680e304d-24fc-4033-a171-3051d17c3af2-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-794b697c49-nqs59\" (UID: \"680e304d-24fc-4033-a171-3051d17c3af2\") " pod="openshift-monitoring/telemeter-client-794b697c49-nqs59" Apr 16 22:17:57.991829 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.991811 2565 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/680e304d-24fc-4033-a171-3051d17c3af2-telemeter-client-tls\") pod \"telemeter-client-794b697c49-nqs59\" (UID: \"680e304d-24fc-4033-a171-3051d17c3af2\") " pod="openshift-monitoring/telemeter-client-794b697c49-nqs59" Apr 16 22:17:57.991969 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.991949 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/680e304d-24fc-4033-a171-3051d17c3af2-federate-client-tls\") pod \"telemeter-client-794b697c49-nqs59\" (UID: \"680e304d-24fc-4033-a171-3051d17c3af2\") " pod="openshift-monitoring/telemeter-client-794b697c49-nqs59" Apr 16 22:17:57.997147 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:57.997127 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwr2m\" (UniqueName: \"kubernetes.io/projected/680e304d-24fc-4033-a171-3051d17c3af2-kube-api-access-cwr2m\") pod \"telemeter-client-794b697c49-nqs59\" (UID: \"680e304d-24fc-4033-a171-3051d17c3af2\") " pod="openshift-monitoring/telemeter-client-794b697c49-nqs59" Apr 16 22:17:58.089006 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:58.088912 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-794b697c49-nqs59" Apr 16 22:17:58.237132 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:58.237088 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-794b697c49-nqs59"] Apr 16 22:17:58.239252 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:17:58.239208 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod680e304d_24fc_4033_a171_3051d17c3af2.slice/crio-1967d8b31c343c11c4e58083a8456f31226394fb3526262dc209444fad024695 WatchSource:0}: Error finding container 1967d8b31c343c11c4e58083a8456f31226394fb3526262dc209444fad024695: Status 404 returned error can't find the container with id 1967d8b31c343c11c4e58083a8456f31226394fb3526262dc209444fad024695 Apr 16 22:17:58.827725 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:58.827684 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-794b697c49-nqs59" event={"ID":"680e304d-24fc-4033-a171-3051d17c3af2","Type":"ContainerStarted","Data":"1967d8b31c343c11c4e58083a8456f31226394fb3526262dc209444fad024695"} Apr 16 22:17:58.899232 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:58.899195 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e06e94b1-2063-48f2-b8a7-0d0e4193f064-metrics-certs\") pod \"network-metrics-daemon-qz5vc\" (UID: \"e06e94b1-2063-48f2-b8a7-0d0e4193f064\") " pod="openshift-multus/network-metrics-daemon-qz5vc" Apr 16 22:17:58.901769 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:58.901743 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e06e94b1-2063-48f2-b8a7-0d0e4193f064-metrics-certs\") pod \"network-metrics-daemon-qz5vc\" (UID: \"e06e94b1-2063-48f2-b8a7-0d0e4193f064\") " pod="openshift-multus/network-metrics-daemon-qz5vc" Apr 16 22:17:59.117455 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:59.117355 2565 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zhbqt\"" Apr 16 22:17:59.125853 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:59.125811 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qz5vc" Apr 16 22:17:59.262031 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:59.262002 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qz5vc"] Apr 16 22:17:59.265063 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:17:59.265023 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode06e94b1_2063_48f2_b8a7_0d0e4193f064.slice/crio-c35a342bd14c68e420178d1875d7b8e7b1b4a3bf97c21e604fff2b41c81248b5 WatchSource:0}: Error finding container c35a342bd14c68e420178d1875d7b8e7b1b4a3bf97c21e604fff2b41c81248b5: Status 404 returned error can't find the container with id c35a342bd14c68e420178d1875d7b8e7b1b4a3bf97c21e604fff2b41c81248b5 Apr 16 22:17:59.833480 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:59.833450 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-794b697c49-nqs59" event={"ID":"680e304d-24fc-4033-a171-3051d17c3af2","Type":"ContainerStarted","Data":"4f8908ff5754f64dbaf361e74c4a954b25ce8d6205da02938b1079d0a533d364"} Apr 16 22:17:59.834572 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:17:59.834541 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qz5vc" event={"ID":"e06e94b1-2063-48f2-b8a7-0d0e4193f064","Type":"ContainerStarted","Data":"c35a342bd14c68e420178d1875d7b8e7b1b4a3bf97c21e604fff2b41c81248b5"} Apr 16 22:18:00.839758 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:18:00.839668 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qz5vc" event={"ID":"e06e94b1-2063-48f2-b8a7-0d0e4193f064","Type":"ContainerStarted","Data":"91f50ce8b7c7ce341116f1dba2f164d858a372bc09c083732f5e6438c447dc74"} Apr 16 22:18:00.839758 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:18:00.839708 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qz5vc" event={"ID":"e06e94b1-2063-48f2-b8a7-0d0e4193f064","Type":"ContainerStarted","Data":"8d4fa1503cc8c2d33090da0478d398f620142d9611240e5179daede4b002b36a"} Apr 16 22:18:00.841431 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:18:00.841396 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-794b697c49-nqs59" event={"ID":"680e304d-24fc-4033-a171-3051d17c3af2","Type":"ContainerStarted","Data":"1e35f02fbe4f08ad24e668a1506b2d97ba17f0b35898b3b2b5e64152e45758e0"} Apr 16 22:18:00.841529 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:18:00.841439 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-794b697c49-nqs59" event={"ID":"680e304d-24fc-4033-a171-3051d17c3af2","Type":"ContainerStarted","Data":"8729cfa769d94addecbdfa296969029f5b48e4bc653d91fa829b7626da1c7ac1"} Apr 16 22:18:00.873928 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:18:00.873870 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-qz5vc" podStartSLOduration=252.721633835 podStartE2EDuration="4m13.873854764s" podCreationTimestamp="2026-04-16 22:13:47 +0000 UTC" firstStartedPulling="2026-04-16 22:17:59.267069375 +0000 UTC m=+252.812810787" lastFinishedPulling="2026-04-16 
22:18:00.419290311 +0000 UTC m=+253.965031716" observedRunningTime="2026-04-16 22:18:00.854213094 +0000 UTC m=+254.399954522" watchObservedRunningTime="2026-04-16 22:18:00.873854764 +0000 UTC m=+254.419596224"
Apr 16 22:18:00.874324 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:18:00.874300 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-794b697c49-nqs59" podStartSLOduration=2.363812026 podStartE2EDuration="3.8742911s" podCreationTimestamp="2026-04-16 22:17:57 +0000 UTC" firstStartedPulling="2026-04-16 22:17:58.24117911 +0000 UTC m=+251.786920515" lastFinishedPulling="2026-04-16 22:17:59.751658164 +0000 UTC m=+253.297399589" observedRunningTime="2026-04-16 22:18:00.871986281 +0000 UTC m=+254.417727719" watchObservedRunningTime="2026-04-16 22:18:00.8742911 +0000 UTC m=+254.420032535"
Apr 16 22:18:46.918623 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:18:46.918596 2565 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 22:19:57.140092 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:19:57.140011 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-dtz9c"]
Apr 16 22:19:57.142333 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:19:57.142316 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dtz9c"
Apr 16 22:19:57.144532 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:19:57.144513 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 22:19:57.150560 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:19:57.150539 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dtz9c"]
Apr 16 22:19:57.238105 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:19:57.238068 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d29e3c30-fd45-44b0-9689-dac93ee43db6-original-pull-secret\") pod \"global-pull-secret-syncer-dtz9c\" (UID: \"d29e3c30-fd45-44b0-9689-dac93ee43db6\") " pod="kube-system/global-pull-secret-syncer-dtz9c"
Apr 16 22:19:57.238272 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:19:57.238126 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d29e3c30-fd45-44b0-9689-dac93ee43db6-dbus\") pod \"global-pull-secret-syncer-dtz9c\" (UID: \"d29e3c30-fd45-44b0-9689-dac93ee43db6\") " pod="kube-system/global-pull-secret-syncer-dtz9c"
Apr 16 22:19:57.238272 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:19:57.238151 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d29e3c30-fd45-44b0-9689-dac93ee43db6-kubelet-config\") pod \"global-pull-secret-syncer-dtz9c\" (UID: \"d29e3c30-fd45-44b0-9689-dac93ee43db6\") " pod="kube-system/global-pull-secret-syncer-dtz9c"
Apr 16 22:19:57.339333 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:19:57.339284 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d29e3c30-fd45-44b0-9689-dac93ee43db6-original-pull-secret\") pod \"global-pull-secret-syncer-dtz9c\" (UID: \"d29e3c30-fd45-44b0-9689-dac93ee43db6\") " pod="kube-system/global-pull-secret-syncer-dtz9c"
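
The two "Observed pod startup duration" entries above fit one formula: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that E2E figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling). For network-metrics-daemon-qz5vc, 4m13.873854764s minus the 1.152220936s pull window gives 252.721633828s, agreeing with the logged 252.721633835 to within nanoseconds (the residue plausibly comes from separate monotonic-clock readings). A short Go check using the timestamps exactly as printed; the formula is inferred from these values, not quoted from kubelet source:

```go
// Recompute the startup-latency fields for network-metrics-daemon-qz5vc
// from the timestamps logged above.
package main

import (
	"fmt"
	"time"
)

const layout = "2006-01-02 15:04:05.999999999 -0700 MST" // Go's default time.Time format, as logged

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2026-04-16 22:13:47 +0000 UTC")
	observed := mustParse("2026-04-16 22:18:00.873854764 +0000 UTC") // watchObservedRunningTime
	firstPull := mustParse("2026-04-16 22:17:59.267069375 +0000 UTC")
	lastPull := mustParse("2026-04-16 22:18:00.419290311 +0000 UTC")

	e2e := observed.Sub(created)    // 4m13.873854764s, the logged podStartE2EDuration
	pull := lastPull.Sub(firstPull) // 1.152220936s spent pulling images
	fmt.Println(e2e, pull, (e2e - pull).Seconds()) // ~252.721633828 vs logged 252.721633835
}
```

Apr 16 22:19:57.339512 ip-10-0-141-169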
kubenswrapper[2565]: I0416 22:19:57.339364 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d29e3c30-fd45-44b0-9689-dac93ee43db6-dbus\") pod \"global-pull-secret-syncer-dtz9c\" (UID: \"d29e3c30-fd45-44b0-9689-dac93ee43db6\") " pod="kube-system/global-pull-secret-syncer-dtz9c" Apr 16 22:19:57.339512 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:19:57.339388 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d29e3c30-fd45-44b0-9689-dac93ee43db6-kubelet-config\") pod \"global-pull-secret-syncer-dtz9c\" (UID: \"d29e3c30-fd45-44b0-9689-dac93ee43db6\") " pod="kube-system/global-pull-secret-syncer-dtz9c" Apr 16 22:19:57.339585 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:19:57.339522 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d29e3c30-fd45-44b0-9689-dac93ee43db6-kubelet-config\") pod \"global-pull-secret-syncer-dtz9c\" (UID: \"d29e3c30-fd45-44b0-9689-dac93ee43db6\") " pod="kube-system/global-pull-secret-syncer-dtz9c" Apr 16 22:19:57.339585 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:19:57.339546 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d29e3c30-fd45-44b0-9689-dac93ee43db6-dbus\") pod \"global-pull-secret-syncer-dtz9c\" (UID: \"d29e3c30-fd45-44b0-9689-dac93ee43db6\") " pod="kube-system/global-pull-secret-syncer-dtz9c" Apr 16 22:19:57.341580 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:19:57.341554 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d29e3c30-fd45-44b0-9689-dac93ee43db6-original-pull-secret\") pod \"global-pull-secret-syncer-dtz9c\" (UID: \"d29e3c30-fd45-44b0-9689-dac93ee43db6\") " pod="kube-system/global-pull-secret-syncer-dtz9c" Apr 16 22:19:57.452242 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:19:57.452213 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-dtz9c" Apr 16 22:19:57.572599 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:19:57.572567 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dtz9c"] Apr 16 22:19:57.573190 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:19:57.573166 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd29e3c30_fd45_44b0_9689_dac93ee43db6.slice/crio-9a0a34c4b17ee3888130826cc4ab30b689d8656e10fe4ac8de8713614820a6e9 WatchSource:0}: Error finding container 9a0a34c4b17ee3888130826cc4ab30b689d8656e10fe4ac8de8713614820a6e9: Status 404 returned error can't find the container with id 9a0a34c4b17ee3888130826cc4ab30b689d8656e10fe4ac8de8713614820a6e9 Apr 16 22:19:57.575055 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:19:57.575040 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:19:58.183954 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:19:58.183914 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dtz9c" event={"ID":"d29e3c30-fd45-44b0-9689-dac93ee43db6","Type":"ContainerStarted","Data":"9a0a34c4b17ee3888130826cc4ab30b689d8656e10fe4ac8de8713614820a6e9"} Apr 16 22:20:02.197095 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:02.197057 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dtz9c" event={"ID":"d29e3c30-fd45-44b0-9689-dac93ee43db6","Type":"ContainerStarted","Data":"0b17ac383ed847bc35a5601e02c78664c192ddf51693f8302ada171c5180db22"} Apr 16 22:20:02.215594 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:02.215539 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-dtz9c" podStartSLOduration=1.49485683 podStartE2EDuration="5.215525823s" podCreationTimestamp="2026-04-16 22:19:57 +0000 UTC" firstStartedPulling="2026-04-16 22:19:57.575173829 +0000 UTC m=+371.120915234" lastFinishedPulling="2026-04-16 22:20:01.295842823 +0000 UTC m=+374.841584227" observedRunningTime="2026-04-16 22:20:02.213921032 +0000 UTC m=+375.759662452" watchObservedRunningTime="2026-04-16 22:20:02.215525823 +0000 UTC m=+375.761267242" Apr 16 22:20:39.952856 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:39.952823 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9fp8m"] Apr 16 22:20:39.955291 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:39.955271 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9fp8m" Apr 16 22:20:39.957681 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:39.957653 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 22:20:39.957898 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:39.957874 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 22:20:39.958529 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:39.958511 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-kgx4k\"" Apr 16 22:20:39.958659 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:39.958644 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 22:20:39.969128 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:39.969104 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9fp8m"] Apr 16 22:20:40.096280 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:40.096232 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/6874b8c9-1ca9-4bcd-a06f-4deabf58ac12-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-9fp8m\" (UID: \"6874b8c9-1ca9-4bcd-a06f-4deabf58ac12\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9fp8m" Apr 16 22:20:40.096499 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:40.096371 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q7wn\" (UniqueName: \"kubernetes.io/projected/6874b8c9-1ca9-4bcd-a06f-4deabf58ac12-kube-api-access-8q7wn\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-9fp8m\" (UID: \"6874b8c9-1ca9-4bcd-a06f-4deabf58ac12\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9fp8m" Apr 16 22:20:40.196838 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:40.196792 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8q7wn\" (UniqueName: \"kubernetes.io/projected/6874b8c9-1ca9-4bcd-a06f-4deabf58ac12-kube-api-access-8q7wn\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-9fp8m\" (UID: \"6874b8c9-1ca9-4bcd-a06f-4deabf58ac12\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9fp8m" Apr 16 22:20:40.197001 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:40.196874 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/6874b8c9-1ca9-4bcd-a06f-4deabf58ac12-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-9fp8m\" (UID: \"6874b8c9-1ca9-4bcd-a06f-4deabf58ac12\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9fp8m" Apr 16 22:20:40.199115 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:40.199096 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/6874b8c9-1ca9-4bcd-a06f-4deabf58ac12-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-9fp8m\" (UID: \"6874b8c9-1ca9-4bcd-a06f-4deabf58ac12\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9fp8m" Apr 16 22:20:40.205207 ip-10-0-141-169 kubenswrapper[2565]: I0416 
22:20:40.205145 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q7wn\" (UniqueName: \"kubernetes.io/projected/6874b8c9-1ca9-4bcd-a06f-4deabf58ac12-kube-api-access-8q7wn\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-9fp8m\" (UID: \"6874b8c9-1ca9-4bcd-a06f-4deabf58ac12\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9fp8m" Apr 16 22:20:40.265455 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:40.265427 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9fp8m" Apr 16 22:20:40.388342 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:40.388316 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9fp8m"] Apr 16 22:20:40.392272 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:20:40.392244 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6874b8c9_1ca9_4bcd_a06f_4deabf58ac12.slice/crio-4781736639be98159745b629c0f033a967fdca0372970d79a8e72ea34fec2175 WatchSource:0}: Error finding container 4781736639be98159745b629c0f033a967fdca0372970d79a8e72ea34fec2175: Status 404 returned error can't find the container with id 4781736639be98159745b629c0f033a967fdca0372970d79a8e72ea34fec2175 Apr 16 22:20:41.315078 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:41.315045 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9fp8m" event={"ID":"6874b8c9-1ca9-4bcd-a06f-4deabf58ac12","Type":"ContainerStarted","Data":"4781736639be98159745b629c0f033a967fdca0372970d79a8e72ea34fec2175"} Apr 16 22:20:45.057633 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.057600 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-vtw7z"] Apr 16 22:20:45.060225 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.060204 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-vtw7z" Apr 16 22:20:45.062738 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.062718 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 22:20:45.062857 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.062716 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 16 22:20:45.062857 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.062779 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-2ddzk\"" Apr 16 22:20:45.068613 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.068590 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-vtw7z"] Apr 16 22:20:45.133704 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.133664 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/ee1d91ca-f5ed-43d3-afe5-99538ca06f70-cabundle0\") pod \"keda-operator-ffbb595cb-vtw7z\" (UID: \"ee1d91ca-f5ed-43d3-afe5-99538ca06f70\") " pod="openshift-keda/keda-operator-ffbb595cb-vtw7z" Apr 16 22:20:45.133904 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.133717 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ee1d91ca-f5ed-43d3-afe5-99538ca06f70-certificates\") pod \"keda-operator-ffbb595cb-vtw7z\" (UID: \"ee1d91ca-f5ed-43d3-afe5-99538ca06f70\") " pod="openshift-keda/keda-operator-ffbb595cb-vtw7z" Apr 16 22:20:45.133904 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.133798 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75vmp\" (UniqueName: \"kubernetes.io/projected/ee1d91ca-f5ed-43d3-afe5-99538ca06f70-kube-api-access-75vmp\") pod \"keda-operator-ffbb595cb-vtw7z\" (UID: \"ee1d91ca-f5ed-43d3-afe5-99538ca06f70\") " pod="openshift-keda/keda-operator-ffbb595cb-vtw7z" Apr 16 22:20:45.234337 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.234279 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75vmp\" (UniqueName: \"kubernetes.io/projected/ee1d91ca-f5ed-43d3-afe5-99538ca06f70-kube-api-access-75vmp\") pod \"keda-operator-ffbb595cb-vtw7z\" (UID: \"ee1d91ca-f5ed-43d3-afe5-99538ca06f70\") " pod="openshift-keda/keda-operator-ffbb595cb-vtw7z" Apr 16 22:20:45.234547 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.234347 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/ee1d91ca-f5ed-43d3-afe5-99538ca06f70-cabundle0\") pod \"keda-operator-ffbb595cb-vtw7z\" (UID: \"ee1d91ca-f5ed-43d3-afe5-99538ca06f70\") " pod="openshift-keda/keda-operator-ffbb595cb-vtw7z" Apr 16 22:20:45.234547 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.234387 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ee1d91ca-f5ed-43d3-afe5-99538ca06f70-certificates\") pod \"keda-operator-ffbb595cb-vtw7z\" (UID: \"ee1d91ca-f5ed-43d3-afe5-99538ca06f70\") " pod="openshift-keda/keda-operator-ffbb595cb-vtw7z" Apr 16 22:20:45.234547 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:45.234529 2565 projected.go:264] Couldn't get 
secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found
Apr 16 22:20:45.234547 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:45.234547 2565 secret.go:281] references non-existent secret key: ca.crt
Apr 16 22:20:45.234744 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:45.234557 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 22:20:45.234744 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:45.234573 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-vtw7z: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 16 22:20:45.234744 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:45.234630 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ee1d91ca-f5ed-43d3-afe5-99538ca06f70-certificates podName:ee1d91ca-f5ed-43d3-afe5-99538ca06f70 nodeName:}" failed. No retries permitted until 2026-04-16 22:20:45.734609195 +0000 UTC m=+419.280350604 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ee1d91ca-f5ed-43d3-afe5-99538ca06f70-certificates") pod "keda-operator-ffbb595cb-vtw7z" (UID: "ee1d91ca-f5ed-43d3-afe5-99538ca06f70") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 16 22:20:45.235396 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.235374 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/ee1d91ca-f5ed-43d3-afe5-99538ca06f70-cabundle0\") pod \"keda-operator-ffbb595cb-vtw7z\" (UID: \"ee1d91ca-f5ed-43d3-afe5-99538ca06f70\") " pod="openshift-keda/keda-operator-ffbb595cb-vtw7z"
Apr 16 22:20:45.242929 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.242899 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-75vmp\" (UniqueName: \"kubernetes.io/projected/ee1d91ca-f5ed-43d3-afe5-99538ca06f70-kube-api-access-75vmp\") pod \"keda-operator-ffbb595cb-vtw7z\" (UID: \"ee1d91ca-f5ed-43d3-afe5-99538ca06f70\") " pod="openshift-keda/keda-operator-ffbb595cb-vtw7z"
Apr 16 22:20:45.333924 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.333837 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9fp8m" event={"ID":"6874b8c9-1ca9-4bcd-a06f-4deabf58ac12","Type":"ContainerStarted","Data":"bcfbdc4f047e7d95fa5ded72257fd402c412f5b3de6afe41b6d34fb0506bcf83"}
Apr 16 22:20:45.334156 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.334008 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9fp8m"
Apr 16 22:20:45.355266 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.355205 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9fp8m" podStartSLOduration=2.227129888 podStartE2EDuration="6.355188422s" podCreationTimestamp="2026-04-16 22:20:39 +0000 UTC" firstStartedPulling="2026-04-16 22:20:40.394156606 +0000 UTC m=+413.939898011" lastFinishedPulling="2026-04-16 22:20:44.522215139 +0000 UTC m=+418.067956545" observedRunningTime="2026-04-16 22:20:45.354683455 +0000 UTC m=+418.900424885" watchObservedRunningTime="2026-04-16 22:20:45.355188422 +0000 UTC m=+418.900929849"
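
The E-line cascade above is a single failure reported at four layers (secret.go, projected.go:277, projected.go:194, nestedpendingoperations.go): the projected volume "certificates" for keda-operator-ffbb595cb-vtw7z draws on two secrets, and at this instant keda-operator-certs does not exist yet while kedaorg-certs exists but carries no ca.crt key. The same mount succeeds at 22:20:52 further down, once the secrets are populated. A hedged client-go sketch for checking what those secrets actually contain at a given moment; the kubeconfig path is an assumption for illustration:

```go
// List the data keys present in the two secrets named in the errors above,
// to explain "not found" and "references non-existent secret key: ca.crt".
package main

import (
	"context"
	"fmt"
	"log"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/root/.kube/config") // assumed path
	if err != nil {
		log.Fatal(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}
	for _, name := range []string{"keda-operator-certs", "kedaorg-certs"} {
		s, err := cs.CoreV1().Secrets("openshift-keda").Get(context.TODO(), name, metav1.GetOptions{})
		if err != nil {
			fmt.Printf("%s: %v\n", name, err) // e.g. "not found", matching the log
			continue
		}
		for k := range s.Data {
			fmt.Printf("%s has key %s\n", name, k) // a missing ca.crt/tls.crt explains the E lines
		}
	}
}
```

Apr 16 22:20:45.380492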
ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.380461 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-xhkck"] Apr 16 22:20:45.383283 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.383255 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xhkck" Apr 16 22:20:45.388011 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.387990 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 16 22:20:45.394695 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.394674 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-xhkck"] Apr 16 22:20:45.435154 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.435103 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/99788344-c9ae-404a-b10b-bf5afcda04c1-certificates\") pod \"keda-metrics-apiserver-7c9f485588-xhkck\" (UID: \"99788344-c9ae-404a-b10b-bf5afcda04c1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xhkck" Apr 16 22:20:45.435356 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.435328 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45mvb\" (UniqueName: \"kubernetes.io/projected/99788344-c9ae-404a-b10b-bf5afcda04c1-kube-api-access-45mvb\") pod \"keda-metrics-apiserver-7c9f485588-xhkck\" (UID: \"99788344-c9ae-404a-b10b-bf5afcda04c1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xhkck" Apr 16 22:20:45.435533 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.435512 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/99788344-c9ae-404a-b10b-bf5afcda04c1-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-xhkck\" (UID: \"99788344-c9ae-404a-b10b-bf5afcda04c1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xhkck" Apr 16 22:20:45.536717 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.536677 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/99788344-c9ae-404a-b10b-bf5afcda04c1-certificates\") pod \"keda-metrics-apiserver-7c9f485588-xhkck\" (UID: \"99788344-c9ae-404a-b10b-bf5afcda04c1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xhkck" Apr 16 22:20:45.536901 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.536727 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-45mvb\" (UniqueName: \"kubernetes.io/projected/99788344-c9ae-404a-b10b-bf5afcda04c1-kube-api-access-45mvb\") pod \"keda-metrics-apiserver-7c9f485588-xhkck\" (UID: \"99788344-c9ae-404a-b10b-bf5afcda04c1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xhkck" Apr 16 22:20:45.536901 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.536776 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/99788344-c9ae-404a-b10b-bf5afcda04c1-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-xhkck\" (UID: \"99788344-c9ae-404a-b10b-bf5afcda04c1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xhkck" Apr 16 22:20:45.536901 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:45.536824 2565 
secret.go:281] references non-existent secret key: tls.crt Apr 16 22:20:45.536901 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:45.536847 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 22:20:45.536901 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:45.536866 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-xhkck: references non-existent secret key: tls.crt Apr 16 22:20:45.537093 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:45.536923 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/99788344-c9ae-404a-b10b-bf5afcda04c1-certificates podName:99788344-c9ae-404a-b10b-bf5afcda04c1 nodeName:}" failed. No retries permitted until 2026-04-16 22:20:46.036907163 +0000 UTC m=+419.582648569 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/99788344-c9ae-404a-b10b-bf5afcda04c1-certificates") pod "keda-metrics-apiserver-7c9f485588-xhkck" (UID: "99788344-c9ae-404a-b10b-bf5afcda04c1") : references non-existent secret key: tls.crt Apr 16 22:20:45.537180 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.537159 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/99788344-c9ae-404a-b10b-bf5afcda04c1-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-xhkck\" (UID: \"99788344-c9ae-404a-b10b-bf5afcda04c1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xhkck" Apr 16 22:20:45.545692 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.545654 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-45mvb\" (UniqueName: \"kubernetes.io/projected/99788344-c9ae-404a-b10b-bf5afcda04c1-kube-api-access-45mvb\") pod \"keda-metrics-apiserver-7c9f485588-xhkck\" (UID: \"99788344-c9ae-404a-b10b-bf5afcda04c1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xhkck" Apr 16 22:20:45.738299 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.738249 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ee1d91ca-f5ed-43d3-afe5-99538ca06f70-certificates\") pod \"keda-operator-ffbb595cb-vtw7z\" (UID: \"ee1d91ca-f5ed-43d3-afe5-99538ca06f70\") " pod="openshift-keda/keda-operator-ffbb595cb-vtw7z" Apr 16 22:20:45.738535 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:45.738416 2565 secret.go:281] references non-existent secret key: ca.crt Apr 16 22:20:45.738535 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:45.738440 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 22:20:45.738535 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:45.738450 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-vtw7z: references non-existent secret key: ca.crt Apr 16 22:20:45.738535 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:45.738506 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ee1d91ca-f5ed-43d3-afe5-99538ca06f70-certificates podName:ee1d91ca-f5ed-43d3-afe5-99538ca06f70 nodeName:}" failed. No retries permitted until 2026-04-16 22:20:46.738491872 +0000 UTC m=+420.284233276 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ee1d91ca-f5ed-43d3-afe5-99538ca06f70-certificates") pod "keda-operator-ffbb595cb-vtw7z" (UID: "ee1d91ca-f5ed-43d3-afe5-99538ca06f70") : references non-existent secret key: ca.crt Apr 16 22:20:45.750105 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.750070 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-lqvn7"] Apr 16 22:20:45.752818 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.752797 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-lqvn7" Apr 16 22:20:45.755082 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.755059 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 16 22:20:45.763245 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.763210 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-lqvn7"] Apr 16 22:20:45.838998 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.838960 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cab41558-8834-43bf-878f-fe1278565930-certificates\") pod \"keda-admission-cf49989db-lqvn7\" (UID: \"cab41558-8834-43bf-878f-fe1278565930\") " pod="openshift-keda/keda-admission-cf49989db-lqvn7" Apr 16 22:20:45.839187 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.839167 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72dc6\" (UniqueName: \"kubernetes.io/projected/cab41558-8834-43bf-878f-fe1278565930-kube-api-access-72dc6\") pod \"keda-admission-cf49989db-lqvn7\" (UID: \"cab41558-8834-43bf-878f-fe1278565930\") " pod="openshift-keda/keda-admission-cf49989db-lqvn7" Apr 16 22:20:45.939923 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.939880 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cab41558-8834-43bf-878f-fe1278565930-certificates\") pod \"keda-admission-cf49989db-lqvn7\" (UID: \"cab41558-8834-43bf-878f-fe1278565930\") " pod="openshift-keda/keda-admission-cf49989db-lqvn7" Apr 16 22:20:45.940106 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.940039 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-72dc6\" (UniqueName: \"kubernetes.io/projected/cab41558-8834-43bf-878f-fe1278565930-kube-api-access-72dc6\") pod \"keda-admission-cf49989db-lqvn7\" (UID: \"cab41558-8834-43bf-878f-fe1278565930\") " pod="openshift-keda/keda-admission-cf49989db-lqvn7" Apr 16 22:20:45.944288 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.943603 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cab41558-8834-43bf-878f-fe1278565930-certificates\") pod \"keda-admission-cf49989db-lqvn7\" (UID: \"cab41558-8834-43bf-878f-fe1278565930\") " pod="openshift-keda/keda-admission-cf49989db-lqvn7" Apr 16 22:20:45.948505 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:45.948480 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-72dc6\" (UniqueName: \"kubernetes.io/projected/cab41558-8834-43bf-878f-fe1278565930-kube-api-access-72dc6\") pod \"keda-admission-cf49989db-lqvn7\" (UID: 
\"cab41558-8834-43bf-878f-fe1278565930\") " pod="openshift-keda/keda-admission-cf49989db-lqvn7" Apr 16 22:20:46.041484 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:46.041378 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/99788344-c9ae-404a-b10b-bf5afcda04c1-certificates\") pod \"keda-metrics-apiserver-7c9f485588-xhkck\" (UID: \"99788344-c9ae-404a-b10b-bf5afcda04c1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xhkck" Apr 16 22:20:46.041661 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:46.041571 2565 secret.go:281] references non-existent secret key: tls.crt Apr 16 22:20:46.041661 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:46.041593 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 22:20:46.041661 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:46.041614 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-xhkck: references non-existent secret key: tls.crt Apr 16 22:20:46.041838 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:46.041684 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/99788344-c9ae-404a-b10b-bf5afcda04c1-certificates podName:99788344-c9ae-404a-b10b-bf5afcda04c1 nodeName:}" failed. No retries permitted until 2026-04-16 22:20:47.041663986 +0000 UTC m=+420.587405411 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/99788344-c9ae-404a-b10b-bf5afcda04c1-certificates") pod "keda-metrics-apiserver-7c9f485588-xhkck" (UID: "99788344-c9ae-404a-b10b-bf5afcda04c1") : references non-existent secret key: tls.crt Apr 16 22:20:46.065076 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:46.065036 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-lqvn7" Apr 16 22:20:46.210566 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:46.210537 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-lqvn7"] Apr 16 22:20:46.213761 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:20:46.213734 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcab41558_8834_43bf_878f_fe1278565930.slice/crio-3a6292baf3890ac18195cf95bea43a64700343ca3b89fc1f5785a1831acfedfe WatchSource:0}: Error finding container 3a6292baf3890ac18195cf95bea43a64700343ca3b89fc1f5785a1831acfedfe: Status 404 returned error can't find the container with id 3a6292baf3890ac18195cf95bea43a64700343ca3b89fc1f5785a1831acfedfe Apr 16 22:20:46.338600 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:46.338509 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-lqvn7" event={"ID":"cab41558-8834-43bf-878f-fe1278565930","Type":"ContainerStarted","Data":"3a6292baf3890ac18195cf95bea43a64700343ca3b89fc1f5785a1831acfedfe"} Apr 16 22:20:46.748664 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:46.748628 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ee1d91ca-f5ed-43d3-afe5-99538ca06f70-certificates\") pod \"keda-operator-ffbb595cb-vtw7z\" (UID: \"ee1d91ca-f5ed-43d3-afe5-99538ca06f70\") " pod="openshift-keda/keda-operator-ffbb595cb-vtw7z" Apr 16 22:20:46.748839 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:46.748752 2565 secret.go:281] references non-existent secret key: ca.crt Apr 16 22:20:46.748839 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:46.748765 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 22:20:46.748839 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:46.748773 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-vtw7z: references non-existent secret key: ca.crt Apr 16 22:20:46.748839 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:46.748829 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ee1d91ca-f5ed-43d3-afe5-99538ca06f70-certificates podName:ee1d91ca-f5ed-43d3-afe5-99538ca06f70 nodeName:}" failed. No retries permitted until 2026-04-16 22:20:48.748815177 +0000 UTC m=+422.294556582 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ee1d91ca-f5ed-43d3-afe5-99538ca06f70-certificates") pod "keda-operator-ffbb595cb-vtw7z" (UID: "ee1d91ca-f5ed-43d3-afe5-99538ca06f70") : references non-existent secret key: ca.crt Apr 16 22:20:47.051374 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:47.051290 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/99788344-c9ae-404a-b10b-bf5afcda04c1-certificates\") pod \"keda-metrics-apiserver-7c9f485588-xhkck\" (UID: \"99788344-c9ae-404a-b10b-bf5afcda04c1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xhkck" Apr 16 22:20:47.051558 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:47.051456 2565 secret.go:281] references non-existent secret key: tls.crt Apr 16 22:20:47.051558 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:47.051474 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 22:20:47.051558 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:47.051494 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-xhkck: references non-existent secret key: tls.crt Apr 16 22:20:47.051733 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:47.051567 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/99788344-c9ae-404a-b10b-bf5afcda04c1-certificates podName:99788344-c9ae-404a-b10b-bf5afcda04c1 nodeName:}" failed. No retries permitted until 2026-04-16 22:20:49.051534656 +0000 UTC m=+422.597276084 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/99788344-c9ae-404a-b10b-bf5afcda04c1-certificates") pod "keda-metrics-apiserver-7c9f485588-xhkck" (UID: "99788344-c9ae-404a-b10b-bf5afcda04c1") : references non-existent secret key: tls.crt Apr 16 22:20:48.768325 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:48.768236 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ee1d91ca-f5ed-43d3-afe5-99538ca06f70-certificates\") pod \"keda-operator-ffbb595cb-vtw7z\" (UID: \"ee1d91ca-f5ed-43d3-afe5-99538ca06f70\") " pod="openshift-keda/keda-operator-ffbb595cb-vtw7z" Apr 16 22:20:48.768728 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:48.768386 2565 secret.go:281] references non-existent secret key: ca.crt Apr 16 22:20:48.768728 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:48.768404 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 22:20:48.768728 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:48.768432 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-vtw7z: references non-existent secret key: ca.crt Apr 16 22:20:48.768728 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:48.768499 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ee1d91ca-f5ed-43d3-afe5-99538ca06f70-certificates podName:ee1d91ca-f5ed-43d3-afe5-99538ca06f70 nodeName:}" failed. No retries permitted until 2026-04-16 22:20:52.768479902 +0000 UTC m=+426.314221321 (durationBeforeRetry 4s). 
Apr 16 22:20:47.051374 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:47.051290 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/99788344-c9ae-404a-b10b-bf5afcda04c1-certificates\") pod \"keda-metrics-apiserver-7c9f485588-xhkck\" (UID: \"99788344-c9ae-404a-b10b-bf5afcda04c1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xhkck"
Apr 16 22:20:47.051558 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:47.051456 2565 secret.go:281] references non-existent secret key: tls.crt
Apr 16 22:20:47.051558 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:47.051474 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 22:20:47.051558 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:47.051494 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-xhkck: references non-existent secret key: tls.crt
Apr 16 22:20:47.051733 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:47.051567 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/99788344-c9ae-404a-b10b-bf5afcda04c1-certificates podName:99788344-c9ae-404a-b10b-bf5afcda04c1 nodeName:}" failed. No retries permitted until 2026-04-16 22:20:49.051534656 +0000 UTC m=+422.597276084 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/99788344-c9ae-404a-b10b-bf5afcda04c1-certificates") pod "keda-metrics-apiserver-7c9f485588-xhkck" (UID: "99788344-c9ae-404a-b10b-bf5afcda04c1") : references non-existent secret key: tls.crt
Apr 16 22:20:48.768325 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:48.768236 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ee1d91ca-f5ed-43d3-afe5-99538ca06f70-certificates\") pod \"keda-operator-ffbb595cb-vtw7z\" (UID: \"ee1d91ca-f5ed-43d3-afe5-99538ca06f70\") " pod="openshift-keda/keda-operator-ffbb595cb-vtw7z"
Apr 16 22:20:48.768728 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:48.768386 2565 secret.go:281] references non-existent secret key: ca.crt
Apr 16 22:20:48.768728 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:48.768404 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 22:20:48.768728 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:48.768432 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-vtw7z: references non-existent secret key: ca.crt
Apr 16 22:20:48.768728 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:48.768499 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ee1d91ca-f5ed-43d3-afe5-99538ca06f70-certificates podName:ee1d91ca-f5ed-43d3-afe5-99538ca06f70 nodeName:}" failed. No retries permitted until 2026-04-16 22:20:52.768479902 +0000 UTC m=+426.314221321 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ee1d91ca-f5ed-43d3-afe5-99538ca06f70-certificates") pod "keda-operator-ffbb595cb-vtw7z" (UID: "ee1d91ca-f5ed-43d3-afe5-99538ca06f70") : references non-existent secret key: ca.crt
Apr 16 22:20:49.070877 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:49.070788 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/99788344-c9ae-404a-b10b-bf5afcda04c1-certificates\") pod \"keda-metrics-apiserver-7c9f485588-xhkck\" (UID: \"99788344-c9ae-404a-b10b-bf5afcda04c1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xhkck"
Apr 16 22:20:49.071062 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:49.070955 2565 secret.go:281] references non-existent secret key: tls.crt
Apr 16 22:20:49.071062 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:49.070977 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 22:20:49.071062 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:49.071003 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-xhkck: references non-existent secret key: tls.crt
Apr 16 22:20:49.071231 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:20:49.071069 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/99788344-c9ae-404a-b10b-bf5afcda04c1-certificates podName:99788344-c9ae-404a-b10b-bf5afcda04c1 nodeName:}" failed. No retries permitted until 2026-04-16 22:20:53.071050116 +0000 UTC m=+426.616791521 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/99788344-c9ae-404a-b10b-bf5afcda04c1-certificates") pod "keda-metrics-apiserver-7c9f485588-xhkck" (UID: "99788344-c9ae-404a-b10b-bf5afcda04c1") : references non-existent secret key: tls.crt
Apr 16 22:20:49.354799 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:49.354702 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-lqvn7" event={"ID":"cab41558-8834-43bf-878f-fe1278565930","Type":"ContainerStarted","Data":"dd93159c4a320632ca0896d0b19b048dcd4668d54030f3f1b0637c2d2250269f"}
Apr 16 22:20:49.354979 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:49.354852 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-lqvn7"
Apr 16 22:20:49.373042 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:49.372984 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-lqvn7" podStartSLOduration=2.1675248639999998 podStartE2EDuration="4.37297091s" podCreationTimestamp="2026-04-16 22:20:45 +0000 UTC" firstStartedPulling="2026-04-16 22:20:46.215027205 +0000 UTC m=+419.760768614" lastFinishedPulling="2026-04-16 22:20:48.420473256 +0000 UTC m=+421.966214660" observedRunningTime="2026-04-16 22:20:49.371321303 +0000 UTC m=+422.917062732" watchObservedRunningTime="2026-04-16 22:20:49.37297091 +0000 UTC m=+422.918712336"
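The pod_startup_latency_tracker entry just above is internally consistent: podStartE2EDuration (4.37297091s) equals watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be that end-to-end figure minus the image-pull window (lastFinishedPulling - firstStartedPulling). Using the monotonic m=+ offsets, 421.966214660 - 419.760768614 = 2.205446046s of pulling, and 4.37297091 - 2.205446046 = 2.167524864, matching podStartSLOduration up to float64 printing. A sketch of the same arithmetic; the SLO formula here is inferred from the numbers, not lifted from the tracker's source:

```go
// sloduration.go - reproduce podStartSLOduration from the keda-admission
// latency entry above. All constants are copied from that log entry.
package main

import "fmt"

func main() {
	e2e := 4.37297091                    // podStartE2EDuration, seconds
	firstStartedPulling := 419.760768614 // m=+ monotonic offset, seconds
	lastFinishedPulling := 421.966214660 // m=+ monotonic offset, seconds

	slo := e2e - (lastFinishedPulling - firstStartedPulling)
	fmt.Println(slo) // ~2.167524864, matching podStartSLOduration above
}
```

The same check works for the keda-metrics-apiserver entry further down: 12.402539185 - (430.242875430 - 426.865280938) = 9.024944693, exactly its podStartSLOduration.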
\"ee1d91ca-f5ed-43d3-afe5-99538ca06f70\") " pod="openshift-keda/keda-operator-ffbb595cb-vtw7z" Apr 16 22:20:52.806256 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:52.806230 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ee1d91ca-f5ed-43d3-afe5-99538ca06f70-certificates\") pod \"keda-operator-ffbb595cb-vtw7z\" (UID: \"ee1d91ca-f5ed-43d3-afe5-99538ca06f70\") " pod="openshift-keda/keda-operator-ffbb595cb-vtw7z" Apr 16 22:20:52.871428 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:52.871377 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-vtw7z" Apr 16 22:20:53.004607 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:53.004584 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-vtw7z"] Apr 16 22:20:53.007052 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:20:53.007023 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee1d91ca_f5ed_43d3_afe5_99538ca06f70.slice/crio-1a4c200484ca901c47be040eac37a1aadadb3e7744381542063a25176e3a7e1f WatchSource:0}: Error finding container 1a4c200484ca901c47be040eac37a1aadadb3e7744381542063a25176e3a7e1f: Status 404 returned error can't find the container with id 1a4c200484ca901c47be040eac37a1aadadb3e7744381542063a25176e3a7e1f Apr 16 22:20:53.106986 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:53.106897 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/99788344-c9ae-404a-b10b-bf5afcda04c1-certificates\") pod \"keda-metrics-apiserver-7c9f485588-xhkck\" (UID: \"99788344-c9ae-404a-b10b-bf5afcda04c1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xhkck" Apr 16 22:20:53.109568 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:53.109532 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/99788344-c9ae-404a-b10b-bf5afcda04c1-certificates\") pod \"keda-metrics-apiserver-7c9f485588-xhkck\" (UID: \"99788344-c9ae-404a-b10b-bf5afcda04c1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xhkck" Apr 16 22:20:53.196554 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:53.196499 2565 util.go:30] "No sandbox for pod can be found. 
Apr 16 22:20:53.196554 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:53.196499 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xhkck"
Apr 16 22:20:53.315649 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:53.315597 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-xhkck"]
Apr 16 22:20:53.317904 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:20:53.317865 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99788344_c9ae_404a_b10b_bf5afcda04c1.slice/crio-76cb52d38a56e0fbd14605e9b071bc3b538592c56a4ec3ceef4da393e9db306a WatchSource:0}: Error finding container 76cb52d38a56e0fbd14605e9b071bc3b538592c56a4ec3ceef4da393e9db306a: Status 404 returned error can't find the container with id 76cb52d38a56e0fbd14605e9b071bc3b538592c56a4ec3ceef4da393e9db306a
Apr 16 22:20:53.369357 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:53.369273 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-vtw7z" event={"ID":"ee1d91ca-f5ed-43d3-afe5-99538ca06f70","Type":"ContainerStarted","Data":"1a4c200484ca901c47be040eac37a1aadadb3e7744381542063a25176e3a7e1f"}
Apr 16 22:20:53.370280 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:53.370255 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xhkck" event={"ID":"99788344-c9ae-404a-b10b-bf5afcda04c1","Type":"ContainerStarted","Data":"76cb52d38a56e0fbd14605e9b071bc3b538592c56a4ec3ceef4da393e9db306a"}
Apr 16 22:20:57.385848 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:57.385803 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xhkck" event={"ID":"99788344-c9ae-404a-b10b-bf5afcda04c1","Type":"ContainerStarted","Data":"2024a252e04c09cf7f2581fe16c30522642a5ccf65ad7a435addf301e17cdb39"}
Apr 16 22:20:57.386324 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:57.386049 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xhkck"
Apr 16 22:20:57.387141 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:57.387121 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-vtw7z" event={"ID":"ee1d91ca-f5ed-43d3-afe5-99538ca06f70","Type":"ContainerStarted","Data":"3ba1156b0b2ae813a323afe622c2e50c57a6d76f764e9b9571bbe4c187b190c3"}
Apr 16 22:20:57.387326 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:57.387311 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-vtw7z"
Apr 16 22:20:57.402596 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:57.402552 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xhkck" podStartSLOduration=9.024944693 podStartE2EDuration="12.402539185s" podCreationTimestamp="2026-04-16 22:20:45 +0000 UTC" firstStartedPulling="2026-04-16 22:20:53.319539533 +0000 UTC m=+426.865280938" lastFinishedPulling="2026-04-16 22:20:56.69713402 +0000 UTC m=+430.242875430" observedRunningTime="2026-04-16 22:20:57.400572768 +0000 UTC m=+430.946314194" watchObservedRunningTime="2026-04-16 22:20:57.402539185 +0000 UTC m=+430.948280613"
Apr 16 22:20:57.416788 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:20:57.416748 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-vtw7z" podStartSLOduration=8.724012135
podStartE2EDuration="12.416737602s" podCreationTimestamp="2026-04-16 22:20:45 +0000 UTC" firstStartedPulling="2026-04-16 22:20:53.008323101 +0000 UTC m=+426.554064506" lastFinishedPulling="2026-04-16 22:20:56.701048568 +0000 UTC m=+430.246789973" observedRunningTime="2026-04-16 22:20:57.414928997 +0000 UTC m=+430.960670424" watchObservedRunningTime="2026-04-16 22:20:57.416737602 +0000 UTC m=+430.962479029" Apr 16 22:21:06.341224 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:21:06.341191 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9fp8m" Apr 16 22:21:08.394276 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:21:08.394249 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xhkck" Apr 16 22:21:10.360626 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:21:10.360589 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-lqvn7" Apr 16 22:21:18.392132 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:21:18.392093 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-vtw7z" Apr 16 22:23:37.913744 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:23:37.913708 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp"] Apr 16 22:23:37.916262 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:23:37.916236 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" Apr 16 22:23:37.919533 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:23:37.919512 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-1-predictor-serving-cert\"" Apr 16 22:23:37.919665 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:23:37.919521 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 22:23:37.920065 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:23:37.920050 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 22:23:37.920118 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:23:37.920082 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\"" Apr 16 22:23:37.920470 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:23:37.920398 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4hqzk\"" Apr 16 22:23:37.927806 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:23:37.927780 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp"] Apr 16 22:23:37.992686 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:23:37.992653 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/76b913f1-1bb0-4011-ace6-1dd747747614-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp\" (UID: \"76b913f1-1bb0-4011-ace6-1dd747747614\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" Apr 16 22:23:37.992852 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:23:37.992695 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/76b913f1-1bb0-4011-ace6-1dd747747614-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp\" (UID: \"76b913f1-1bb0-4011-ace6-1dd747747614\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" Apr 16 22:23:37.992852 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:23:37.992724 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/76b913f1-1bb0-4011-ace6-1dd747747614-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp\" (UID: \"76b913f1-1bb0-4011-ace6-1dd747747614\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" Apr 16 22:23:37.992852 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:23:37.992836 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmfsm\" (UniqueName: \"kubernetes.io/projected/76b913f1-1bb0-4011-ace6-1dd747747614-kube-api-access-mmfsm\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp\" (UID: \"76b913f1-1bb0-4011-ace6-1dd747747614\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" Apr 16 22:23:38.094267 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:23:38.094232 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/76b913f1-1bb0-4011-ace6-1dd747747614-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp\" (UID: \"76b913f1-1bb0-4011-ace6-1dd747747614\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" Apr 16 22:23:38.094460 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:23:38.094285 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/76b913f1-1bb0-4011-ace6-1dd747747614-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp\" (UID: \"76b913f1-1bb0-4011-ace6-1dd747747614\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" Apr 16 22:23:38.094460 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:23:38.094347 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mmfsm\" (UniqueName: \"kubernetes.io/projected/76b913f1-1bb0-4011-ace6-1dd747747614-kube-api-access-mmfsm\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp\" (UID: \"76b913f1-1bb0-4011-ace6-1dd747747614\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" Apr 16 22:23:38.094460 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:23:38.094425 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/76b913f1-1bb0-4011-ace6-1dd747747614-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp\" (UID: \"76b913f1-1bb0-4011-ace6-1dd747747614\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" Apr 16 22:23:38.094851 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:23:38.094822 2565 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/76b913f1-1bb0-4011-ace6-1dd747747614-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp\" (UID: \"76b913f1-1bb0-4011-ace6-1dd747747614\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" Apr 16 22:23:38.095246 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:23:38.095208 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/76b913f1-1bb0-4011-ace6-1dd747747614-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp\" (UID: \"76b913f1-1bb0-4011-ace6-1dd747747614\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" Apr 16 22:23:38.097073 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:23:38.097030 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/76b913f1-1bb0-4011-ace6-1dd747747614-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp\" (UID: \"76b913f1-1bb0-4011-ace6-1dd747747614\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" Apr 16 22:23:38.108455 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:23:38.108401 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmfsm\" (UniqueName: \"kubernetes.io/projected/76b913f1-1bb0-4011-ace6-1dd747747614-kube-api-access-mmfsm\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp\" (UID: \"76b913f1-1bb0-4011-ace6-1dd747747614\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" Apr 16 22:23:38.226800 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:23:38.226769 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" Apr 16 22:23:38.370914 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:23:38.370859 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp"] Apr 16 22:23:38.373445 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:23:38.373395 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76b913f1_1bb0_4011_ace6_1dd747747614.slice/crio-4360916fb6b3e6771390e8e4b31b17e6c50acc57ac082024da716cd6409f9693 WatchSource:0}: Error finding container 4360916fb6b3e6771390e8e4b31b17e6c50acc57ac082024da716cd6409f9693: Status 404 returned error can't find the container with id 4360916fb6b3e6771390e8e4b31b17e6c50acc57ac082024da716cd6409f9693 Apr 16 22:23:38.915483 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:23:38.915444 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" event={"ID":"76b913f1-1bb0-4011-ace6-1dd747747614","Type":"ContainerStarted","Data":"4360916fb6b3e6771390e8e4b31b17e6c50acc57ac082024da716cd6409f9693"} Apr 16 22:23:41.929330 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:23:41.929287 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" event={"ID":"76b913f1-1bb0-4011-ace6-1dd747747614","Type":"ContainerStarted","Data":"0952d3e47b52bf115d7bb6107eccf960f739eb6bf815e33f915dd2e1e08f987a"} Apr 16 22:23:45.942772 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:23:45.942736 2565 generic.go:358] "Generic (PLEG): container finished" podID="76b913f1-1bb0-4011-ace6-1dd747747614" containerID="0952d3e47b52bf115d7bb6107eccf960f739eb6bf815e33f915dd2e1e08f987a" exitCode=0 Apr 16 22:23:45.943141 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:23:45.942812 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" event={"ID":"76b913f1-1bb0-4011-ace6-1dd747747614","Type":"ContainerDied","Data":"0952d3e47b52bf115d7bb6107eccf960f739eb6bf815e33f915dd2e1e08f987a"} Apr 16 22:23:58.996951 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:23:58.996905 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" event={"ID":"76b913f1-1bb0-4011-ace6-1dd747747614","Type":"ContainerStarted","Data":"761110078dcf558dbfaae29c615a8b7d030af0e711de93c9f6c5c2e579797f24"} Apr 16 22:24:01.006167 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:24:01.006134 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" event={"ID":"76b913f1-1bb0-4011-ace6-1dd747747614","Type":"ContainerStarted","Data":"53025ec1f7344da6110241017a9d1594f00b3e8b07b9b8c30476dacaa86d6f25"} Apr 16 22:24:01.006546 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:24:01.006435 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" Apr 16 22:24:01.023686 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:24:01.023632 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" podStartSLOduration=1.5426110990000002 podStartE2EDuration="24.023617192s" podCreationTimestamp="2026-04-16 22:23:37 +0000 UTC" 
firstStartedPulling="2026-04-16 22:23:38.37556955 +0000 UTC m=+591.921310969" lastFinishedPulling="2026-04-16 22:24:00.856575641 +0000 UTC m=+614.402317062" observedRunningTime="2026-04-16 22:24:01.023390921 +0000 UTC m=+614.569132349" watchObservedRunningTime="2026-04-16 22:24:01.023617192 +0000 UTC m=+614.569358620" Apr 16 22:24:02.010171 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:24:02.010138 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" Apr 16 22:24:02.011261 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:24:02.011236 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" podUID="76b913f1-1bb0-4011-ace6-1dd747747614" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 16 22:24:03.013331 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:24:03.013277 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" podUID="76b913f1-1bb0-4011-ace6-1dd747747614" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 16 22:24:08.017358 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:24:08.017327 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" Apr 16 22:24:08.017874 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:24:08.017848 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" podUID="76b913f1-1bb0-4011-ace6-1dd747747614" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 16 22:24:18.018713 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:24:18.018675 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" podUID="76b913f1-1bb0-4011-ace6-1dd747747614" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 16 22:24:28.017908 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:24:28.017868 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" podUID="76b913f1-1bb0-4011-ace6-1dd747747614" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 16 22:24:38.017870 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:24:38.017819 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" podUID="76b913f1-1bb0-4011-ace6-1dd747747614" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 16 22:24:48.021247 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:24:48.021195 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" podUID="76b913f1-1bb0-4011-ace6-1dd747747614" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 16 22:24:57.872759 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:24:57.872719 2565 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/switch-graph-37a6e-87cd6b4d8-vs6ng"] Apr 16 22:24:57.875042 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:24:57.875026 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-37a6e-87cd6b4d8-vs6ng" Apr 16 22:24:57.877253 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:24:57.877232 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-37a6e-kube-rbac-proxy-sar-config\"" Apr 16 22:24:57.877367 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:24:57.877231 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-37a6e-serving-cert\"" Apr 16 22:24:57.885185 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:24:57.885161 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-37a6e-87cd6b4d8-vs6ng"] Apr 16 22:24:58.012615 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:24:58.012573 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ac66a6a-9486-4437-af45-90124dd85e73-openshift-service-ca-bundle\") pod \"switch-graph-37a6e-87cd6b4d8-vs6ng\" (UID: \"6ac66a6a-9486-4437-af45-90124dd85e73\") " pod="kserve-ci-e2e-test/switch-graph-37a6e-87cd6b4d8-vs6ng" Apr 16 22:24:58.012802 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:24:58.012649 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6ac66a6a-9486-4437-af45-90124dd85e73-proxy-tls\") pod \"switch-graph-37a6e-87cd6b4d8-vs6ng\" (UID: \"6ac66a6a-9486-4437-af45-90124dd85e73\") " pod="kserve-ci-e2e-test/switch-graph-37a6e-87cd6b4d8-vs6ng" Apr 16 22:24:58.018149 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:24:58.018113 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" podUID="76b913f1-1bb0-4011-ace6-1dd747747614" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 16 22:24:58.114098 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:24:58.114055 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ac66a6a-9486-4437-af45-90124dd85e73-openshift-service-ca-bundle\") pod \"switch-graph-37a6e-87cd6b4d8-vs6ng\" (UID: \"6ac66a6a-9486-4437-af45-90124dd85e73\") " pod="kserve-ci-e2e-test/switch-graph-37a6e-87cd6b4d8-vs6ng" Apr 16 22:24:58.114279 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:24:58.114116 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6ac66a6a-9486-4437-af45-90124dd85e73-proxy-tls\") pod \"switch-graph-37a6e-87cd6b4d8-vs6ng\" (UID: \"6ac66a6a-9486-4437-af45-90124dd85e73\") " pod="kserve-ci-e2e-test/switch-graph-37a6e-87cd6b4d8-vs6ng" Apr 16 22:24:58.114321 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:24:58.114297 2565 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-37a6e-serving-cert: secret "switch-graph-37a6e-serving-cert" not found Apr 16 22:24:58.114509 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:24:58.114493 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ac66a6a-9486-4437-af45-90124dd85e73-proxy-tls 
podName:6ac66a6a-9486-4437-af45-90124dd85e73 nodeName:}" failed. No retries permitted until 2026-04-16 22:24:58.614470066 +0000 UTC m=+672.160211475 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/6ac66a6a-9486-4437-af45-90124dd85e73-proxy-tls") pod "switch-graph-37a6e-87cd6b4d8-vs6ng" (UID: "6ac66a6a-9486-4437-af45-90124dd85e73") : secret "switch-graph-37a6e-serving-cert" not found
Apr 16 22:24:58.114938 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:24:58.114914 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ac66a6a-9486-4437-af45-90124dd85e73-openshift-service-ca-bundle\") pod \"switch-graph-37a6e-87cd6b4d8-vs6ng\" (UID: \"6ac66a6a-9486-4437-af45-90124dd85e73\") " pod="kserve-ci-e2e-test/switch-graph-37a6e-87cd6b4d8-vs6ng"
Apr 16 22:24:58.617309 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:24:58.617267 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6ac66a6a-9486-4437-af45-90124dd85e73-proxy-tls\") pod \"switch-graph-37a6e-87cd6b4d8-vs6ng\" (UID: \"6ac66a6a-9486-4437-af45-90124dd85e73\") " pod="kserve-ci-e2e-test/switch-graph-37a6e-87cd6b4d8-vs6ng"
Apr 16 22:24:58.619767 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:24:58.619740 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6ac66a6a-9486-4437-af45-90124dd85e73-proxy-tls\") pod \"switch-graph-37a6e-87cd6b4d8-vs6ng\" (UID: \"6ac66a6a-9486-4437-af45-90124dd85e73\") " pod="kserve-ci-e2e-test/switch-graph-37a6e-87cd6b4d8-vs6ng"
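A different flavor of the earlier race: here the whole secret is missing (secret.go:189 reports "not found" rather than a missing key), apparently because the pod was created before the serving cert switch-graph-37a6e-serving-cert had been minted, and the kubelet's first retry lands 500ms later and succeeds. Tooling that needs to wait out such a window can poll for the secret the same way. A client-go sketch; the kubeconfig access and the 30s timeout are assumptions, and the namespace, secret name, and 500ms interval come from the log:

```go
// waitforsecret.go - poll until a secret exists, the condition the kubelet
// retried on above.
package main

import (
	"context"
	"fmt"
	"time"

	apierrors "k8s.io/apimachinery/pkg/api/errors"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/util/wait"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	rules := clientcmd.NewDefaultClientConfigLoadingRules()
	cfg, err := clientcmd.BuildConfigFromFlags("", rules.GetDefaultFilename())
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	err = wait.PollUntilContextTimeout(context.Background(), 500*time.Millisecond, 30*time.Second, true,
		func(ctx context.Context) (bool, error) {
			_, err := cs.CoreV1().Secrets("kserve-ci-e2e-test").Get(ctx, "switch-graph-37a6e-serving-cert", metav1.GetOptions{})
			if apierrors.IsNotFound(err) {
				return false, nil // not there yet: keep polling, as the kubelet did
			}
			return err == nil, err // done on success, abort on any other error
		})
	fmt.Println("secret present:", err == nil)
}
```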
Apr 16 22:24:58.786126 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:24:58.786090 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-37a6e-87cd6b4d8-vs6ng"
Apr 16 22:24:58.904026 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:24:58.903873 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-37a6e-87cd6b4d8-vs6ng"]
Apr 16 22:24:58.906751 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:24:58.906723 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ac66a6a_9486_4437_af45_90124dd85e73.slice/crio-54ee2e920a42829d4c597cf9efda6bc00ab96041ec6448f3c1765f67ac2205fc WatchSource:0}: Error finding container 54ee2e920a42829d4c597cf9efda6bc00ab96041ec6448f3c1765f67ac2205fc: Status 404 returned error can't find the container with id 54ee2e920a42829d4c597cf9efda6bc00ab96041ec6448f3c1765f67ac2205fc
Apr 16 22:24:58.908484 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:24:58.908466 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 22:24:59.196441 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:24:59.196378 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-37a6e-87cd6b4d8-vs6ng" event={"ID":"6ac66a6a-9486-4437-af45-90124dd85e73","Type":"ContainerStarted","Data":"54ee2e920a42829d4c597cf9efda6bc00ab96041ec6448f3c1765f67ac2205fc"}
Apr 16 22:25:02.208793 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:02.208757 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-37a6e-87cd6b4d8-vs6ng" event={"ID":"6ac66a6a-9486-4437-af45-90124dd85e73","Type":"ContainerStarted","Data":"0e3e121e092d77f8d2d458d20b02d5d1eb80815363cd45421b25d5fbe3520e01"}
Apr 16 22:25:02.209166 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:02.208889 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-37a6e-87cd6b4d8-vs6ng"
Apr 16 22:25:02.225302 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:02.225252 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-37a6e-87cd6b4d8-vs6ng" podStartSLOduration=2.736721828 podStartE2EDuration="5.225236462s" podCreationTimestamp="2026-04-16 22:24:57 +0000 UTC" firstStartedPulling="2026-04-16 22:24:58.908594105 +0000 UTC m=+672.454335511" lastFinishedPulling="2026-04-16 22:25:01.39710874 +0000 UTC m=+674.942850145" observedRunningTime="2026-04-16 22:25:02.223215307 +0000 UTC m=+675.768956734" watchObservedRunningTime="2026-04-16 22:25:02.225236462 +0000 UTC m=+675.770977889"
Apr 16 22:25:08.018589 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:08.018561 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp"
Apr 16 22:25:08.218915 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:08.218882 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-37a6e-87cd6b4d8-vs6ng"
Apr 16 22:25:12.063978 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:12.063948 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-37a6e-87cd6b4d8-vs6ng"]
Apr 16 22:25:12.064385 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:12.064156 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-37a6e-87cd6b4d8-vs6ng" podUID="6ac66a6a-9486-4437-af45-90124dd85e73" containerName="switch-graph-37a6e"
containerID="cri-o://0e3e121e092d77f8d2d458d20b02d5d1eb80815363cd45421b25d5fbe3520e01" gracePeriod=30 Apr 16 22:25:13.215677 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:13.215638 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-37a6e-87cd6b4d8-vs6ng" podUID="6ac66a6a-9486-4437-af45-90124dd85e73" containerName="switch-graph-37a6e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:25:18.216329 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:18.216293 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-37a6e-87cd6b4d8-vs6ng" podUID="6ac66a6a-9486-4437-af45-90124dd85e73" containerName="switch-graph-37a6e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:25:23.216361 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:23.216318 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-37a6e-87cd6b4d8-vs6ng" podUID="6ac66a6a-9486-4437-af45-90124dd85e73" containerName="switch-graph-37a6e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:25:23.216789 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:23.216454 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-37a6e-87cd6b4d8-vs6ng" Apr 16 22:25:28.216328 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:28.216288 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-37a6e-87cd6b4d8-vs6ng" podUID="6ac66a6a-9486-4437-af45-90124dd85e73" containerName="switch-graph-37a6e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:25:33.215970 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:33.215873 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-37a6e-87cd6b4d8-vs6ng" podUID="6ac66a6a-9486-4437-af45-90124dd85e73" containerName="switch-graph-37a6e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:25:37.859133 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:37.859101 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-7689c79995-cdt2f"] Apr 16 22:25:37.865689 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:37.865669 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-7689c79995-cdt2f" Apr 16 22:25:37.867985 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:37.867963 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-serving-cert\"" Apr 16 22:25:37.868089 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:37.867965 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-kube-rbac-proxy-sar-config\"" Apr 16 22:25:37.872096 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:37.872074 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-7689c79995-cdt2f"] Apr 16 22:25:37.958656 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:37.958628 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/247e04b5-97a7-4061-85b6-1afc7cd1a18a-proxy-tls\") pod \"model-chainer-7689c79995-cdt2f\" (UID: \"247e04b5-97a7-4061-85b6-1afc7cd1a18a\") " pod="kserve-ci-e2e-test/model-chainer-7689c79995-cdt2f" Apr 16 22:25:37.958827 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:37.958668 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/247e04b5-97a7-4061-85b6-1afc7cd1a18a-openshift-service-ca-bundle\") pod \"model-chainer-7689c79995-cdt2f\" (UID: \"247e04b5-97a7-4061-85b6-1afc7cd1a18a\") " pod="kserve-ci-e2e-test/model-chainer-7689c79995-cdt2f" Apr 16 22:25:38.059070 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:38.059039 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/247e04b5-97a7-4061-85b6-1afc7cd1a18a-proxy-tls\") pod \"model-chainer-7689c79995-cdt2f\" (UID: \"247e04b5-97a7-4061-85b6-1afc7cd1a18a\") " pod="kserve-ci-e2e-test/model-chainer-7689c79995-cdt2f" Apr 16 22:25:38.059259 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:38.059076 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/247e04b5-97a7-4061-85b6-1afc7cd1a18a-openshift-service-ca-bundle\") pod \"model-chainer-7689c79995-cdt2f\" (UID: \"247e04b5-97a7-4061-85b6-1afc7cd1a18a\") " pod="kserve-ci-e2e-test/model-chainer-7689c79995-cdt2f" Apr 16 22:25:38.059738 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:38.059720 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/247e04b5-97a7-4061-85b6-1afc7cd1a18a-openshift-service-ca-bundle\") pod \"model-chainer-7689c79995-cdt2f\" (UID: \"247e04b5-97a7-4061-85b6-1afc7cd1a18a\") " pod="kserve-ci-e2e-test/model-chainer-7689c79995-cdt2f" Apr 16 22:25:38.061401 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:38.061383 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/247e04b5-97a7-4061-85b6-1afc7cd1a18a-proxy-tls\") pod \"model-chainer-7689c79995-cdt2f\" (UID: \"247e04b5-97a7-4061-85b6-1afc7cd1a18a\") " pod="kserve-ci-e2e-test/model-chainer-7689c79995-cdt2f" Apr 16 22:25:38.177177 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:38.177142 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-7689c79995-cdt2f" Apr 16 22:25:38.217293 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:38.217251 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-37a6e-87cd6b4d8-vs6ng" podUID="6ac66a6a-9486-4437-af45-90124dd85e73" containerName="switch-graph-37a6e" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:25:38.295017 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:38.294984 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-7689c79995-cdt2f"] Apr 16 22:25:38.298248 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:25:38.298219 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod247e04b5_97a7_4061_85b6_1afc7cd1a18a.slice/crio-1d88598b8b9d4529bd32a7af16cf116aa7fd515f8666d26684d34d64c8d5ab3a WatchSource:0}: Error finding container 1d88598b8b9d4529bd32a7af16cf116aa7fd515f8666d26684d34d64c8d5ab3a: Status 404 returned error can't find the container with id 1d88598b8b9d4529bd32a7af16cf116aa7fd515f8666d26684d34d64c8d5ab3a Apr 16 22:25:38.335645 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:38.335610 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-7689c79995-cdt2f" event={"ID":"247e04b5-97a7-4061-85b6-1afc7cd1a18a","Type":"ContainerStarted","Data":"1d88598b8b9d4529bd32a7af16cf116aa7fd515f8666d26684d34d64c8d5ab3a"} Apr 16 22:25:39.340058 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:39.340017 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-7689c79995-cdt2f" event={"ID":"247e04b5-97a7-4061-85b6-1afc7cd1a18a","Type":"ContainerStarted","Data":"89c28e148e80eb6c4b98595979ebd93d15099cc428dcbf655c11cfe5d21ef80f"} Apr 16 22:25:39.340058 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:39.340058 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-7689c79995-cdt2f" Apr 16 22:25:39.355456 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:39.355385 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-7689c79995-cdt2f" podStartSLOduration=2.355371586 podStartE2EDuration="2.355371586s" podCreationTimestamp="2026-04-16 22:25:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:25:39.354519713 +0000 UTC m=+712.900261140" watchObservedRunningTime="2026-04-16 22:25:39.355371586 +0000 UTC m=+712.901113013" Apr 16 22:25:42.211900 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:42.211877 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-37a6e-87cd6b4d8-vs6ng" Apr 16 22:25:42.351188 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:42.351093 2565 generic.go:358] "Generic (PLEG): container finished" podID="6ac66a6a-9486-4437-af45-90124dd85e73" containerID="0e3e121e092d77f8d2d458d20b02d5d1eb80815363cd45421b25d5fbe3520e01" exitCode=0 Apr 16 22:25:42.351188 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:42.351156 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-37a6e-87cd6b4d8-vs6ng" Apr 16 22:25:42.351188 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:42.351179 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-37a6e-87cd6b4d8-vs6ng" event={"ID":"6ac66a6a-9486-4437-af45-90124dd85e73","Type":"ContainerDied","Data":"0e3e121e092d77f8d2d458d20b02d5d1eb80815363cd45421b25d5fbe3520e01"} Apr 16 22:25:42.351459 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:42.351216 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-37a6e-87cd6b4d8-vs6ng" event={"ID":"6ac66a6a-9486-4437-af45-90124dd85e73","Type":"ContainerDied","Data":"54ee2e920a42829d4c597cf9efda6bc00ab96041ec6448f3c1765f67ac2205fc"} Apr 16 22:25:42.351459 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:42.351232 2565 scope.go:117] "RemoveContainer" containerID="0e3e121e092d77f8d2d458d20b02d5d1eb80815363cd45421b25d5fbe3520e01" Apr 16 22:25:42.360643 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:42.360622 2565 scope.go:117] "RemoveContainer" containerID="0e3e121e092d77f8d2d458d20b02d5d1eb80815363cd45421b25d5fbe3520e01" Apr 16 22:25:42.360988 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:25:42.360966 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e3e121e092d77f8d2d458d20b02d5d1eb80815363cd45421b25d5fbe3520e01\": container with ID starting with 0e3e121e092d77f8d2d458d20b02d5d1eb80815363cd45421b25d5fbe3520e01 not found: ID does not exist" containerID="0e3e121e092d77f8d2d458d20b02d5d1eb80815363cd45421b25d5fbe3520e01" Apr 16 22:25:42.361042 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:42.360998 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e3e121e092d77f8d2d458d20b02d5d1eb80815363cd45421b25d5fbe3520e01"} err="failed to get container status \"0e3e121e092d77f8d2d458d20b02d5d1eb80815363cd45421b25d5fbe3520e01\": rpc error: code = NotFound desc = could not find container \"0e3e121e092d77f8d2d458d20b02d5d1eb80815363cd45421b25d5fbe3520e01\": container with ID starting with 0e3e121e092d77f8d2d458d20b02d5d1eb80815363cd45421b25d5fbe3520e01 not found: ID does not exist" Apr 16 22:25:42.394606 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:42.394569 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6ac66a6a-9486-4437-af45-90124dd85e73-proxy-tls\") pod \"6ac66a6a-9486-4437-af45-90124dd85e73\" (UID: \"6ac66a6a-9486-4437-af45-90124dd85e73\") " Apr 16 22:25:42.394784 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:42.394660 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ac66a6a-9486-4437-af45-90124dd85e73-openshift-service-ca-bundle\") pod \"6ac66a6a-9486-4437-af45-90124dd85e73\" (UID: \"6ac66a6a-9486-4437-af45-90124dd85e73\") " Apr 16 22:25:42.395041 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:42.395016 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ac66a6a-9486-4437-af45-90124dd85e73-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "6ac66a6a-9486-4437-af45-90124dd85e73" (UID: "6ac66a6a-9486-4437-af45-90124dd85e73"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:25:42.396855 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:42.396826 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac66a6a-9486-4437-af45-90124dd85e73-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6ac66a6a-9486-4437-af45-90124dd85e73" (UID: "6ac66a6a-9486-4437-af45-90124dd85e73"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:25:42.495951 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:42.495918 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6ac66a6a-9486-4437-af45-90124dd85e73-proxy-tls\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:25:42.495951 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:42.495945 2565 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ac66a6a-9486-4437-af45-90124dd85e73-openshift-service-ca-bundle\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:25:42.672548 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:42.672517 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-37a6e-87cd6b4d8-vs6ng"] Apr 16 22:25:42.674955 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:42.674928 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-37a6e-87cd6b4d8-vs6ng"] Apr 16 22:25:43.018817 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:43.018785 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ac66a6a-9486-4437-af45-90124dd85e73" path="/var/lib/kubelet/pods/6ac66a6a-9486-4437-af45-90124dd85e73/volumes" Apr 16 22:25:45.349681 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:45.349652 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-7689c79995-cdt2f" Apr 16 22:25:47.946476 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:47.946447 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-7689c79995-cdt2f"] Apr 16 22:25:47.946828 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:47.946664 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-7689c79995-cdt2f" podUID="247e04b5-97a7-4061-85b6-1afc7cd1a18a" containerName="model-chainer" containerID="cri-o://89c28e148e80eb6c4b98595979ebd93d15099cc428dcbf655c11cfe5d21ef80f" gracePeriod=30 Apr 16 22:25:48.113649 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:48.113616 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp"] Apr 16 22:25:48.113976 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:48.113951 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" podUID="76b913f1-1bb0-4011-ace6-1dd747747614" containerName="kserve-container" containerID="cri-o://761110078dcf558dbfaae29c615a8b7d030af0e711de93c9f6c5c2e579797f24" gracePeriod=30 Apr 16 22:25:48.114050 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:48.113986 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" podUID="76b913f1-1bb0-4011-ace6-1dd747747614" containerName="kube-rbac-proxy" 
containerID="cri-o://53025ec1f7344da6110241017a9d1594f00b3e8b07b9b8c30476dacaa86d6f25" gracePeriod=30 Apr 16 22:25:48.375043 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:48.375008 2565 generic.go:358] "Generic (PLEG): container finished" podID="76b913f1-1bb0-4011-ace6-1dd747747614" containerID="53025ec1f7344da6110241017a9d1594f00b3e8b07b9b8c30476dacaa86d6f25" exitCode=2 Apr 16 22:25:48.375213 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:48.375081 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" event={"ID":"76b913f1-1bb0-4011-ace6-1dd747747614","Type":"ContainerDied","Data":"53025ec1f7344da6110241017a9d1594f00b3e8b07b9b8c30476dacaa86d6f25"} Apr 16 22:25:50.348158 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:50.348120 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-7689c79995-cdt2f" podUID="247e04b5-97a7-4061-85b6-1afc7cd1a18a" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:25:52.662337 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:52.662312 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" Apr 16 22:25:52.679618 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:52.679592 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/76b913f1-1bb0-4011-ace6-1dd747747614-kserve-provision-location\") pod \"76b913f1-1bb0-4011-ace6-1dd747747614\" (UID: \"76b913f1-1bb0-4011-ace6-1dd747747614\") " Apr 16 22:25:52.679754 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:52.679639 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/76b913f1-1bb0-4011-ace6-1dd747747614-proxy-tls\") pod \"76b913f1-1bb0-4011-ace6-1dd747747614\" (UID: \"76b913f1-1bb0-4011-ace6-1dd747747614\") " Apr 16 22:25:52.679754 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:52.679655 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmfsm\" (UniqueName: \"kubernetes.io/projected/76b913f1-1bb0-4011-ace6-1dd747747614-kube-api-access-mmfsm\") pod \"76b913f1-1bb0-4011-ace6-1dd747747614\" (UID: \"76b913f1-1bb0-4011-ace6-1dd747747614\") " Apr 16 22:25:52.679754 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:52.679680 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/76b913f1-1bb0-4011-ace6-1dd747747614-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"76b913f1-1bb0-4011-ace6-1dd747747614\" (UID: \"76b913f1-1bb0-4011-ace6-1dd747747614\") " Apr 16 22:25:52.679998 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:52.679959 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76b913f1-1bb0-4011-ace6-1dd747747614-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "76b913f1-1bb0-4011-ace6-1dd747747614" (UID: "76b913f1-1bb0-4011-ace6-1dd747747614"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:25:52.680061 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:52.680039 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76b913f1-1bb0-4011-ace6-1dd747747614-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-1-kube-rbac-proxy-sar-config") pod "76b913f1-1bb0-4011-ace6-1dd747747614" (UID: "76b913f1-1bb0-4011-ace6-1dd747747614"). InnerVolumeSpecName "isvc-sklearn-graph-1-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:25:52.682175 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:52.682134 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76b913f1-1bb0-4011-ace6-1dd747747614-kube-api-access-mmfsm" (OuterVolumeSpecName: "kube-api-access-mmfsm") pod "76b913f1-1bb0-4011-ace6-1dd747747614" (UID: "76b913f1-1bb0-4011-ace6-1dd747747614"). InnerVolumeSpecName "kube-api-access-mmfsm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:25:52.682402 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:52.682375 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76b913f1-1bb0-4011-ace6-1dd747747614-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "76b913f1-1bb0-4011-ace6-1dd747747614" (UID: "76b913f1-1bb0-4011-ace6-1dd747747614"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:25:52.781187 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:52.781151 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/76b913f1-1bb0-4011-ace6-1dd747747614-kserve-provision-location\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:25:52.781187 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:52.781181 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/76b913f1-1bb0-4011-ace6-1dd747747614-proxy-tls\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:25:52.781187 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:52.781191 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mmfsm\" (UniqueName: \"kubernetes.io/projected/76b913f1-1bb0-4011-ace6-1dd747747614-kube-api-access-mmfsm\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:25:52.781440 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:52.781207 2565 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/76b913f1-1bb0-4011-ace6-1dd747747614-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:25:53.393495 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:53.393362 2565 generic.go:358] "Generic (PLEG): container finished" podID="76b913f1-1bb0-4011-ace6-1dd747747614" containerID="761110078dcf558dbfaae29c615a8b7d030af0e711de93c9f6c5c2e579797f24" exitCode=0 Apr 16 22:25:53.393674 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:53.393531 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" event={"ID":"76b913f1-1bb0-4011-ace6-1dd747747614","Type":"ContainerDied","Data":"761110078dcf558dbfaae29c615a8b7d030af0e711de93c9f6c5c2e579797f24"} Apr 16 22:25:53.393674 ip-10-0-141-169 
kubenswrapper[2565]: I0416 22:25:53.393603 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" event={"ID":"76b913f1-1bb0-4011-ace6-1dd747747614","Type":"ContainerDied","Data":"4360916fb6b3e6771390e8e4b31b17e6c50acc57ac082024da716cd6409f9693"} Apr 16 22:25:53.393674 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:53.393629 2565 scope.go:117] "RemoveContainer" containerID="53025ec1f7344da6110241017a9d1594f00b3e8b07b9b8c30476dacaa86d6f25" Apr 16 22:25:53.393806 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:53.393720 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp" Apr 16 22:25:53.408227 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:53.408203 2565 scope.go:117] "RemoveContainer" containerID="761110078dcf558dbfaae29c615a8b7d030af0e711de93c9f6c5c2e579797f24" Apr 16 22:25:53.414553 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:53.414521 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp"] Apr 16 22:25:53.416280 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:53.416263 2565 scope.go:117] "RemoveContainer" containerID="0952d3e47b52bf115d7bb6107eccf960f739eb6bf815e33f915dd2e1e08f987a" Apr 16 22:25:53.417887 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:53.417870 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-f8qcp"] Apr 16 22:25:53.423576 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:53.423562 2565 scope.go:117] "RemoveContainer" containerID="53025ec1f7344da6110241017a9d1594f00b3e8b07b9b8c30476dacaa86d6f25" Apr 16 22:25:53.423851 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:25:53.423828 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53025ec1f7344da6110241017a9d1594f00b3e8b07b9b8c30476dacaa86d6f25\": container with ID starting with 53025ec1f7344da6110241017a9d1594f00b3e8b07b9b8c30476dacaa86d6f25 not found: ID does not exist" containerID="53025ec1f7344da6110241017a9d1594f00b3e8b07b9b8c30476dacaa86d6f25" Apr 16 22:25:53.423890 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:53.423867 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53025ec1f7344da6110241017a9d1594f00b3e8b07b9b8c30476dacaa86d6f25"} err="failed to get container status \"53025ec1f7344da6110241017a9d1594f00b3e8b07b9b8c30476dacaa86d6f25\": rpc error: code = NotFound desc = could not find container \"53025ec1f7344da6110241017a9d1594f00b3e8b07b9b8c30476dacaa86d6f25\": container with ID starting with 53025ec1f7344da6110241017a9d1594f00b3e8b07b9b8c30476dacaa86d6f25 not found: ID does not exist" Apr 16 22:25:53.423890 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:53.423888 2565 scope.go:117] "RemoveContainer" containerID="761110078dcf558dbfaae29c615a8b7d030af0e711de93c9f6c5c2e579797f24" Apr 16 22:25:53.424134 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:25:53.424113 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"761110078dcf558dbfaae29c615a8b7d030af0e711de93c9f6c5c2e579797f24\": container with ID starting with 761110078dcf558dbfaae29c615a8b7d030af0e711de93c9f6c5c2e579797f24 not found: ID does not exist" containerID="761110078dcf558dbfaae29c615a8b7d030af0e711de93c9f6c5c2e579797f24" Apr 
16 22:25:53.424226 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:53.424139 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"761110078dcf558dbfaae29c615a8b7d030af0e711de93c9f6c5c2e579797f24"} err="failed to get container status \"761110078dcf558dbfaae29c615a8b7d030af0e711de93c9f6c5c2e579797f24\": rpc error: code = NotFound desc = could not find container \"761110078dcf558dbfaae29c615a8b7d030af0e711de93c9f6c5c2e579797f24\": container with ID starting with 761110078dcf558dbfaae29c615a8b7d030af0e711de93c9f6c5c2e579797f24 not found: ID does not exist" Apr 16 22:25:53.424226 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:53.424155 2565 scope.go:117] "RemoveContainer" containerID="0952d3e47b52bf115d7bb6107eccf960f739eb6bf815e33f915dd2e1e08f987a" Apr 16 22:25:53.424361 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:25:53.424346 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0952d3e47b52bf115d7bb6107eccf960f739eb6bf815e33f915dd2e1e08f987a\": container with ID starting with 0952d3e47b52bf115d7bb6107eccf960f739eb6bf815e33f915dd2e1e08f987a not found: ID does not exist" containerID="0952d3e47b52bf115d7bb6107eccf960f739eb6bf815e33f915dd2e1e08f987a" Apr 16 22:25:53.424397 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:53.424366 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0952d3e47b52bf115d7bb6107eccf960f739eb6bf815e33f915dd2e1e08f987a"} err="failed to get container status \"0952d3e47b52bf115d7bb6107eccf960f739eb6bf815e33f915dd2e1e08f987a\": rpc error: code = NotFound desc = could not find container \"0952d3e47b52bf115d7bb6107eccf960f739eb6bf815e33f915dd2e1e08f987a\": container with ID starting with 0952d3e47b52bf115d7bb6107eccf960f739eb6bf815e33f915dd2e1e08f987a not found: ID does not exist" Apr 16 22:25:55.018169 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:55.018137 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76b913f1-1bb0-4011-ace6-1dd747747614" path="/var/lib/kubelet/pods/76b913f1-1bb0-4011-ace6-1dd747747614/volumes" Apr 16 22:25:55.348637 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:25:55.348549 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-7689c79995-cdt2f" podUID="247e04b5-97a7-4061-85b6-1afc7cd1a18a" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:26:00.347842 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:00.347801 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-7689c79995-cdt2f" podUID="247e04b5-97a7-4061-85b6-1afc7cd1a18a" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:26:00.348290 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:00.347948 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-7689c79995-cdt2f" Apr 16 22:26:05.348608 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:05.348565 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-7689c79995-cdt2f" podUID="247e04b5-97a7-4061-85b6-1afc7cd1a18a" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:26:10.348509 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:10.348473 2565 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/model-chainer-7689c79995-cdt2f" podUID="247e04b5-97a7-4061-85b6-1afc7cd1a18a" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:26:12.284228 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:12.284189 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-73a06-86f764f5f7-2d27q"] Apr 16 22:26:12.284749 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:12.284733 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6ac66a6a-9486-4437-af45-90124dd85e73" containerName="switch-graph-37a6e" Apr 16 22:26:12.284792 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:12.284753 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ac66a6a-9486-4437-af45-90124dd85e73" containerName="switch-graph-37a6e" Apr 16 22:26:12.284792 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:12.284771 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="76b913f1-1bb0-4011-ace6-1dd747747614" containerName="storage-initializer" Apr 16 22:26:12.284792 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:12.284780 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="76b913f1-1bb0-4011-ace6-1dd747747614" containerName="storage-initializer" Apr 16 22:26:12.284792 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:12.284790 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="76b913f1-1bb0-4011-ace6-1dd747747614" containerName="kserve-container" Apr 16 22:26:12.284914 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:12.284799 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="76b913f1-1bb0-4011-ace6-1dd747747614" containerName="kserve-container" Apr 16 22:26:12.284914 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:12.284808 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="76b913f1-1bb0-4011-ace6-1dd747747614" containerName="kube-rbac-proxy" Apr 16 22:26:12.284914 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:12.284816 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="76b913f1-1bb0-4011-ace6-1dd747747614" containerName="kube-rbac-proxy" Apr 16 22:26:12.284914 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:12.284898 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="76b913f1-1bb0-4011-ace6-1dd747747614" containerName="kserve-container" Apr 16 22:26:12.285026 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:12.284913 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="6ac66a6a-9486-4437-af45-90124dd85e73" containerName="switch-graph-37a6e" Apr 16 22:26:12.285026 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:12.284925 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="76b913f1-1bb0-4011-ace6-1dd747747614" containerName="kube-rbac-proxy" Apr 16 22:26:12.288243 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:12.288226 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-73a06-86f764f5f7-2d27q" Apr 16 22:26:12.290591 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:12.290570 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-73a06-kube-rbac-proxy-sar-config\"" Apr 16 22:26:12.290725 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:12.290571 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-73a06-serving-cert\"" Apr 16 22:26:12.294500 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:12.294465 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-73a06-86f764f5f7-2d27q"] Apr 16 22:26:12.349822 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:12.349790 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7affbed-79cd-4589-8ac5-9f1f2e4614c7-openshift-service-ca-bundle\") pod \"switch-graph-73a06-86f764f5f7-2d27q\" (UID: \"a7affbed-79cd-4589-8ac5-9f1f2e4614c7\") " pod="kserve-ci-e2e-test/switch-graph-73a06-86f764f5f7-2d27q" Apr 16 22:26:12.349989 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:12.349841 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7affbed-79cd-4589-8ac5-9f1f2e4614c7-proxy-tls\") pod \"switch-graph-73a06-86f764f5f7-2d27q\" (UID: \"a7affbed-79cd-4589-8ac5-9f1f2e4614c7\") " pod="kserve-ci-e2e-test/switch-graph-73a06-86f764f5f7-2d27q" Apr 16 22:26:12.450701 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:12.450669 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7affbed-79cd-4589-8ac5-9f1f2e4614c7-openshift-service-ca-bundle\") pod \"switch-graph-73a06-86f764f5f7-2d27q\" (UID: \"a7affbed-79cd-4589-8ac5-9f1f2e4614c7\") " pod="kserve-ci-e2e-test/switch-graph-73a06-86f764f5f7-2d27q" Apr 16 22:26:12.450856 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:12.450717 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7affbed-79cd-4589-8ac5-9f1f2e4614c7-proxy-tls\") pod \"switch-graph-73a06-86f764f5f7-2d27q\" (UID: \"a7affbed-79cd-4589-8ac5-9f1f2e4614c7\") " pod="kserve-ci-e2e-test/switch-graph-73a06-86f764f5f7-2d27q" Apr 16 22:26:12.450856 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:26:12.450837 2565 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-73a06-serving-cert: secret "switch-graph-73a06-serving-cert" not found Apr 16 22:26:12.450925 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:26:12.450890 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7affbed-79cd-4589-8ac5-9f1f2e4614c7-proxy-tls podName:a7affbed-79cd-4589-8ac5-9f1f2e4614c7 nodeName:}" failed. No retries permitted until 2026-04-16 22:26:12.950874688 +0000 UTC m=+746.496616093 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/a7affbed-79cd-4589-8ac5-9f1f2e4614c7-proxy-tls") pod "switch-graph-73a06-86f764f5f7-2d27q" (UID: "a7affbed-79cd-4589-8ac5-9f1f2e4614c7") : secret "switch-graph-73a06-serving-cert" not found Apr 16 22:26:12.451306 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:12.451287 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7affbed-79cd-4589-8ac5-9f1f2e4614c7-openshift-service-ca-bundle\") pod \"switch-graph-73a06-86f764f5f7-2d27q\" (UID: \"a7affbed-79cd-4589-8ac5-9f1f2e4614c7\") " pod="kserve-ci-e2e-test/switch-graph-73a06-86f764f5f7-2d27q" Apr 16 22:26:12.954721 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:12.954687 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7affbed-79cd-4589-8ac5-9f1f2e4614c7-proxy-tls\") pod \"switch-graph-73a06-86f764f5f7-2d27q\" (UID: \"a7affbed-79cd-4589-8ac5-9f1f2e4614c7\") " pod="kserve-ci-e2e-test/switch-graph-73a06-86f764f5f7-2d27q" Apr 16 22:26:12.957008 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:12.956988 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7affbed-79cd-4589-8ac5-9f1f2e4614c7-proxy-tls\") pod \"switch-graph-73a06-86f764f5f7-2d27q\" (UID: \"a7affbed-79cd-4589-8ac5-9f1f2e4614c7\") " pod="kserve-ci-e2e-test/switch-graph-73a06-86f764f5f7-2d27q" Apr 16 22:26:13.200049 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:13.200010 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-73a06-86f764f5f7-2d27q" Apr 16 22:26:13.318673 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:13.318641 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-73a06-86f764f5f7-2d27q"] Apr 16 22:26:13.322156 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:26:13.322129 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7affbed_79cd_4589_8ac5_9f1f2e4614c7.slice/crio-0690e4d0874fd128afe7107903fd5097e80d83b4888f4c5033547b874202bd56 WatchSource:0}: Error finding container 0690e4d0874fd128afe7107903fd5097e80d83b4888f4c5033547b874202bd56: Status 404 returned error can't find the container with id 0690e4d0874fd128afe7107903fd5097e80d83b4888f4c5033547b874202bd56 Apr 16 22:26:13.460870 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:13.460772 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-73a06-86f764f5f7-2d27q" event={"ID":"a7affbed-79cd-4589-8ac5-9f1f2e4614c7","Type":"ContainerStarted","Data":"8b6e35ca43dc1b341d2b586299e687693881dfbbfd0ee55266c875654ea2e77d"} Apr 16 22:26:13.460870 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:13.460810 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-73a06-86f764f5f7-2d27q" event={"ID":"a7affbed-79cd-4589-8ac5-9f1f2e4614c7","Type":"ContainerStarted","Data":"0690e4d0874fd128afe7107903fd5097e80d83b4888f4c5033547b874202bd56"} Apr 16 22:26:13.461096 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:13.460908 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-73a06-86f764f5f7-2d27q" Apr 16 22:26:13.477209 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:13.477165 2565 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-73a06-86f764f5f7-2d27q" podStartSLOduration=1.477151724 podStartE2EDuration="1.477151724s" podCreationTimestamp="2026-04-16 22:26:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:26:13.475619001 +0000 UTC m=+747.021360428" watchObservedRunningTime="2026-04-16 22:26:13.477151724 +0000 UTC m=+747.022893150" Apr 16 22:26:15.347999 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:15.347962 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-7689c79995-cdt2f" podUID="247e04b5-97a7-4061-85b6-1afc7cd1a18a" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:26:18.118188 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:18.118165 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-7689c79995-cdt2f" Apr 16 22:26:18.198951 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:18.198913 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/247e04b5-97a7-4061-85b6-1afc7cd1a18a-openshift-service-ca-bundle\") pod \"247e04b5-97a7-4061-85b6-1afc7cd1a18a\" (UID: \"247e04b5-97a7-4061-85b6-1afc7cd1a18a\") " Apr 16 22:26:18.199145 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:18.199057 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/247e04b5-97a7-4061-85b6-1afc7cd1a18a-proxy-tls\") pod \"247e04b5-97a7-4061-85b6-1afc7cd1a18a\" (UID: \"247e04b5-97a7-4061-85b6-1afc7cd1a18a\") " Apr 16 22:26:18.199328 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:18.199300 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/247e04b5-97a7-4061-85b6-1afc7cd1a18a-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "247e04b5-97a7-4061-85b6-1afc7cd1a18a" (UID: "247e04b5-97a7-4061-85b6-1afc7cd1a18a"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:26:18.201122 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:18.201056 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/247e04b5-97a7-4061-85b6-1afc7cd1a18a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "247e04b5-97a7-4061-85b6-1afc7cd1a18a" (UID: "247e04b5-97a7-4061-85b6-1afc7cd1a18a"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:26:18.299645 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:18.299614 2565 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/247e04b5-97a7-4061-85b6-1afc7cd1a18a-openshift-service-ca-bundle\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:26:18.299645 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:18.299644 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/247e04b5-97a7-4061-85b6-1afc7cd1a18a-proxy-tls\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:26:18.478604 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:18.478515 2565 generic.go:358] "Generic (PLEG): container finished" podID="247e04b5-97a7-4061-85b6-1afc7cd1a18a" containerID="89c28e148e80eb6c4b98595979ebd93d15099cc428dcbf655c11cfe5d21ef80f" exitCode=137 Apr 16 22:26:18.478604 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:18.478598 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-7689c79995-cdt2f" Apr 16 22:26:18.478813 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:18.478604 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-7689c79995-cdt2f" event={"ID":"247e04b5-97a7-4061-85b6-1afc7cd1a18a","Type":"ContainerDied","Data":"89c28e148e80eb6c4b98595979ebd93d15099cc428dcbf655c11cfe5d21ef80f"} Apr 16 22:26:18.478813 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:18.478641 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-7689c79995-cdt2f" event={"ID":"247e04b5-97a7-4061-85b6-1afc7cd1a18a","Type":"ContainerDied","Data":"1d88598b8b9d4529bd32a7af16cf116aa7fd515f8666d26684d34d64c8d5ab3a"} Apr 16 22:26:18.478813 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:18.478658 2565 scope.go:117] "RemoveContainer" containerID="89c28e148e80eb6c4b98595979ebd93d15099cc428dcbf655c11cfe5d21ef80f" Apr 16 22:26:18.487097 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:18.487079 2565 scope.go:117] "RemoveContainer" containerID="89c28e148e80eb6c4b98595979ebd93d15099cc428dcbf655c11cfe5d21ef80f" Apr 16 22:26:18.487361 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:26:18.487342 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89c28e148e80eb6c4b98595979ebd93d15099cc428dcbf655c11cfe5d21ef80f\": container with ID starting with 89c28e148e80eb6c4b98595979ebd93d15099cc428dcbf655c11cfe5d21ef80f not found: ID does not exist" containerID="89c28e148e80eb6c4b98595979ebd93d15099cc428dcbf655c11cfe5d21ef80f" Apr 16 22:26:18.487461 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:18.487369 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89c28e148e80eb6c4b98595979ebd93d15099cc428dcbf655c11cfe5d21ef80f"} err="failed to get container status \"89c28e148e80eb6c4b98595979ebd93d15099cc428dcbf655c11cfe5d21ef80f\": rpc error: code = NotFound desc = could not find container \"89c28e148e80eb6c4b98595979ebd93d15099cc428dcbf655c11cfe5d21ef80f\": container with ID starting with 89c28e148e80eb6c4b98595979ebd93d15099cc428dcbf655c11cfe5d21ef80f not found: ID does not exist" Apr 16 22:26:18.513636 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:18.513608 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/model-chainer-7689c79995-cdt2f"] Apr 16 22:26:18.517179 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:18.517158 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-7689c79995-cdt2f"] Apr 16 22:26:19.018855 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:19.018816 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="247e04b5-97a7-4061-85b6-1afc7cd1a18a" path="/var/lib/kubelet/pods/247e04b5-97a7-4061-85b6-1afc7cd1a18a/volumes" Apr 16 22:26:19.470614 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:19.470590 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-73a06-86f764f5f7-2d27q" Apr 16 22:26:48.126327 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:48.126291 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-d7c57-59b4ff5bcf-29w6w"] Apr 16 22:26:48.126859 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:48.126653 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="247e04b5-97a7-4061-85b6-1afc7cd1a18a" containerName="model-chainer" Apr 16 22:26:48.126859 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:48.126665 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="247e04b5-97a7-4061-85b6-1afc7cd1a18a" containerName="model-chainer" Apr 16 22:26:48.126859 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:48.126733 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="247e04b5-97a7-4061-85b6-1afc7cd1a18a" containerName="model-chainer" Apr 16 22:26:48.131985 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:48.131966 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-d7c57-59b4ff5bcf-29w6w" Apr 16 22:26:48.134396 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:48.134372 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-d7c57-serving-cert\"" Apr 16 22:26:48.134510 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:48.134392 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-d7c57-kube-rbac-proxy-sar-config\"" Apr 16 22:26:48.138692 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:48.138667 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-d7c57-59b4ff5bcf-29w6w"] Apr 16 22:26:48.142940 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:48.142919 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7d17826-40ba-46c2-8055-f67fd9c02482-openshift-service-ca-bundle\") pod \"sequence-graph-d7c57-59b4ff5bcf-29w6w\" (UID: \"b7d17826-40ba-46c2-8055-f67fd9c02482\") " pod="kserve-ci-e2e-test/sequence-graph-d7c57-59b4ff5bcf-29w6w" Apr 16 22:26:48.143059 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:48.142966 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b7d17826-40ba-46c2-8055-f67fd9c02482-proxy-tls\") pod \"sequence-graph-d7c57-59b4ff5bcf-29w6w\" (UID: \"b7d17826-40ba-46c2-8055-f67fd9c02482\") " pod="kserve-ci-e2e-test/sequence-graph-d7c57-59b4ff5bcf-29w6w" Apr 16 22:26:48.243899 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:48.243865 2565 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7d17826-40ba-46c2-8055-f67fd9c02482-openshift-service-ca-bundle\") pod \"sequence-graph-d7c57-59b4ff5bcf-29w6w\" (UID: \"b7d17826-40ba-46c2-8055-f67fd9c02482\") " pod="kserve-ci-e2e-test/sequence-graph-d7c57-59b4ff5bcf-29w6w" Apr 16 22:26:48.244097 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:48.243920 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b7d17826-40ba-46c2-8055-f67fd9c02482-proxy-tls\") pod \"sequence-graph-d7c57-59b4ff5bcf-29w6w\" (UID: \"b7d17826-40ba-46c2-8055-f67fd9c02482\") " pod="kserve-ci-e2e-test/sequence-graph-d7c57-59b4ff5bcf-29w6w" Apr 16 22:26:48.244097 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:26:48.244021 2565 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-d7c57-serving-cert: secret "sequence-graph-d7c57-serving-cert" not found Apr 16 22:26:48.244097 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:26:48.244093 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7d17826-40ba-46c2-8055-f67fd9c02482-proxy-tls podName:b7d17826-40ba-46c2-8055-f67fd9c02482 nodeName:}" failed. No retries permitted until 2026-04-16 22:26:48.744074721 +0000 UTC m=+782.289816127 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/b7d17826-40ba-46c2-8055-f67fd9c02482-proxy-tls") pod "sequence-graph-d7c57-59b4ff5bcf-29w6w" (UID: "b7d17826-40ba-46c2-8055-f67fd9c02482") : secret "sequence-graph-d7c57-serving-cert" not found Apr 16 22:26:48.244596 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:48.244573 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7d17826-40ba-46c2-8055-f67fd9c02482-openshift-service-ca-bundle\") pod \"sequence-graph-d7c57-59b4ff5bcf-29w6w\" (UID: \"b7d17826-40ba-46c2-8055-f67fd9c02482\") " pod="kserve-ci-e2e-test/sequence-graph-d7c57-59b4ff5bcf-29w6w" Apr 16 22:26:48.748341 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:48.748292 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b7d17826-40ba-46c2-8055-f67fd9c02482-proxy-tls\") pod \"sequence-graph-d7c57-59b4ff5bcf-29w6w\" (UID: \"b7d17826-40ba-46c2-8055-f67fd9c02482\") " pod="kserve-ci-e2e-test/sequence-graph-d7c57-59b4ff5bcf-29w6w" Apr 16 22:26:48.750765 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:48.750735 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b7d17826-40ba-46c2-8055-f67fd9c02482-proxy-tls\") pod \"sequence-graph-d7c57-59b4ff5bcf-29w6w\" (UID: \"b7d17826-40ba-46c2-8055-f67fd9c02482\") " pod="kserve-ci-e2e-test/sequence-graph-d7c57-59b4ff5bcf-29w6w" Apr 16 22:26:49.044201 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:49.044125 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-d7c57-59b4ff5bcf-29w6w" Apr 16 22:26:49.162030 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:49.161998 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-d7c57-59b4ff5bcf-29w6w"] Apr 16 22:26:49.164647 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:26:49.164621 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7d17826_40ba_46c2_8055_f67fd9c02482.slice/crio-c89185a6e5bed069abb3c74297f2d7fdb00f1db1c13d48827e8fedcb704a5b61 WatchSource:0}: Error finding container c89185a6e5bed069abb3c74297f2d7fdb00f1db1c13d48827e8fedcb704a5b61: Status 404 returned error can't find the container with id c89185a6e5bed069abb3c74297f2d7fdb00f1db1c13d48827e8fedcb704a5b61 Apr 16 22:26:49.580662 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:49.580624 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-d7c57-59b4ff5bcf-29w6w" event={"ID":"b7d17826-40ba-46c2-8055-f67fd9c02482","Type":"ContainerStarted","Data":"a07b89cc57778b76e091e914db9e9076269194985b258d1be703de7797c24443"} Apr 16 22:26:49.580662 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:49.580659 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-d7c57-59b4ff5bcf-29w6w" event={"ID":"b7d17826-40ba-46c2-8055-f67fd9c02482","Type":"ContainerStarted","Data":"c89185a6e5bed069abb3c74297f2d7fdb00f1db1c13d48827e8fedcb704a5b61"} Apr 16 22:26:49.580930 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:49.580736 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-d7c57-59b4ff5bcf-29w6w" Apr 16 22:26:49.596698 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:49.596657 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-d7c57-59b4ff5bcf-29w6w" podStartSLOduration=1.596642592 podStartE2EDuration="1.596642592s" podCreationTimestamp="2026-04-16 22:26:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:26:49.594983065 +0000 UTC m=+783.140724497" watchObservedRunningTime="2026-04-16 22:26:49.596642592 +0000 UTC m=+783.142384032" Apr 16 22:26:55.590262 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:26:55.590228 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-d7c57-59b4ff5bcf-29w6w" Apr 16 22:34:26.787656 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:34:26.787622 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-73a06-86f764f5f7-2d27q"] Apr 16 22:34:26.790087 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:34:26.787892 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-73a06-86f764f5f7-2d27q" podUID="a7affbed-79cd-4589-8ac5-9f1f2e4614c7" containerName="switch-graph-73a06" containerID="cri-o://8b6e35ca43dc1b341d2b586299e687693881dfbbfd0ee55266c875654ea2e77d" gracePeriod=30 Apr 16 22:34:29.469295 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:34:29.469249 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-73a06-86f764f5f7-2d27q" podUID="a7affbed-79cd-4589-8ac5-9f1f2e4614c7" containerName="switch-graph-73a06" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 
16 22:34:34.468731 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:34:34.468643 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-73a06-86f764f5f7-2d27q" podUID="a7affbed-79cd-4589-8ac5-9f1f2e4614c7" containerName="switch-graph-73a06" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:34:39.468656 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:34:39.468615 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-73a06-86f764f5f7-2d27q" podUID="a7affbed-79cd-4589-8ac5-9f1f2e4614c7" containerName="switch-graph-73a06" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:34:39.469023 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:34:39.468733 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-73a06-86f764f5f7-2d27q" Apr 16 22:34:44.469788 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:34:44.469743 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-73a06-86f764f5f7-2d27q" podUID="a7affbed-79cd-4589-8ac5-9f1f2e4614c7" containerName="switch-graph-73a06" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:34:49.469196 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:34:49.469151 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-73a06-86f764f5f7-2d27q" podUID="a7affbed-79cd-4589-8ac5-9f1f2e4614c7" containerName="switch-graph-73a06" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:34:54.469110 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:34:54.469073 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-73a06-86f764f5f7-2d27q" podUID="a7affbed-79cd-4589-8ac5-9f1f2e4614c7" containerName="switch-graph-73a06" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:34:56.937300 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:34:56.937275 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-73a06-86f764f5f7-2d27q" Apr 16 22:34:57.068616 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:34:57.068527 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7affbed-79cd-4589-8ac5-9f1f2e4614c7-openshift-service-ca-bundle\") pod \"a7affbed-79cd-4589-8ac5-9f1f2e4614c7\" (UID: \"a7affbed-79cd-4589-8ac5-9f1f2e4614c7\") " Apr 16 22:34:57.068616 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:34:57.068582 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7affbed-79cd-4589-8ac5-9f1f2e4614c7-proxy-tls\") pod \"a7affbed-79cd-4589-8ac5-9f1f2e4614c7\" (UID: \"a7affbed-79cd-4589-8ac5-9f1f2e4614c7\") " Apr 16 22:34:57.068947 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:34:57.068924 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7affbed-79cd-4589-8ac5-9f1f2e4614c7-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "a7affbed-79cd-4589-8ac5-9f1f2e4614c7" (UID: "a7affbed-79cd-4589-8ac5-9f1f2e4614c7"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:34:57.070598 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:34:57.070576 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7affbed-79cd-4589-8ac5-9f1f2e4614c7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a7affbed-79cd-4589-8ac5-9f1f2e4614c7" (UID: "a7affbed-79cd-4589-8ac5-9f1f2e4614c7"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:34:57.169590 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:34:57.169549 2565 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7affbed-79cd-4589-8ac5-9f1f2e4614c7-openshift-service-ca-bundle\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:34:57.169590 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:34:57.169581 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7affbed-79cd-4589-8ac5-9f1f2e4614c7-proxy-tls\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:34:57.234555 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:34:57.234523 2565 generic.go:358] "Generic (PLEG): container finished" podID="a7affbed-79cd-4589-8ac5-9f1f2e4614c7" containerID="8b6e35ca43dc1b341d2b586299e687693881dfbbfd0ee55266c875654ea2e77d" exitCode=0 Apr 16 22:34:57.234721 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:34:57.234588 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-73a06-86f764f5f7-2d27q" event={"ID":"a7affbed-79cd-4589-8ac5-9f1f2e4614c7","Type":"ContainerDied","Data":"8b6e35ca43dc1b341d2b586299e687693881dfbbfd0ee55266c875654ea2e77d"} Apr 16 22:34:57.234721 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:34:57.234613 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-73a06-86f764f5f7-2d27q" event={"ID":"a7affbed-79cd-4589-8ac5-9f1f2e4614c7","Type":"ContainerDied","Data":"0690e4d0874fd128afe7107903fd5097e80d83b4888f4c5033547b874202bd56"} Apr 16 22:34:57.234721 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:34:57.234610 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-73a06-86f764f5f7-2d27q" Apr 16 22:34:57.234721 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:34:57.234627 2565 scope.go:117] "RemoveContainer" containerID="8b6e35ca43dc1b341d2b586299e687693881dfbbfd0ee55266c875654ea2e77d" Apr 16 22:34:57.244810 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:34:57.244793 2565 scope.go:117] "RemoveContainer" containerID="8b6e35ca43dc1b341d2b586299e687693881dfbbfd0ee55266c875654ea2e77d" Apr 16 22:34:57.245046 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:34:57.245024 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b6e35ca43dc1b341d2b586299e687693881dfbbfd0ee55266c875654ea2e77d\": container with ID starting with 8b6e35ca43dc1b341d2b586299e687693881dfbbfd0ee55266c875654ea2e77d not found: ID does not exist" containerID="8b6e35ca43dc1b341d2b586299e687693881dfbbfd0ee55266c875654ea2e77d" Apr 16 22:34:57.245090 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:34:57.245071 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b6e35ca43dc1b341d2b586299e687693881dfbbfd0ee55266c875654ea2e77d"} err="failed to get container status \"8b6e35ca43dc1b341d2b586299e687693881dfbbfd0ee55266c875654ea2e77d\": rpc error: code = NotFound desc = could not find container \"8b6e35ca43dc1b341d2b586299e687693881dfbbfd0ee55266c875654ea2e77d\": container with ID starting with 8b6e35ca43dc1b341d2b586299e687693881dfbbfd0ee55266c875654ea2e77d not found: ID does not exist" Apr 16 22:34:57.255741 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:34:57.255715 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-73a06-86f764f5f7-2d27q"] Apr 16 22:34:57.258573 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:34:57.258551 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-73a06-86f764f5f7-2d27q"] Apr 16 22:34:59.018641 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:34:59.018604 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7affbed-79cd-4589-8ac5-9f1f2e4614c7" path="/var/lib/kubelet/pods/a7affbed-79cd-4589-8ac5-9f1f2e4614c7/volumes" Apr 16 22:35:02.707238 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:02.707207 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-d7c57-59b4ff5bcf-29w6w"] Apr 16 22:35:02.707703 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:02.707473 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-d7c57-59b4ff5bcf-29w6w" podUID="b7d17826-40ba-46c2-8055-f67fd9c02482" containerName="sequence-graph-d7c57" containerID="cri-o://a07b89cc57778b76e091e914db9e9076269194985b258d1be703de7797c24443" gracePeriod=30 Apr 16 22:35:05.588626 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:05.588585 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-d7c57-59b4ff5bcf-29w6w" podUID="b7d17826-40ba-46c2-8055-f67fd9c02482" containerName="sequence-graph-d7c57" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:35:10.588118 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:10.588068 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-d7c57-59b4ff5bcf-29w6w" podUID="b7d17826-40ba-46c2-8055-f67fd9c02482" containerName="sequence-graph-d7c57" probeResult="failure" output="HTTP probe failed with 
statuscode: 503" Apr 16 22:35:15.589041 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:15.588999 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-d7c57-59b4ff5bcf-29w6w" podUID="b7d17826-40ba-46c2-8055-f67fd9c02482" containerName="sequence-graph-d7c57" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:35:15.589464 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:15.589115 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-d7c57-59b4ff5bcf-29w6w" Apr 16 22:35:20.588340 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:20.588286 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-d7c57-59b4ff5bcf-29w6w" podUID="b7d17826-40ba-46c2-8055-f67fd9c02482" containerName="sequence-graph-d7c57" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:35:25.588095 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:25.588054 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-d7c57-59b4ff5bcf-29w6w" podUID="b7d17826-40ba-46c2-8055-f67fd9c02482" containerName="sequence-graph-d7c57" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:35:30.588341 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:30.588298 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-d7c57-59b4ff5bcf-29w6w" podUID="b7d17826-40ba-46c2-8055-f67fd9c02482" containerName="sequence-graph-d7c57" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:35:32.849448 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:32.849419 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-d7c57-59b4ff5bcf-29w6w" Apr 16 22:35:32.977174 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:32.977099 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b7d17826-40ba-46c2-8055-f67fd9c02482-proxy-tls\") pod \"b7d17826-40ba-46c2-8055-f67fd9c02482\" (UID: \"b7d17826-40ba-46c2-8055-f67fd9c02482\") " Apr 16 22:35:32.977174 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:32.977146 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7d17826-40ba-46c2-8055-f67fd9c02482-openshift-service-ca-bundle\") pod \"b7d17826-40ba-46c2-8055-f67fd9c02482\" (UID: \"b7d17826-40ba-46c2-8055-f67fd9c02482\") " Apr 16 22:35:32.977540 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:32.977516 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7d17826-40ba-46c2-8055-f67fd9c02482-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "b7d17826-40ba-46c2-8055-f67fd9c02482" (UID: "b7d17826-40ba-46c2-8055-f67fd9c02482"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:35:32.979174 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:32.979152 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7d17826-40ba-46c2-8055-f67fd9c02482-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b7d17826-40ba-46c2-8055-f67fd9c02482" (UID: "b7d17826-40ba-46c2-8055-f67fd9c02482"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:35:33.078725 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:33.078682 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b7d17826-40ba-46c2-8055-f67fd9c02482-proxy-tls\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:35:33.078725 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:33.078709 2565 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7d17826-40ba-46c2-8055-f67fd9c02482-openshift-service-ca-bundle\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:35:33.356467 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:33.356362 2565 generic.go:358] "Generic (PLEG): container finished" podID="b7d17826-40ba-46c2-8055-f67fd9c02482" containerID="a07b89cc57778b76e091e914db9e9076269194985b258d1be703de7797c24443" exitCode=0 Apr 16 22:35:33.356467 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:33.356424 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-d7c57-59b4ff5bcf-29w6w" event={"ID":"b7d17826-40ba-46c2-8055-f67fd9c02482","Type":"ContainerDied","Data":"a07b89cc57778b76e091e914db9e9076269194985b258d1be703de7797c24443"} Apr 16 22:35:33.356467 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:33.356449 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-d7c57-59b4ff5bcf-29w6w" event={"ID":"b7d17826-40ba-46c2-8055-f67fd9c02482","Type":"ContainerDied","Data":"c89185a6e5bed069abb3c74297f2d7fdb00f1db1c13d48827e8fedcb704a5b61"} Apr 16 22:35:33.356467 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:33.356452 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-d7c57-59b4ff5bcf-29w6w" Apr 16 22:35:33.356467 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:33.356467 2565 scope.go:117] "RemoveContainer" containerID="a07b89cc57778b76e091e914db9e9076269194985b258d1be703de7797c24443" Apr 16 22:35:33.364781 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:33.364762 2565 scope.go:117] "RemoveContainer" containerID="a07b89cc57778b76e091e914db9e9076269194985b258d1be703de7797c24443" Apr 16 22:35:33.365050 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:35:33.365028 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a07b89cc57778b76e091e914db9e9076269194985b258d1be703de7797c24443\": container with ID starting with a07b89cc57778b76e091e914db9e9076269194985b258d1be703de7797c24443 not found: ID does not exist" containerID="a07b89cc57778b76e091e914db9e9076269194985b258d1be703de7797c24443" Apr 16 22:35:33.365115 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:33.365064 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a07b89cc57778b76e091e914db9e9076269194985b258d1be703de7797c24443"} err="failed to get container status \"a07b89cc57778b76e091e914db9e9076269194985b258d1be703de7797c24443\": rpc error: code = NotFound desc = could not find container \"a07b89cc57778b76e091e914db9e9076269194985b258d1be703de7797c24443\": container with ID starting with a07b89cc57778b76e091e914db9e9076269194985b258d1be703de7797c24443 not found: ID does not exist" Apr 16 22:35:33.370814 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:33.370791 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-d7c57-59b4ff5bcf-29w6w"] Apr 16 22:35:33.374571 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:33.374547 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-d7c57-59b4ff5bcf-29w6w"] Apr 16 22:35:35.018691 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:35.018659 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7d17826-40ba-46c2-8055-f67fd9c02482" path="/var/lib/kubelet/pods/b7d17826-40ba-46c2-8055-f67fd9c02482/volumes" Apr 16 22:35:37.019135 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:37.019053 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-1148f-6b77456c7-qktwk"] Apr 16 22:35:37.019504 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:37.019371 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b7d17826-40ba-46c2-8055-f67fd9c02482" containerName="sequence-graph-d7c57" Apr 16 22:35:37.019504 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:37.019382 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7d17826-40ba-46c2-8055-f67fd9c02482" containerName="sequence-graph-d7c57" Apr 16 22:35:37.019504 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:37.019404 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7affbed-79cd-4589-8ac5-9f1f2e4614c7" containerName="switch-graph-73a06" Apr 16 22:35:37.019504 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:37.019429 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7affbed-79cd-4589-8ac5-9f1f2e4614c7" containerName="switch-graph-73a06" Apr 16 22:35:37.019504 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:37.019488 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="a7affbed-79cd-4589-8ac5-9f1f2e4614c7" 
containerName="switch-graph-73a06" Apr 16 22:35:37.019504 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:37.019500 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="b7d17826-40ba-46c2-8055-f67fd9c02482" containerName="sequence-graph-d7c57" Apr 16 22:35:37.023746 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:37.023726 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-1148f-6b77456c7-qktwk" Apr 16 22:35:37.025975 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:37.025941 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-1148f-kube-rbac-proxy-sar-config\"" Apr 16 22:35:37.026779 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:37.026750 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-1148f-serving-cert\"" Apr 16 22:35:37.026888 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:37.026804 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 22:35:37.027044 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:37.027025 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4hqzk\"" Apr 16 22:35:37.028318 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:37.028252 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-1148f-6b77456c7-qktwk"] Apr 16 22:35:37.111500 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:37.111461 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2737eaab-e5eb-401a-83a3-5e484c72a4b7-proxy-tls\") pod \"ensemble-graph-1148f-6b77456c7-qktwk\" (UID: \"2737eaab-e5eb-401a-83a3-5e484c72a4b7\") " pod="kserve-ci-e2e-test/ensemble-graph-1148f-6b77456c7-qktwk" Apr 16 22:35:37.111677 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:37.111568 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2737eaab-e5eb-401a-83a3-5e484c72a4b7-openshift-service-ca-bundle\") pod \"ensemble-graph-1148f-6b77456c7-qktwk\" (UID: \"2737eaab-e5eb-401a-83a3-5e484c72a4b7\") " pod="kserve-ci-e2e-test/ensemble-graph-1148f-6b77456c7-qktwk" Apr 16 22:35:37.212023 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:37.211985 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2737eaab-e5eb-401a-83a3-5e484c72a4b7-openshift-service-ca-bundle\") pod \"ensemble-graph-1148f-6b77456c7-qktwk\" (UID: \"2737eaab-e5eb-401a-83a3-5e484c72a4b7\") " pod="kserve-ci-e2e-test/ensemble-graph-1148f-6b77456c7-qktwk" Apr 16 22:35:37.212218 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:37.212069 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2737eaab-e5eb-401a-83a3-5e484c72a4b7-proxy-tls\") pod \"ensemble-graph-1148f-6b77456c7-qktwk\" (UID: \"2737eaab-e5eb-401a-83a3-5e484c72a4b7\") " pod="kserve-ci-e2e-test/ensemble-graph-1148f-6b77456c7-qktwk" Apr 16 22:35:37.212218 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:35:37.212214 2565 secret.go:189] Couldn't get secret kserve-ci-e2e-test/ensemble-graph-1148f-serving-cert: secret 
"ensemble-graph-1148f-serving-cert" not found Apr 16 22:35:37.212316 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:35:37.212293 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2737eaab-e5eb-401a-83a3-5e484c72a4b7-proxy-tls podName:2737eaab-e5eb-401a-83a3-5e484c72a4b7 nodeName:}" failed. No retries permitted until 2026-04-16 22:35:37.712271589 +0000 UTC m=+1311.258013000 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/2737eaab-e5eb-401a-83a3-5e484c72a4b7-proxy-tls") pod "ensemble-graph-1148f-6b77456c7-qktwk" (UID: "2737eaab-e5eb-401a-83a3-5e484c72a4b7") : secret "ensemble-graph-1148f-serving-cert" not found Apr 16 22:35:37.212734 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:37.212705 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2737eaab-e5eb-401a-83a3-5e484c72a4b7-openshift-service-ca-bundle\") pod \"ensemble-graph-1148f-6b77456c7-qktwk\" (UID: \"2737eaab-e5eb-401a-83a3-5e484c72a4b7\") " pod="kserve-ci-e2e-test/ensemble-graph-1148f-6b77456c7-qktwk" Apr 16 22:35:37.717584 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:37.717538 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2737eaab-e5eb-401a-83a3-5e484c72a4b7-proxy-tls\") pod \"ensemble-graph-1148f-6b77456c7-qktwk\" (UID: \"2737eaab-e5eb-401a-83a3-5e484c72a4b7\") " pod="kserve-ci-e2e-test/ensemble-graph-1148f-6b77456c7-qktwk" Apr 16 22:35:37.719885 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:37.719852 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2737eaab-e5eb-401a-83a3-5e484c72a4b7-proxy-tls\") pod \"ensemble-graph-1148f-6b77456c7-qktwk\" (UID: \"2737eaab-e5eb-401a-83a3-5e484c72a4b7\") " pod="kserve-ci-e2e-test/ensemble-graph-1148f-6b77456c7-qktwk" Apr 16 22:35:37.934877 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:37.934842 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-1148f-6b77456c7-qktwk" Apr 16 22:35:38.060354 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:38.060328 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-1148f-6b77456c7-qktwk"] Apr 16 22:35:38.062784 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:35:38.062757 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2737eaab_e5eb_401a_83a3_5e484c72a4b7.slice/crio-165724913dba364b70f13b5db779d3de6cf081066397a1f620973e59480342a8 WatchSource:0}: Error finding container 165724913dba364b70f13b5db779d3de6cf081066397a1f620973e59480342a8: Status 404 returned error can't find the container with id 165724913dba364b70f13b5db779d3de6cf081066397a1f620973e59480342a8 Apr 16 22:35:38.064471 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:38.064453 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:35:38.374769 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:38.374682 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-1148f-6b77456c7-qktwk" event={"ID":"2737eaab-e5eb-401a-83a3-5e484c72a4b7","Type":"ContainerStarted","Data":"2064fd096ddfe5cca84851a74a4a4defcf1202fd94d5c33a117745b2114f55cb"} Apr 16 22:35:38.374769 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:38.374715 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-1148f-6b77456c7-qktwk" event={"ID":"2737eaab-e5eb-401a-83a3-5e484c72a4b7","Type":"ContainerStarted","Data":"165724913dba364b70f13b5db779d3de6cf081066397a1f620973e59480342a8"} Apr 16 22:35:38.374962 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:38.374821 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-1148f-6b77456c7-qktwk" Apr 16 22:35:38.392084 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:38.392026 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-1148f-6b77456c7-qktwk" podStartSLOduration=1.3920075619999999 podStartE2EDuration="1.392007562s" podCreationTimestamp="2026-04-16 22:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:35:38.390240915 +0000 UTC m=+1311.935982341" watchObservedRunningTime="2026-04-16 22:35:38.392007562 +0000 UTC m=+1311.937748990" Apr 16 22:35:44.383552 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:44.383521 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-1148f-6b77456c7-qktwk" Apr 16 22:35:47.080820 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:47.080787 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-1148f-6b77456c7-qktwk"] Apr 16 22:35:47.081148 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:47.081060 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-1148f-6b77456c7-qktwk" podUID="2737eaab-e5eb-401a-83a3-5e484c72a4b7" containerName="ensemble-graph-1148f" containerID="cri-o://2064fd096ddfe5cca84851a74a4a4defcf1202fd94d5c33a117745b2114f55cb" gracePeriod=30 Apr 16 22:35:49.382243 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:49.382202 2565 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/ensemble-graph-1148f-6b77456c7-qktwk" podUID="2737eaab-e5eb-401a-83a3-5e484c72a4b7" containerName="ensemble-graph-1148f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:35:54.381621 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:54.381581 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-1148f-6b77456c7-qktwk" podUID="2737eaab-e5eb-401a-83a3-5e484c72a4b7" containerName="ensemble-graph-1148f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:35:59.382060 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:59.382023 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-1148f-6b77456c7-qktwk" podUID="2737eaab-e5eb-401a-83a3-5e484c72a4b7" containerName="ensemble-graph-1148f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:35:59.382455 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:35:59.382148 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-1148f-6b77456c7-qktwk" Apr 16 22:36:02.873820 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:02.873789 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-cf933-55f8c8c9b7-kskdk"] Apr 16 22:36:02.878362 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:02.878340 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-cf933-55f8c8c9b7-kskdk" Apr 16 22:36:02.880542 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:02.880518 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-cf933-kube-rbac-proxy-sar-config\"" Apr 16 22:36:02.880673 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:02.880655 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-cf933-serving-cert\"" Apr 16 22:36:02.891737 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:02.891713 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-cf933-55f8c8c9b7-kskdk"] Apr 16 22:36:03.040224 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:03.040196 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/453f42e1-2887-4b91-8933-979038d46a47-openshift-service-ca-bundle\") pod \"sequence-graph-cf933-55f8c8c9b7-kskdk\" (UID: \"453f42e1-2887-4b91-8933-979038d46a47\") " pod="kserve-ci-e2e-test/sequence-graph-cf933-55f8c8c9b7-kskdk" Apr 16 22:36:03.040404 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:03.040254 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/453f42e1-2887-4b91-8933-979038d46a47-proxy-tls\") pod \"sequence-graph-cf933-55f8c8c9b7-kskdk\" (UID: \"453f42e1-2887-4b91-8933-979038d46a47\") " pod="kserve-ci-e2e-test/sequence-graph-cf933-55f8c8c9b7-kskdk" Apr 16 22:36:03.141617 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:03.141527 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/453f42e1-2887-4b91-8933-979038d46a47-openshift-service-ca-bundle\") pod \"sequence-graph-cf933-55f8c8c9b7-kskdk\" (UID: \"453f42e1-2887-4b91-8933-979038d46a47\") " 
pod="kserve-ci-e2e-test/sequence-graph-cf933-55f8c8c9b7-kskdk" Apr 16 22:36:03.141617 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:03.141614 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/453f42e1-2887-4b91-8933-979038d46a47-proxy-tls\") pod \"sequence-graph-cf933-55f8c8c9b7-kskdk\" (UID: \"453f42e1-2887-4b91-8933-979038d46a47\") " pod="kserve-ci-e2e-test/sequence-graph-cf933-55f8c8c9b7-kskdk" Apr 16 22:36:03.141841 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:36:03.141752 2565 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-cf933-serving-cert: secret "sequence-graph-cf933-serving-cert" not found Apr 16 22:36:03.141908 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:36:03.141840 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/453f42e1-2887-4b91-8933-979038d46a47-proxy-tls podName:453f42e1-2887-4b91-8933-979038d46a47 nodeName:}" failed. No retries permitted until 2026-04-16 22:36:03.641814799 +0000 UTC m=+1337.187556227 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/453f42e1-2887-4b91-8933-979038d46a47-proxy-tls") pod "sequence-graph-cf933-55f8c8c9b7-kskdk" (UID: "453f42e1-2887-4b91-8933-979038d46a47") : secret "sequence-graph-cf933-serving-cert" not found Apr 16 22:36:03.142274 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:03.142252 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/453f42e1-2887-4b91-8933-979038d46a47-openshift-service-ca-bundle\") pod \"sequence-graph-cf933-55f8c8c9b7-kskdk\" (UID: \"453f42e1-2887-4b91-8933-979038d46a47\") " pod="kserve-ci-e2e-test/sequence-graph-cf933-55f8c8c9b7-kskdk" Apr 16 22:36:03.645810 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:03.645768 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/453f42e1-2887-4b91-8933-979038d46a47-proxy-tls\") pod \"sequence-graph-cf933-55f8c8c9b7-kskdk\" (UID: \"453f42e1-2887-4b91-8933-979038d46a47\") " pod="kserve-ci-e2e-test/sequence-graph-cf933-55f8c8c9b7-kskdk" Apr 16 22:36:03.648187 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:03.648158 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/453f42e1-2887-4b91-8933-979038d46a47-proxy-tls\") pod \"sequence-graph-cf933-55f8c8c9b7-kskdk\" (UID: \"453f42e1-2887-4b91-8933-979038d46a47\") " pod="kserve-ci-e2e-test/sequence-graph-cf933-55f8c8c9b7-kskdk" Apr 16 22:36:03.788464 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:03.788427 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-cf933-55f8c8c9b7-kskdk" Apr 16 22:36:03.912117 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:03.912092 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-cf933-55f8c8c9b7-kskdk"] Apr 16 22:36:03.914214 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:36:03.914180 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod453f42e1_2887_4b91_8933_979038d46a47.slice/crio-d7db14143e983e4ff5495ec6118fbcc80849e22063a5a714667e73004886ca81 WatchSource:0}: Error finding container d7db14143e983e4ff5495ec6118fbcc80849e22063a5a714667e73004886ca81: Status 404 returned error can't find the container with id d7db14143e983e4ff5495ec6118fbcc80849e22063a5a714667e73004886ca81 Apr 16 22:36:04.381693 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:04.381656 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-1148f-6b77456c7-qktwk" podUID="2737eaab-e5eb-401a-83a3-5e484c72a4b7" containerName="ensemble-graph-1148f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:36:04.461249 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:04.461213 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-cf933-55f8c8c9b7-kskdk" event={"ID":"453f42e1-2887-4b91-8933-979038d46a47","Type":"ContainerStarted","Data":"9d8f26245e1e6018cbc372b7759a717e32b611a3b1f6ec9aeeb1c58e60cd49be"} Apr 16 22:36:04.461249 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:04.461254 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-cf933-55f8c8c9b7-kskdk" event={"ID":"453f42e1-2887-4b91-8933-979038d46a47","Type":"ContainerStarted","Data":"d7db14143e983e4ff5495ec6118fbcc80849e22063a5a714667e73004886ca81"} Apr 16 22:36:04.461484 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:04.461273 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-cf933-55f8c8c9b7-kskdk" Apr 16 22:36:04.476478 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:04.476432 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-cf933-55f8c8c9b7-kskdk" podStartSLOduration=2.476401974 podStartE2EDuration="2.476401974s" podCreationTimestamp="2026-04-16 22:36:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:36:04.475288002 +0000 UTC m=+1338.021029430" watchObservedRunningTime="2026-04-16 22:36:04.476401974 +0000 UTC m=+1338.022143400" Apr 16 22:36:09.382424 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:09.382370 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-1148f-6b77456c7-qktwk" podUID="2737eaab-e5eb-401a-83a3-5e484c72a4b7" containerName="ensemble-graph-1148f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:36:10.470537 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:10.470508 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-cf933-55f8c8c9b7-kskdk" Apr 16 22:36:12.933914 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:12.929808 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-cf933-55f8c8c9b7-kskdk"] Apr 16 22:36:12.933914 ip-10-0-141-169 
kubenswrapper[2565]: I0416 22:36:12.930154 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-cf933-55f8c8c9b7-kskdk" podUID="453f42e1-2887-4b91-8933-979038d46a47" containerName="sequence-graph-cf933" containerID="cri-o://9d8f26245e1e6018cbc372b7759a717e32b611a3b1f6ec9aeeb1c58e60cd49be" gracePeriod=30 Apr 16 22:36:14.382459 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:14.382400 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-1148f-6b77456c7-qktwk" podUID="2737eaab-e5eb-401a-83a3-5e484c72a4b7" containerName="ensemble-graph-1148f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:36:15.468992 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:15.468951 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-cf933-55f8c8c9b7-kskdk" podUID="453f42e1-2887-4b91-8933-979038d46a47" containerName="sequence-graph-cf933" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:36:17.245149 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:17.245125 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-1148f-6b77456c7-qktwk" Apr 16 22:36:17.356405 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:17.356321 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2737eaab-e5eb-401a-83a3-5e484c72a4b7-openshift-service-ca-bundle\") pod \"2737eaab-e5eb-401a-83a3-5e484c72a4b7\" (UID: \"2737eaab-e5eb-401a-83a3-5e484c72a4b7\") " Apr 16 22:36:17.356405 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:17.356390 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2737eaab-e5eb-401a-83a3-5e484c72a4b7-proxy-tls\") pod \"2737eaab-e5eb-401a-83a3-5e484c72a4b7\" (UID: \"2737eaab-e5eb-401a-83a3-5e484c72a4b7\") " Apr 16 22:36:17.356710 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:17.356682 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2737eaab-e5eb-401a-83a3-5e484c72a4b7-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "2737eaab-e5eb-401a-83a3-5e484c72a4b7" (UID: "2737eaab-e5eb-401a-83a3-5e484c72a4b7"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:36:17.358439 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:17.358390 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2737eaab-e5eb-401a-83a3-5e484c72a4b7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2737eaab-e5eb-401a-83a3-5e484c72a4b7" (UID: "2737eaab-e5eb-401a-83a3-5e484c72a4b7"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:36:17.457141 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:17.457110 2565 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2737eaab-e5eb-401a-83a3-5e484c72a4b7-openshift-service-ca-bundle\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:36:17.457141 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:17.457136 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2737eaab-e5eb-401a-83a3-5e484c72a4b7-proxy-tls\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:36:17.505320 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:17.505290 2565 generic.go:358] "Generic (PLEG): container finished" podID="2737eaab-e5eb-401a-83a3-5e484c72a4b7" containerID="2064fd096ddfe5cca84851a74a4a4defcf1202fd94d5c33a117745b2114f55cb" exitCode=0 Apr 16 22:36:17.505492 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:17.505352 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-1148f-6b77456c7-qktwk" Apr 16 22:36:17.505492 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:17.505378 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-1148f-6b77456c7-qktwk" event={"ID":"2737eaab-e5eb-401a-83a3-5e484c72a4b7","Type":"ContainerDied","Data":"2064fd096ddfe5cca84851a74a4a4defcf1202fd94d5c33a117745b2114f55cb"} Apr 16 22:36:17.505492 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:17.505438 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-1148f-6b77456c7-qktwk" event={"ID":"2737eaab-e5eb-401a-83a3-5e484c72a4b7","Type":"ContainerDied","Data":"165724913dba364b70f13b5db779d3de6cf081066397a1f620973e59480342a8"} Apr 16 22:36:17.505492 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:17.505455 2565 scope.go:117] "RemoveContainer" containerID="2064fd096ddfe5cca84851a74a4a4defcf1202fd94d5c33a117745b2114f55cb" Apr 16 22:36:17.516028 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:17.514366 2565 scope.go:117] "RemoveContainer" containerID="2064fd096ddfe5cca84851a74a4a4defcf1202fd94d5c33a117745b2114f55cb" Apr 16 22:36:17.516543 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:36:17.516513 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2064fd096ddfe5cca84851a74a4a4defcf1202fd94d5c33a117745b2114f55cb\": container with ID starting with 2064fd096ddfe5cca84851a74a4a4defcf1202fd94d5c33a117745b2114f55cb not found: ID does not exist" containerID="2064fd096ddfe5cca84851a74a4a4defcf1202fd94d5c33a117745b2114f55cb" Apr 16 22:36:17.516620 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:17.516554 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2064fd096ddfe5cca84851a74a4a4defcf1202fd94d5c33a117745b2114f55cb"} err="failed to get container status \"2064fd096ddfe5cca84851a74a4a4defcf1202fd94d5c33a117745b2114f55cb\": rpc error: code = NotFound desc = could not find container \"2064fd096ddfe5cca84851a74a4a4defcf1202fd94d5c33a117745b2114f55cb\": container with ID starting with 2064fd096ddfe5cca84851a74a4a4defcf1202fd94d5c33a117745b2114f55cb not found: ID does not exist" Apr 16 22:36:17.528804 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:17.528782 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/ensemble-graph-1148f-6b77456c7-qktwk"] Apr 16 22:36:17.534046 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:17.534016 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-1148f-6b77456c7-qktwk"] Apr 16 22:36:19.024263 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:19.024228 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2737eaab-e5eb-401a-83a3-5e484c72a4b7" path="/var/lib/kubelet/pods/2737eaab-e5eb-401a-83a3-5e484c72a4b7/volumes" Apr 16 22:36:20.468826 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:20.468790 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-cf933-55f8c8c9b7-kskdk" podUID="453f42e1-2887-4b91-8933-979038d46a47" containerName="sequence-graph-cf933" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:36:25.468677 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:25.468638 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-cf933-55f8c8c9b7-kskdk" podUID="453f42e1-2887-4b91-8933-979038d46a47" containerName="sequence-graph-cf933" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:36:25.469098 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:25.468734 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-cf933-55f8c8c9b7-kskdk" Apr 16 22:36:30.469001 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:30.468960 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-cf933-55f8c8c9b7-kskdk" podUID="453f42e1-2887-4b91-8933-979038d46a47" containerName="sequence-graph-cf933" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:36:35.469232 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:35.469188 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-cf933-55f8c8c9b7-kskdk" podUID="453f42e1-2887-4b91-8933-979038d46a47" containerName="sequence-graph-cf933" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:36:40.468921 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:40.468881 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-cf933-55f8c8c9b7-kskdk" podUID="453f42e1-2887-4b91-8933-979038d46a47" containerName="sequence-graph-cf933" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:36:43.080835 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:43.080804 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-cf933-55f8c8c9b7-kskdk" Apr 16 22:36:43.178807 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:43.178779 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/453f42e1-2887-4b91-8933-979038d46a47-openshift-service-ca-bundle\") pod \"453f42e1-2887-4b91-8933-979038d46a47\" (UID: \"453f42e1-2887-4b91-8933-979038d46a47\") " Apr 16 22:36:43.178997 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:43.178844 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/453f42e1-2887-4b91-8933-979038d46a47-proxy-tls\") pod \"453f42e1-2887-4b91-8933-979038d46a47\" (UID: \"453f42e1-2887-4b91-8933-979038d46a47\") " Apr 16 22:36:43.179270 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:43.179231 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/453f42e1-2887-4b91-8933-979038d46a47-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "453f42e1-2887-4b91-8933-979038d46a47" (UID: "453f42e1-2887-4b91-8933-979038d46a47"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:36:43.180766 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:43.180745 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/453f42e1-2887-4b91-8933-979038d46a47-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "453f42e1-2887-4b91-8933-979038d46a47" (UID: "453f42e1-2887-4b91-8933-979038d46a47"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:36:43.280477 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:43.280372 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/453f42e1-2887-4b91-8933-979038d46a47-proxy-tls\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:36:43.280477 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:43.280402 2565 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/453f42e1-2887-4b91-8933-979038d46a47-openshift-service-ca-bundle\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:36:43.593977 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:43.593879 2565 generic.go:358] "Generic (PLEG): container finished" podID="453f42e1-2887-4b91-8933-979038d46a47" containerID="9d8f26245e1e6018cbc372b7759a717e32b611a3b1f6ec9aeeb1c58e60cd49be" exitCode=0 Apr 16 22:36:43.593977 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:43.593946 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-cf933-55f8c8c9b7-kskdk" Apr 16 22:36:43.594160 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:43.593967 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-cf933-55f8c8c9b7-kskdk" event={"ID":"453f42e1-2887-4b91-8933-979038d46a47","Type":"ContainerDied","Data":"9d8f26245e1e6018cbc372b7759a717e32b611a3b1f6ec9aeeb1c58e60cd49be"} Apr 16 22:36:43.594160 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:43.594006 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-cf933-55f8c8c9b7-kskdk" event={"ID":"453f42e1-2887-4b91-8933-979038d46a47","Type":"ContainerDied","Data":"d7db14143e983e4ff5495ec6118fbcc80849e22063a5a714667e73004886ca81"} Apr 16 22:36:43.594160 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:43.594021 2565 scope.go:117] "RemoveContainer" containerID="9d8f26245e1e6018cbc372b7759a717e32b611a3b1f6ec9aeeb1c58e60cd49be" Apr 16 22:36:43.605334 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:43.605314 2565 scope.go:117] "RemoveContainer" containerID="9d8f26245e1e6018cbc372b7759a717e32b611a3b1f6ec9aeeb1c58e60cd49be" Apr 16 22:36:43.605614 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:36:43.605595 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d8f26245e1e6018cbc372b7759a717e32b611a3b1f6ec9aeeb1c58e60cd49be\": container with ID starting with 9d8f26245e1e6018cbc372b7759a717e32b611a3b1f6ec9aeeb1c58e60cd49be not found: ID does not exist" containerID="9d8f26245e1e6018cbc372b7759a717e32b611a3b1f6ec9aeeb1c58e60cd49be" Apr 16 22:36:43.605680 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:43.605623 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d8f26245e1e6018cbc372b7759a717e32b611a3b1f6ec9aeeb1c58e60cd49be"} err="failed to get container status \"9d8f26245e1e6018cbc372b7759a717e32b611a3b1f6ec9aeeb1c58e60cd49be\": rpc error: code = NotFound desc = could not find container \"9d8f26245e1e6018cbc372b7759a717e32b611a3b1f6ec9aeeb1c58e60cd49be\": container with ID starting with 9d8f26245e1e6018cbc372b7759a717e32b611a3b1f6ec9aeeb1c58e60cd49be not found: ID does not exist" Apr 16 22:36:43.615937 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:43.615915 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-cf933-55f8c8c9b7-kskdk"] Apr 16 22:36:43.621327 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:43.621306 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-cf933-55f8c8c9b7-kskdk"] Apr 16 22:36:45.018328 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:45.018297 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="453f42e1-2887-4b91-8933-979038d46a47" path="/var/lib/kubelet/pods/453f42e1-2887-4b91-8933-979038d46a47/volumes" Apr 16 22:36:47.288509 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:47.288434 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-ea3bc-c57584b84-tvrrn"] Apr 16 22:36:47.288857 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:47.288774 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2737eaab-e5eb-401a-83a3-5e484c72a4b7" containerName="ensemble-graph-1148f" Apr 16 22:36:47.288857 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:47.288786 2565 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2737eaab-e5eb-401a-83a3-5e484c72a4b7" containerName="ensemble-graph-1148f" Apr 16 22:36:47.288857 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:47.288795 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="453f42e1-2887-4b91-8933-979038d46a47" containerName="sequence-graph-cf933" Apr 16 22:36:47.288857 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:47.288800 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="453f42e1-2887-4b91-8933-979038d46a47" containerName="sequence-graph-cf933" Apr 16 22:36:47.288857 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:47.288854 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="453f42e1-2887-4b91-8933-979038d46a47" containerName="sequence-graph-cf933" Apr 16 22:36:47.289024 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:47.288864 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="2737eaab-e5eb-401a-83a3-5e484c72a4b7" containerName="ensemble-graph-1148f" Apr 16 22:36:47.293314 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:47.293292 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-ea3bc-c57584b84-tvrrn" Apr 16 22:36:47.295463 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:47.295441 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-ea3bc-kube-rbac-proxy-sar-config\"" Apr 16 22:36:47.295754 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:47.295734 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-ea3bc-serving-cert\"" Apr 16 22:36:47.296057 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:47.296039 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 22:36:47.296296 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:47.296278 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4hqzk\"" Apr 16 22:36:47.301968 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:47.301947 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-ea3bc-c57584b84-tvrrn"] Apr 16 22:36:47.420503 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:47.420464 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/71f5e738-1e4d-4669-9309-0a07847019fd-proxy-tls\") pod \"ensemble-graph-ea3bc-c57584b84-tvrrn\" (UID: \"71f5e738-1e4d-4669-9309-0a07847019fd\") " pod="kserve-ci-e2e-test/ensemble-graph-ea3bc-c57584b84-tvrrn" Apr 16 22:36:47.420671 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:47.420565 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71f5e738-1e4d-4669-9309-0a07847019fd-openshift-service-ca-bundle\") pod \"ensemble-graph-ea3bc-c57584b84-tvrrn\" (UID: \"71f5e738-1e4d-4669-9309-0a07847019fd\") " pod="kserve-ci-e2e-test/ensemble-graph-ea3bc-c57584b84-tvrrn" Apr 16 22:36:47.521035 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:47.520987 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/71f5e738-1e4d-4669-9309-0a07847019fd-proxy-tls\") pod \"ensemble-graph-ea3bc-c57584b84-tvrrn\" (UID: 
\"71f5e738-1e4d-4669-9309-0a07847019fd\") " pod="kserve-ci-e2e-test/ensemble-graph-ea3bc-c57584b84-tvrrn" Apr 16 22:36:47.521240 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:47.521094 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71f5e738-1e4d-4669-9309-0a07847019fd-openshift-service-ca-bundle\") pod \"ensemble-graph-ea3bc-c57584b84-tvrrn\" (UID: \"71f5e738-1e4d-4669-9309-0a07847019fd\") " pod="kserve-ci-e2e-test/ensemble-graph-ea3bc-c57584b84-tvrrn" Apr 16 22:36:47.521240 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:36:47.521142 2565 secret.go:189] Couldn't get secret kserve-ci-e2e-test/ensemble-graph-ea3bc-serving-cert: secret "ensemble-graph-ea3bc-serving-cert" not found Apr 16 22:36:47.526468 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:36:47.521584 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71f5e738-1e4d-4669-9309-0a07847019fd-proxy-tls podName:71f5e738-1e4d-4669-9309-0a07847019fd nodeName:}" failed. No retries permitted until 2026-04-16 22:36:48.021541194 +0000 UTC m=+1381.567282599 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/71f5e738-1e4d-4669-9309-0a07847019fd-proxy-tls") pod "ensemble-graph-ea3bc-c57584b84-tvrrn" (UID: "71f5e738-1e4d-4669-9309-0a07847019fd") : secret "ensemble-graph-ea3bc-serving-cert" not found Apr 16 22:36:47.526468 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:47.522339 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71f5e738-1e4d-4669-9309-0a07847019fd-openshift-service-ca-bundle\") pod \"ensemble-graph-ea3bc-c57584b84-tvrrn\" (UID: \"71f5e738-1e4d-4669-9309-0a07847019fd\") " pod="kserve-ci-e2e-test/ensemble-graph-ea3bc-c57584b84-tvrrn" Apr 16 22:36:48.026141 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:48.026093 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/71f5e738-1e4d-4669-9309-0a07847019fd-proxy-tls\") pod \"ensemble-graph-ea3bc-c57584b84-tvrrn\" (UID: \"71f5e738-1e4d-4669-9309-0a07847019fd\") " pod="kserve-ci-e2e-test/ensemble-graph-ea3bc-c57584b84-tvrrn" Apr 16 22:36:48.028438 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:48.028391 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/71f5e738-1e4d-4669-9309-0a07847019fd-proxy-tls\") pod \"ensemble-graph-ea3bc-c57584b84-tvrrn\" (UID: \"71f5e738-1e4d-4669-9309-0a07847019fd\") " pod="kserve-ci-e2e-test/ensemble-graph-ea3bc-c57584b84-tvrrn" Apr 16 22:36:48.204533 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:48.204496 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-ea3bc-c57584b84-tvrrn" Apr 16 22:36:48.322440 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:48.322360 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-ea3bc-c57584b84-tvrrn"] Apr 16 22:36:48.324870 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:36:48.324838 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71f5e738_1e4d_4669_9309_0a07847019fd.slice/crio-73291ea89c839aaee098ffb23600675332c8b2c433d3b67cd145761319622827 WatchSource:0}: Error finding container 73291ea89c839aaee098ffb23600675332c8b2c433d3b67cd145761319622827: Status 404 returned error can't find the container with id 73291ea89c839aaee098ffb23600675332c8b2c433d3b67cd145761319622827 Apr 16 22:36:48.611440 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:48.611330 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-ea3bc-c57584b84-tvrrn" event={"ID":"71f5e738-1e4d-4669-9309-0a07847019fd","Type":"ContainerStarted","Data":"1f8fe635ad7e61595048c85a717cfa444e9174619eeda6e5d192b2e28ece8e1c"} Apr 16 22:36:48.611440 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:48.611367 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-ea3bc-c57584b84-tvrrn" event={"ID":"71f5e738-1e4d-4669-9309-0a07847019fd","Type":"ContainerStarted","Data":"73291ea89c839aaee098ffb23600675332c8b2c433d3b67cd145761319622827"} Apr 16 22:36:48.611626 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:48.611448 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-ea3bc-c57584b84-tvrrn" Apr 16 22:36:48.628807 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:48.628764 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-ea3bc-c57584b84-tvrrn" podStartSLOduration=1.6287509039999999 podStartE2EDuration="1.628750904s" podCreationTimestamp="2026-04-16 22:36:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:36:48.626536592 +0000 UTC m=+1382.172278020" watchObservedRunningTime="2026-04-16 22:36:48.628750904 +0000 UTC m=+1382.174492330" Apr 16 22:36:54.620145 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:36:54.620116 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-ea3bc-c57584b84-tvrrn" Apr 16 22:37:13.130223 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:37:13.130183 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-c905c-6897947cb6-sj2vc"] Apr 16 22:37:13.133622 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:37:13.133603 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-c905c-6897947cb6-sj2vc" Apr 16 22:37:13.136025 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:37:13.135995 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-c905c-serving-cert\"" Apr 16 22:37:13.136025 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:37:13.135998 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-c905c-kube-rbac-proxy-sar-config\"" Apr 16 22:37:13.143473 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:37:13.143453 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-c905c-6897947cb6-sj2vc"] Apr 16 22:37:13.240521 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:37:13.240492 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e6c22ca-5e9f-4115-8c67-0992ce3ac532-proxy-tls\") pod \"sequence-graph-c905c-6897947cb6-sj2vc\" (UID: \"1e6c22ca-5e9f-4115-8c67-0992ce3ac532\") " pod="kserve-ci-e2e-test/sequence-graph-c905c-6897947cb6-sj2vc" Apr 16 22:37:13.240684 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:37:13.240571 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e6c22ca-5e9f-4115-8c67-0992ce3ac532-openshift-service-ca-bundle\") pod \"sequence-graph-c905c-6897947cb6-sj2vc\" (UID: \"1e6c22ca-5e9f-4115-8c67-0992ce3ac532\") " pod="kserve-ci-e2e-test/sequence-graph-c905c-6897947cb6-sj2vc" Apr 16 22:37:13.341765 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:37:13.341727 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e6c22ca-5e9f-4115-8c67-0992ce3ac532-proxy-tls\") pod \"sequence-graph-c905c-6897947cb6-sj2vc\" (UID: \"1e6c22ca-5e9f-4115-8c67-0992ce3ac532\") " pod="kserve-ci-e2e-test/sequence-graph-c905c-6897947cb6-sj2vc" Apr 16 22:37:13.341931 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:37:13.341789 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e6c22ca-5e9f-4115-8c67-0992ce3ac532-openshift-service-ca-bundle\") pod \"sequence-graph-c905c-6897947cb6-sj2vc\" (UID: \"1e6c22ca-5e9f-4115-8c67-0992ce3ac532\") " pod="kserve-ci-e2e-test/sequence-graph-c905c-6897947cb6-sj2vc" Apr 16 22:37:13.342458 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:37:13.342439 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e6c22ca-5e9f-4115-8c67-0992ce3ac532-openshift-service-ca-bundle\") pod \"sequence-graph-c905c-6897947cb6-sj2vc\" (UID: \"1e6c22ca-5e9f-4115-8c67-0992ce3ac532\") " pod="kserve-ci-e2e-test/sequence-graph-c905c-6897947cb6-sj2vc" Apr 16 22:37:13.344080 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:37:13.344057 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e6c22ca-5e9f-4115-8c67-0992ce3ac532-proxy-tls\") pod \"sequence-graph-c905c-6897947cb6-sj2vc\" (UID: \"1e6c22ca-5e9f-4115-8c67-0992ce3ac532\") " pod="kserve-ci-e2e-test/sequence-graph-c905c-6897947cb6-sj2vc" Apr 16 22:37:13.444371 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:37:13.444341 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-c905c-6897947cb6-sj2vc" Apr 16 22:37:13.566833 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:37:13.566808 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-c905c-6897947cb6-sj2vc"] Apr 16 22:37:13.569243 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:37:13.569213 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e6c22ca_5e9f_4115_8c67_0992ce3ac532.slice/crio-6ab53f777cd6a1958a0a326cbde403446e6879c78a2e362c3795891c2de7a6a3 WatchSource:0}: Error finding container 6ab53f777cd6a1958a0a326cbde403446e6879c78a2e362c3795891c2de7a6a3: Status 404 returned error can't find the container with id 6ab53f777cd6a1958a0a326cbde403446e6879c78a2e362c3795891c2de7a6a3 Apr 16 22:37:13.701939 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:37:13.701860 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-c905c-6897947cb6-sj2vc" event={"ID":"1e6c22ca-5e9f-4115-8c67-0992ce3ac532","Type":"ContainerStarted","Data":"7011288b1cb09e7dbd52e9d67b3542df507f2233722c3767d6943913bbe57e37"} Apr 16 22:37:13.701939 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:37:13.701894 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-c905c-6897947cb6-sj2vc" event={"ID":"1e6c22ca-5e9f-4115-8c67-0992ce3ac532","Type":"ContainerStarted","Data":"6ab53f777cd6a1958a0a326cbde403446e6879c78a2e362c3795891c2de7a6a3"} Apr 16 22:37:13.701939 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:37:13.701935 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-c905c-6897947cb6-sj2vc" Apr 16 22:37:13.718018 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:37:13.717966 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-c905c-6897947cb6-sj2vc" podStartSLOduration=0.717951026 podStartE2EDuration="717.951026ms" podCreationTimestamp="2026-04-16 22:37:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:37:13.717389545 +0000 UTC m=+1407.263130975" watchObservedRunningTime="2026-04-16 22:37:13.717951026 +0000 UTC m=+1407.263692457" Apr 16 22:37:19.713524 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:37:19.713493 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-c905c-6897947cb6-sj2vc" Apr 16 22:45:01.931034 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:01.931002 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-ea3bc-c57584b84-tvrrn"] Apr 16 22:45:01.931550 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:01.931232 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-ea3bc-c57584b84-tvrrn" podUID="71f5e738-1e4d-4669-9309-0a07847019fd" containerName="ensemble-graph-ea3bc" containerID="cri-o://1f8fe635ad7e61595048c85a717cfa444e9174619eeda6e5d192b2e28ece8e1c" gracePeriod=30 Apr 16 22:45:04.618516 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:04.618467 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-ea3bc-c57584b84-tvrrn" podUID="71f5e738-1e4d-4669-9309-0a07847019fd" containerName="ensemble-graph-ea3bc" probeResult="failure" output="HTTP probe failed with statuscode: 
503" Apr 16 22:45:09.618953 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:09.618908 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-ea3bc-c57584b84-tvrrn" podUID="71f5e738-1e4d-4669-9309-0a07847019fd" containerName="ensemble-graph-ea3bc" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:45:14.618101 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:14.618055 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-ea3bc-c57584b84-tvrrn" podUID="71f5e738-1e4d-4669-9309-0a07847019fd" containerName="ensemble-graph-ea3bc" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:45:14.618522 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:14.618203 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-ea3bc-c57584b84-tvrrn" Apr 16 22:45:19.618515 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:19.618471 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-ea3bc-c57584b84-tvrrn" podUID="71f5e738-1e4d-4669-9309-0a07847019fd" containerName="ensemble-graph-ea3bc" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:45:24.618341 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:24.618294 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-ea3bc-c57584b84-tvrrn" podUID="71f5e738-1e4d-4669-9309-0a07847019fd" containerName="ensemble-graph-ea3bc" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:45:27.738228 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:27.738194 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-c905c-6897947cb6-sj2vc"] Apr 16 22:45:27.738643 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:27.738518 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-c905c-6897947cb6-sj2vc" podUID="1e6c22ca-5e9f-4115-8c67-0992ce3ac532" containerName="sequence-graph-c905c" containerID="cri-o://7011288b1cb09e7dbd52e9d67b3542df507f2233722c3767d6943913bbe57e37" gracePeriod=30 Apr 16 22:45:29.618680 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:29.618641 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-ea3bc-c57584b84-tvrrn" podUID="71f5e738-1e4d-4669-9309-0a07847019fd" containerName="ensemble-graph-ea3bc" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:45:29.712033 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:29.711992 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-c905c-6897947cb6-sj2vc" podUID="1e6c22ca-5e9f-4115-8c67-0992ce3ac532" containerName="sequence-graph-c905c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:45:32.076903 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:32.076877 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-ea3bc-c57584b84-tvrrn" Apr 16 22:45:32.172248 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:32.172220 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71f5e738-1e4d-4669-9309-0a07847019fd-openshift-service-ca-bundle\") pod \"71f5e738-1e4d-4669-9309-0a07847019fd\" (UID: \"71f5e738-1e4d-4669-9309-0a07847019fd\") " Apr 16 22:45:32.172432 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:32.172289 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/71f5e738-1e4d-4669-9309-0a07847019fd-proxy-tls\") pod \"71f5e738-1e4d-4669-9309-0a07847019fd\" (UID: \"71f5e738-1e4d-4669-9309-0a07847019fd\") " Apr 16 22:45:32.172579 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:32.172554 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71f5e738-1e4d-4669-9309-0a07847019fd-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "71f5e738-1e4d-4669-9309-0a07847019fd" (UID: "71f5e738-1e4d-4669-9309-0a07847019fd"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:45:32.174252 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:32.174233 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71f5e738-1e4d-4669-9309-0a07847019fd-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "71f5e738-1e4d-4669-9309-0a07847019fd" (UID: "71f5e738-1e4d-4669-9309-0a07847019fd"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:45:32.273559 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:32.273474 2565 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71f5e738-1e4d-4669-9309-0a07847019fd-openshift-service-ca-bundle\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:45:32.273559 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:32.273503 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/71f5e738-1e4d-4669-9309-0a07847019fd-proxy-tls\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:45:32.359592 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:32.359559 2565 generic.go:358] "Generic (PLEG): container finished" podID="71f5e738-1e4d-4669-9309-0a07847019fd" containerID="1f8fe635ad7e61595048c85a717cfa444e9174619eeda6e5d192b2e28ece8e1c" exitCode=0 Apr 16 22:45:32.359744 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:32.359630 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-ea3bc-c57584b84-tvrrn" Apr 16 22:45:32.359744 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:32.359641 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-ea3bc-c57584b84-tvrrn" event={"ID":"71f5e738-1e4d-4669-9309-0a07847019fd","Type":"ContainerDied","Data":"1f8fe635ad7e61595048c85a717cfa444e9174619eeda6e5d192b2e28ece8e1c"} Apr 16 22:45:32.359744 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:32.359679 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-ea3bc-c57584b84-tvrrn" event={"ID":"71f5e738-1e4d-4669-9309-0a07847019fd","Type":"ContainerDied","Data":"73291ea89c839aaee098ffb23600675332c8b2c433d3b67cd145761319622827"} Apr 16 22:45:32.359744 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:32.359698 2565 scope.go:117] "RemoveContainer" containerID="1f8fe635ad7e61595048c85a717cfa444e9174619eeda6e5d192b2e28ece8e1c" Apr 16 22:45:32.368017 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:32.367997 2565 scope.go:117] "RemoveContainer" containerID="1f8fe635ad7e61595048c85a717cfa444e9174619eeda6e5d192b2e28ece8e1c" Apr 16 22:45:32.368314 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:45:32.368291 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f8fe635ad7e61595048c85a717cfa444e9174619eeda6e5d192b2e28ece8e1c\": container with ID starting with 1f8fe635ad7e61595048c85a717cfa444e9174619eeda6e5d192b2e28ece8e1c not found: ID does not exist" containerID="1f8fe635ad7e61595048c85a717cfa444e9174619eeda6e5d192b2e28ece8e1c" Apr 16 22:45:32.368399 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:32.368323 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f8fe635ad7e61595048c85a717cfa444e9174619eeda6e5d192b2e28ece8e1c"} err="failed to get container status \"1f8fe635ad7e61595048c85a717cfa444e9174619eeda6e5d192b2e28ece8e1c\": rpc error: code = NotFound desc = could not find container \"1f8fe635ad7e61595048c85a717cfa444e9174619eeda6e5d192b2e28ece8e1c\": container with ID starting with 1f8fe635ad7e61595048c85a717cfa444e9174619eeda6e5d192b2e28ece8e1c not found: ID does not exist" Apr 16 22:45:32.380317 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:32.380294 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-ea3bc-c57584b84-tvrrn"] Apr 16 22:45:32.383859 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:32.383839 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-ea3bc-c57584b84-tvrrn"] Apr 16 22:45:33.019173 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:33.019135 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71f5e738-1e4d-4669-9309-0a07847019fd" path="/var/lib/kubelet/pods/71f5e738-1e4d-4669-9309-0a07847019fd/volumes" Apr 16 22:45:34.712285 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:34.712246 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-c905c-6897947cb6-sj2vc" podUID="1e6c22ca-5e9f-4115-8c67-0992ce3ac532" containerName="sequence-graph-c905c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:45:39.711825 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:39.711787 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-c905c-6897947cb6-sj2vc" podUID="1e6c22ca-5e9f-4115-8c67-0992ce3ac532" 
containerName="sequence-graph-c905c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:45:39.712231 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:39.711912 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-c905c-6897947cb6-sj2vc" Apr 16 22:45:44.711995 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:44.711954 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-c905c-6897947cb6-sj2vc" podUID="1e6c22ca-5e9f-4115-8c67-0992ce3ac532" containerName="sequence-graph-c905c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:45:49.712195 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:49.712148 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-c905c-6897947cb6-sj2vc" podUID="1e6c22ca-5e9f-4115-8c67-0992ce3ac532" containerName="sequence-graph-c905c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:45:54.712088 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:54.712043 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-c905c-6897947cb6-sj2vc" podUID="1e6c22ca-5e9f-4115-8c67-0992ce3ac532" containerName="sequence-graph-c905c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:45:57.883827 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:57.883803 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-c905c-6897947cb6-sj2vc" Apr 16 22:45:58.014754 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:58.014665 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e6c22ca-5e9f-4115-8c67-0992ce3ac532-proxy-tls\") pod \"1e6c22ca-5e9f-4115-8c67-0992ce3ac532\" (UID: \"1e6c22ca-5e9f-4115-8c67-0992ce3ac532\") " Apr 16 22:45:58.014922 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:58.014782 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e6c22ca-5e9f-4115-8c67-0992ce3ac532-openshift-service-ca-bundle\") pod \"1e6c22ca-5e9f-4115-8c67-0992ce3ac532\" (UID: \"1e6c22ca-5e9f-4115-8c67-0992ce3ac532\") " Apr 16 22:45:58.015135 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:58.015108 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e6c22ca-5e9f-4115-8c67-0992ce3ac532-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "1e6c22ca-5e9f-4115-8c67-0992ce3ac532" (UID: "1e6c22ca-5e9f-4115-8c67-0992ce3ac532"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:45:58.016773 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:58.016751 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e6c22ca-5e9f-4115-8c67-0992ce3ac532-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1e6c22ca-5e9f-4115-8c67-0992ce3ac532" (UID: "1e6c22ca-5e9f-4115-8c67-0992ce3ac532"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:45:58.116210 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:58.116171 2565 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e6c22ca-5e9f-4115-8c67-0992ce3ac532-openshift-service-ca-bundle\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:45:58.116498 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:58.116479 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e6c22ca-5e9f-4115-8c67-0992ce3ac532-proxy-tls\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:45:58.449316 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:58.449276 2565 generic.go:358] "Generic (PLEG): container finished" podID="1e6c22ca-5e9f-4115-8c67-0992ce3ac532" containerID="7011288b1cb09e7dbd52e9d67b3542df507f2233722c3767d6943913bbe57e37" exitCode=0 Apr 16 22:45:58.449504 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:58.449339 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-c905c-6897947cb6-sj2vc" Apr 16 22:45:58.449504 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:58.449346 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-c905c-6897947cb6-sj2vc" event={"ID":"1e6c22ca-5e9f-4115-8c67-0992ce3ac532","Type":"ContainerDied","Data":"7011288b1cb09e7dbd52e9d67b3542df507f2233722c3767d6943913bbe57e37"} Apr 16 22:45:58.449504 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:58.449381 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-c905c-6897947cb6-sj2vc" event={"ID":"1e6c22ca-5e9f-4115-8c67-0992ce3ac532","Type":"ContainerDied","Data":"6ab53f777cd6a1958a0a326cbde403446e6879c78a2e362c3795891c2de7a6a3"} Apr 16 22:45:58.449504 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:58.449400 2565 scope.go:117] "RemoveContainer" containerID="7011288b1cb09e7dbd52e9d67b3542df507f2233722c3767d6943913bbe57e37" Apr 16 22:45:58.463225 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:58.463181 2565 scope.go:117] "RemoveContainer" containerID="7011288b1cb09e7dbd52e9d67b3542df507f2233722c3767d6943913bbe57e37" Apr 16 22:45:58.463622 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:45:58.463505 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7011288b1cb09e7dbd52e9d67b3542df507f2233722c3767d6943913bbe57e37\": container with ID starting with 7011288b1cb09e7dbd52e9d67b3542df507f2233722c3767d6943913bbe57e37 not found: ID does not exist" containerID="7011288b1cb09e7dbd52e9d67b3542df507f2233722c3767d6943913bbe57e37" Apr 16 22:45:58.463622 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:58.463535 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7011288b1cb09e7dbd52e9d67b3542df507f2233722c3767d6943913bbe57e37"} err="failed to get container status \"7011288b1cb09e7dbd52e9d67b3542df507f2233722c3767d6943913bbe57e37\": rpc error: code = NotFound desc = could not find container \"7011288b1cb09e7dbd52e9d67b3542df507f2233722c3767d6943913bbe57e37\": container with ID starting with 7011288b1cb09e7dbd52e9d67b3542df507f2233722c3767d6943913bbe57e37 not found: ID does not exist" Apr 16 22:45:58.475096 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:58.475071 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/sequence-graph-c905c-6897947cb6-sj2vc"] Apr 16 22:45:58.479276 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:58.479248 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-c905c-6897947cb6-sj2vc"] Apr 16 22:45:59.021009 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:45:59.020969 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e6c22ca-5e9f-4115-8c67-0992ce3ac532" path="/var/lib/kubelet/pods/1e6c22ca-5e9f-4115-8c67-0992ce3ac532/volumes" Apr 16 22:46:02.155152 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:02.155117 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-a5b4b-65dbb98bf6-l4f6p"] Apr 16 22:46:02.155561 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:02.155511 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e6c22ca-5e9f-4115-8c67-0992ce3ac532" containerName="sequence-graph-c905c" Apr 16 22:46:02.155561 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:02.155526 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6c22ca-5e9f-4115-8c67-0992ce3ac532" containerName="sequence-graph-c905c" Apr 16 22:46:02.155561 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:02.155543 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71f5e738-1e4d-4669-9309-0a07847019fd" containerName="ensemble-graph-ea3bc" Apr 16 22:46:02.155561 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:02.155548 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="71f5e738-1e4d-4669-9309-0a07847019fd" containerName="ensemble-graph-ea3bc" Apr 16 22:46:02.155697 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:02.155613 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="71f5e738-1e4d-4669-9309-0a07847019fd" containerName="ensemble-graph-ea3bc" Apr 16 22:46:02.155697 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:02.155622 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="1e6c22ca-5e9f-4115-8c67-0992ce3ac532" containerName="sequence-graph-c905c" Apr 16 22:46:02.159876 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:02.159857 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-a5b4b-65dbb98bf6-l4f6p" Apr 16 22:46:02.162217 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:02.162190 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-a5b4b-serving-cert\"" Apr 16 22:46:02.162315 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:02.162240 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4hqzk\"" Apr 16 22:46:02.162315 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:02.162258 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-a5b4b-kube-rbac-proxy-sar-config\"" Apr 16 22:46:02.162942 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:02.162926 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 22:46:02.166926 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:02.166905 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-a5b4b-65dbb98bf6-l4f6p"] Apr 16 22:46:02.250354 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:02.250312 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad3c132c-3f7b-4d23-901d-26778afdd207-proxy-tls\") pod \"splitter-graph-a5b4b-65dbb98bf6-l4f6p\" (UID: \"ad3c132c-3f7b-4d23-901d-26778afdd207\") " pod="kserve-ci-e2e-test/splitter-graph-a5b4b-65dbb98bf6-l4f6p" Apr 16 22:46:02.250534 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:02.250390 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad3c132c-3f7b-4d23-901d-26778afdd207-openshift-service-ca-bundle\") pod \"splitter-graph-a5b4b-65dbb98bf6-l4f6p\" (UID: \"ad3c132c-3f7b-4d23-901d-26778afdd207\") " pod="kserve-ci-e2e-test/splitter-graph-a5b4b-65dbb98bf6-l4f6p" Apr 16 22:46:02.351371 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:02.351337 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad3c132c-3f7b-4d23-901d-26778afdd207-proxy-tls\") pod \"splitter-graph-a5b4b-65dbb98bf6-l4f6p\" (UID: \"ad3c132c-3f7b-4d23-901d-26778afdd207\") " pod="kserve-ci-e2e-test/splitter-graph-a5b4b-65dbb98bf6-l4f6p" Apr 16 22:46:02.351586 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:02.351397 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad3c132c-3f7b-4d23-901d-26778afdd207-openshift-service-ca-bundle\") pod \"splitter-graph-a5b4b-65dbb98bf6-l4f6p\" (UID: \"ad3c132c-3f7b-4d23-901d-26778afdd207\") " pod="kserve-ci-e2e-test/splitter-graph-a5b4b-65dbb98bf6-l4f6p" Apr 16 22:46:02.351586 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:46:02.351518 2565 secret.go:189] Couldn't get secret kserve-ci-e2e-test/splitter-graph-a5b4b-serving-cert: secret "splitter-graph-a5b4b-serving-cert" not found Apr 16 22:46:02.351699 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:46:02.351592 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad3c132c-3f7b-4d23-901d-26778afdd207-proxy-tls podName:ad3c132c-3f7b-4d23-901d-26778afdd207 nodeName:}" failed. 
No retries permitted until 2026-04-16 22:46:02.851570871 +0000 UTC m=+1936.397312280 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/ad3c132c-3f7b-4d23-901d-26778afdd207-proxy-tls") pod "splitter-graph-a5b4b-65dbb98bf6-l4f6p" (UID: "ad3c132c-3f7b-4d23-901d-26778afdd207") : secret "splitter-graph-a5b4b-serving-cert" not found Apr 16 22:46:02.352000 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:02.351980 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad3c132c-3f7b-4d23-901d-26778afdd207-openshift-service-ca-bundle\") pod \"splitter-graph-a5b4b-65dbb98bf6-l4f6p\" (UID: \"ad3c132c-3f7b-4d23-901d-26778afdd207\") " pod="kserve-ci-e2e-test/splitter-graph-a5b4b-65dbb98bf6-l4f6p" Apr 16 22:46:02.856587 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:02.856548 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad3c132c-3f7b-4d23-901d-26778afdd207-proxy-tls\") pod \"splitter-graph-a5b4b-65dbb98bf6-l4f6p\" (UID: \"ad3c132c-3f7b-4d23-901d-26778afdd207\") " pod="kserve-ci-e2e-test/splitter-graph-a5b4b-65dbb98bf6-l4f6p" Apr 16 22:46:02.858907 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:02.858888 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad3c132c-3f7b-4d23-901d-26778afdd207-proxy-tls\") pod \"splitter-graph-a5b4b-65dbb98bf6-l4f6p\" (UID: \"ad3c132c-3f7b-4d23-901d-26778afdd207\") " pod="kserve-ci-e2e-test/splitter-graph-a5b4b-65dbb98bf6-l4f6p" Apr 16 22:46:03.072477 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:03.072423 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-a5b4b-65dbb98bf6-l4f6p" Apr 16 22:46:03.192011 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:03.191985 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-a5b4b-65dbb98bf6-l4f6p"] Apr 16 22:46:03.193670 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:46:03.193641 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad3c132c_3f7b_4d23_901d_26778afdd207.slice/crio-797b40e57a9ecc4674722e8dda9cc03e87abd3260b8e5e0709f19183ad478cb7 WatchSource:0}: Error finding container 797b40e57a9ecc4674722e8dda9cc03e87abd3260b8e5e0709f19183ad478cb7: Status 404 returned error can't find the container with id 797b40e57a9ecc4674722e8dda9cc03e87abd3260b8e5e0709f19183ad478cb7 Apr 16 22:46:03.195532 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:03.195507 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:46:03.470761 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:03.470723 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-a5b4b-65dbb98bf6-l4f6p" event={"ID":"ad3c132c-3f7b-4d23-901d-26778afdd207","Type":"ContainerStarted","Data":"935696477d16c55616cce2faa92426c15ddf14ef502ce2dfeb2f993c37c27023"} Apr 16 22:46:03.470961 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:03.470771 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-a5b4b-65dbb98bf6-l4f6p" Apr 16 22:46:03.470961 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:03.470785 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-a5b4b-65dbb98bf6-l4f6p" event={"ID":"ad3c132c-3f7b-4d23-901d-26778afdd207","Type":"ContainerStarted","Data":"797b40e57a9ecc4674722e8dda9cc03e87abd3260b8e5e0709f19183ad478cb7"} Apr 16 22:46:03.487636 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:03.487595 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-a5b4b-65dbb98bf6-l4f6p" podStartSLOduration=1.487581349 podStartE2EDuration="1.487581349s" podCreationTimestamp="2026-04-16 22:46:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:46:03.485934806 +0000 UTC m=+1937.031676234" watchObservedRunningTime="2026-04-16 22:46:03.487581349 +0000 UTC m=+1937.033322775" Apr 16 22:46:09.479826 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:09.479796 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-a5b4b-65dbb98bf6-l4f6p" Apr 16 22:46:12.212679 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:12.212645 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-a5b4b-65dbb98bf6-l4f6p"] Apr 16 22:46:12.213061 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:12.212896 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-a5b4b-65dbb98bf6-l4f6p" podUID="ad3c132c-3f7b-4d23-901d-26778afdd207" containerName="splitter-graph-a5b4b" containerID="cri-o://935696477d16c55616cce2faa92426c15ddf14ef502ce2dfeb2f993c37c27023" gracePeriod=30 Apr 16 22:46:14.478027 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:14.477984 2565 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/splitter-graph-a5b4b-65dbb98bf6-l4f6p" podUID="ad3c132c-3f7b-4d23-901d-26778afdd207" containerName="splitter-graph-a5b4b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:46:19.477880 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:19.477840 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-a5b4b-65dbb98bf6-l4f6p" podUID="ad3c132c-3f7b-4d23-901d-26778afdd207" containerName="splitter-graph-a5b4b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:46:24.478136 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:24.478089 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-a5b4b-65dbb98bf6-l4f6p" podUID="ad3c132c-3f7b-4d23-901d-26778afdd207" containerName="splitter-graph-a5b4b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:46:24.478521 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:24.478190 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-a5b4b-65dbb98bf6-l4f6p" Apr 16 22:46:27.938340 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:27.938302 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-a9d3d-89447d64c-pq54k"] Apr 16 22:46:27.941763 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:27.941745 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-a9d3d-89447d64c-pq54k" Apr 16 22:46:27.944023 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:27.944001 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-a9d3d-kube-rbac-proxy-sar-config\"" Apr 16 22:46:27.944129 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:27.944010 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-a9d3d-serving-cert\"" Apr 16 22:46:27.952353 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:27.952329 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-a9d3d-89447d64c-pq54k"] Apr 16 22:46:28.056063 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:28.056024 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507-openshift-service-ca-bundle\") pod \"switch-graph-a9d3d-89447d64c-pq54k\" (UID: \"4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507\") " pod="kserve-ci-e2e-test/switch-graph-a9d3d-89447d64c-pq54k" Apr 16 22:46:28.056232 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:28.056132 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507-proxy-tls\") pod \"switch-graph-a9d3d-89447d64c-pq54k\" (UID: \"4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507\") " pod="kserve-ci-e2e-test/switch-graph-a9d3d-89447d64c-pq54k" Apr 16 22:46:28.157578 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:28.157536 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507-openshift-service-ca-bundle\") pod \"switch-graph-a9d3d-89447d64c-pq54k\" (UID: \"4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507\") " 
pod="kserve-ci-e2e-test/switch-graph-a9d3d-89447d64c-pq54k" Apr 16 22:46:28.157770 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:28.157596 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507-proxy-tls\") pod \"switch-graph-a9d3d-89447d64c-pq54k\" (UID: \"4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507\") " pod="kserve-ci-e2e-test/switch-graph-a9d3d-89447d64c-pq54k" Apr 16 22:46:28.157770 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:46:28.157729 2565 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-a9d3d-serving-cert: secret "switch-graph-a9d3d-serving-cert" not found Apr 16 22:46:28.157884 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:46:28.157786 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507-proxy-tls podName:4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507 nodeName:}" failed. No retries permitted until 2026-04-16 22:46:28.657769418 +0000 UTC m=+1962.203510822 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507-proxy-tls") pod "switch-graph-a9d3d-89447d64c-pq54k" (UID: "4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507") : secret "switch-graph-a9d3d-serving-cert" not found Apr 16 22:46:28.158267 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:28.158241 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507-openshift-service-ca-bundle\") pod \"switch-graph-a9d3d-89447d64c-pq54k\" (UID: \"4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507\") " pod="kserve-ci-e2e-test/switch-graph-a9d3d-89447d64c-pq54k" Apr 16 22:46:28.661567 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:28.661532 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507-proxy-tls\") pod \"switch-graph-a9d3d-89447d64c-pq54k\" (UID: \"4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507\") " pod="kserve-ci-e2e-test/switch-graph-a9d3d-89447d64c-pq54k" Apr 16 22:46:28.663959 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:28.663940 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507-proxy-tls\") pod \"switch-graph-a9d3d-89447d64c-pq54k\" (UID: \"4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507\") " pod="kserve-ci-e2e-test/switch-graph-a9d3d-89447d64c-pq54k" Apr 16 22:46:28.852704 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:28.852669 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-a9d3d-89447d64c-pq54k" Apr 16 22:46:28.973095 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:28.973070 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-a9d3d-89447d64c-pq54k"] Apr 16 22:46:28.975588 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:46:28.975553 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dd8d588_6ada_4f33_9a7a_7e8d2bd1e507.slice/crio-542edea5837b66b92200ca237176a3d602f91666da414bcd7e44c2a97024b25e WatchSource:0}: Error finding container 542edea5837b66b92200ca237176a3d602f91666da414bcd7e44c2a97024b25e: Status 404 returned error can't find the container with id 542edea5837b66b92200ca237176a3d602f91666da414bcd7e44c2a97024b25e Apr 16 22:46:29.478439 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:29.478385 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-a5b4b-65dbb98bf6-l4f6p" podUID="ad3c132c-3f7b-4d23-901d-26778afdd207" containerName="splitter-graph-a5b4b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:46:29.567316 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:29.567281 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-a9d3d-89447d64c-pq54k" event={"ID":"4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507","Type":"ContainerStarted","Data":"5586ff1a62bf6e1717c79ecea2b583eafff2c806b541cfcdb94e1c568c3e6b30"} Apr 16 22:46:29.567316 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:29.567317 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-a9d3d-89447d64c-pq54k" event={"ID":"4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507","Type":"ContainerStarted","Data":"542edea5837b66b92200ca237176a3d602f91666da414bcd7e44c2a97024b25e"} Apr 16 22:46:29.567556 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:29.567464 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-a9d3d-89447d64c-pq54k" Apr 16 22:46:29.585032 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:29.584983 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-a9d3d-89447d64c-pq54k" podStartSLOduration=2.584967029 podStartE2EDuration="2.584967029s" podCreationTimestamp="2026-04-16 22:46:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:46:29.582892922 +0000 UTC m=+1963.128634352" watchObservedRunningTime="2026-04-16 22:46:29.584967029 +0000 UTC m=+1963.130708517" Apr 16 22:46:34.477982 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:34.477893 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-a5b4b-65dbb98bf6-l4f6p" podUID="ad3c132c-3f7b-4d23-901d-26778afdd207" containerName="splitter-graph-a5b4b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:46:35.576724 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:35.576694 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-a9d3d-89447d64c-pq54k" Apr 16 22:46:39.477596 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:39.477548 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-a5b4b-65dbb98bf6-l4f6p" podUID="ad3c132c-3f7b-4d23-901d-26778afdd207" 
containerName="splitter-graph-a5b4b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:46:42.356104 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:42.356078 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-a5b4b-65dbb98bf6-l4f6p" Apr 16 22:46:42.473675 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:42.473576 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad3c132c-3f7b-4d23-901d-26778afdd207-openshift-service-ca-bundle\") pod \"ad3c132c-3f7b-4d23-901d-26778afdd207\" (UID: \"ad3c132c-3f7b-4d23-901d-26778afdd207\") " Apr 16 22:46:42.473848 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:42.473690 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad3c132c-3f7b-4d23-901d-26778afdd207-proxy-tls\") pod \"ad3c132c-3f7b-4d23-901d-26778afdd207\" (UID: \"ad3c132c-3f7b-4d23-901d-26778afdd207\") " Apr 16 22:46:42.474011 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:42.473982 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad3c132c-3f7b-4d23-901d-26778afdd207-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "ad3c132c-3f7b-4d23-901d-26778afdd207" (UID: "ad3c132c-3f7b-4d23-901d-26778afdd207"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:46:42.475702 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:42.475678 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad3c132c-3f7b-4d23-901d-26778afdd207-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ad3c132c-3f7b-4d23-901d-26778afdd207" (UID: "ad3c132c-3f7b-4d23-901d-26778afdd207"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:46:42.574781 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:42.574743 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ad3c132c-3f7b-4d23-901d-26778afdd207-proxy-tls\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:46:42.574781 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:42.574774 2565 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad3c132c-3f7b-4d23-901d-26778afdd207-openshift-service-ca-bundle\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:46:42.612992 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:42.612957 2565 generic.go:358] "Generic (PLEG): container finished" podID="ad3c132c-3f7b-4d23-901d-26778afdd207" containerID="935696477d16c55616cce2faa92426c15ddf14ef502ce2dfeb2f993c37c27023" exitCode=0 Apr 16 22:46:42.613139 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:42.613027 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-a5b4b-65dbb98bf6-l4f6p" Apr 16 22:46:42.613139 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:42.613044 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-a5b4b-65dbb98bf6-l4f6p" event={"ID":"ad3c132c-3f7b-4d23-901d-26778afdd207","Type":"ContainerDied","Data":"935696477d16c55616cce2faa92426c15ddf14ef502ce2dfeb2f993c37c27023"} Apr 16 22:46:42.613139 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:42.613086 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-a5b4b-65dbb98bf6-l4f6p" event={"ID":"ad3c132c-3f7b-4d23-901d-26778afdd207","Type":"ContainerDied","Data":"797b40e57a9ecc4674722e8dda9cc03e87abd3260b8e5e0709f19183ad478cb7"} Apr 16 22:46:42.613139 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:42.613101 2565 scope.go:117] "RemoveContainer" containerID="935696477d16c55616cce2faa92426c15ddf14ef502ce2dfeb2f993c37c27023" Apr 16 22:46:42.621393 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:42.621371 2565 scope.go:117] "RemoveContainer" containerID="935696477d16c55616cce2faa92426c15ddf14ef502ce2dfeb2f993c37c27023" Apr 16 22:46:42.621668 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:46:42.621649 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"935696477d16c55616cce2faa92426c15ddf14ef502ce2dfeb2f993c37c27023\": container with ID starting with 935696477d16c55616cce2faa92426c15ddf14ef502ce2dfeb2f993c37c27023 not found: ID does not exist" containerID="935696477d16c55616cce2faa92426c15ddf14ef502ce2dfeb2f993c37c27023" Apr 16 22:46:42.621716 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:42.621677 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"935696477d16c55616cce2faa92426c15ddf14ef502ce2dfeb2f993c37c27023"} err="failed to get container status \"935696477d16c55616cce2faa92426c15ddf14ef502ce2dfeb2f993c37c27023\": rpc error: code = NotFound desc = could not find container \"935696477d16c55616cce2faa92426c15ddf14ef502ce2dfeb2f993c37c27023\": container with ID starting with 935696477d16c55616cce2faa92426c15ddf14ef502ce2dfeb2f993c37c27023 not found: ID does not exist" Apr 16 22:46:42.632648 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:42.632623 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-a5b4b-65dbb98bf6-l4f6p"] Apr 16 22:46:42.636006 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:42.635984 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-a5b4b-65dbb98bf6-l4f6p"] Apr 16 22:46:43.018885 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:46:43.018843 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad3c132c-3f7b-4d23-901d-26778afdd207" path="/var/lib/kubelet/pods/ad3c132c-3f7b-4d23-901d-26778afdd207/volumes" Apr 16 22:47:12.417202 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:47:12.417161 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-3d312-85f4c6c6c5-f8b9n"] Apr 16 22:47:12.417776 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:47:12.417756 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad3c132c-3f7b-4d23-901d-26778afdd207" containerName="splitter-graph-a5b4b" Apr 16 22:47:12.417868 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:47:12.417780 2565 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ad3c132c-3f7b-4d23-901d-26778afdd207" containerName="splitter-graph-a5b4b" Apr 16 22:47:12.417868 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:47:12.417853 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad3c132c-3f7b-4d23-901d-26778afdd207" containerName="splitter-graph-a5b4b" Apr 16 22:47:12.420828 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:47:12.420802 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-3d312-85f4c6c6c5-f8b9n" Apr 16 22:47:12.422860 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:47:12.422838 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-3d312-kube-rbac-proxy-sar-config\"" Apr 16 22:47:12.422966 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:47:12.422879 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-3d312-serving-cert\"" Apr 16 22:47:12.428620 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:47:12.428599 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-3d312-85f4c6c6c5-f8b9n"] Apr 16 22:47:12.534080 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:47:12.534036 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/151758fa-1374-4c8c-9bf8-8a6e6278238c-proxy-tls\") pod \"splitter-graph-3d312-85f4c6c6c5-f8b9n\" (UID: \"151758fa-1374-4c8c-9bf8-8a6e6278238c\") " pod="kserve-ci-e2e-test/splitter-graph-3d312-85f4c6c6c5-f8b9n" Apr 16 22:47:12.534242 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:47:12.534103 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/151758fa-1374-4c8c-9bf8-8a6e6278238c-openshift-service-ca-bundle\") pod \"splitter-graph-3d312-85f4c6c6c5-f8b9n\" (UID: \"151758fa-1374-4c8c-9bf8-8a6e6278238c\") " pod="kserve-ci-e2e-test/splitter-graph-3d312-85f4c6c6c5-f8b9n" Apr 16 22:47:12.635105 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:47:12.635074 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/151758fa-1374-4c8c-9bf8-8a6e6278238c-proxy-tls\") pod \"splitter-graph-3d312-85f4c6c6c5-f8b9n\" (UID: \"151758fa-1374-4c8c-9bf8-8a6e6278238c\") " pod="kserve-ci-e2e-test/splitter-graph-3d312-85f4c6c6c5-f8b9n" Apr 16 22:47:12.635105 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:47:12.635114 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/151758fa-1374-4c8c-9bf8-8a6e6278238c-openshift-service-ca-bundle\") pod \"splitter-graph-3d312-85f4c6c6c5-f8b9n\" (UID: \"151758fa-1374-4c8c-9bf8-8a6e6278238c\") " pod="kserve-ci-e2e-test/splitter-graph-3d312-85f4c6c6c5-f8b9n" Apr 16 22:47:12.635340 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:47:12.635239 2565 secret.go:189] Couldn't get secret kserve-ci-e2e-test/splitter-graph-3d312-serving-cert: secret "splitter-graph-3d312-serving-cert" not found Apr 16 22:47:12.635340 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:47:12.635316 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/151758fa-1374-4c8c-9bf8-8a6e6278238c-proxy-tls podName:151758fa-1374-4c8c-9bf8-8a6e6278238c nodeName:}" failed. 
No retries permitted until 2026-04-16 22:47:13.135296655 +0000 UTC m=+2006.681038062 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/151758fa-1374-4c8c-9bf8-8a6e6278238c-proxy-tls") pod "splitter-graph-3d312-85f4c6c6c5-f8b9n" (UID: "151758fa-1374-4c8c-9bf8-8a6e6278238c") : secret "splitter-graph-3d312-serving-cert" not found Apr 16 22:47:12.635754 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:47:12.635735 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/151758fa-1374-4c8c-9bf8-8a6e6278238c-openshift-service-ca-bundle\") pod \"splitter-graph-3d312-85f4c6c6c5-f8b9n\" (UID: \"151758fa-1374-4c8c-9bf8-8a6e6278238c\") " pod="kserve-ci-e2e-test/splitter-graph-3d312-85f4c6c6c5-f8b9n" Apr 16 22:47:13.139762 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:47:13.139713 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/151758fa-1374-4c8c-9bf8-8a6e6278238c-proxy-tls\") pod \"splitter-graph-3d312-85f4c6c6c5-f8b9n\" (UID: \"151758fa-1374-4c8c-9bf8-8a6e6278238c\") " pod="kserve-ci-e2e-test/splitter-graph-3d312-85f4c6c6c5-f8b9n" Apr 16 22:47:13.142193 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:47:13.142169 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/151758fa-1374-4c8c-9bf8-8a6e6278238c-proxy-tls\") pod \"splitter-graph-3d312-85f4c6c6c5-f8b9n\" (UID: \"151758fa-1374-4c8c-9bf8-8a6e6278238c\") " pod="kserve-ci-e2e-test/splitter-graph-3d312-85f4c6c6c5-f8b9n" Apr 16 22:47:13.332895 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:47:13.332851 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-3d312-85f4c6c6c5-f8b9n" Apr 16 22:47:13.451419 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:47:13.451382 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-3d312-85f4c6c6c5-f8b9n"] Apr 16 22:47:13.453883 ip-10-0-141-169 kubenswrapper[2565]: W0416 22:47:13.453857 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod151758fa_1374_4c8c_9bf8_8a6e6278238c.slice/crio-73b926bf487c2ebf2aafb1fd65d7cbfde887504932d446351b35466110564b98 WatchSource:0}: Error finding container 73b926bf487c2ebf2aafb1fd65d7cbfde887504932d446351b35466110564b98: Status 404 returned error can't find the container with id 73b926bf487c2ebf2aafb1fd65d7cbfde887504932d446351b35466110564b98 Apr 16 22:47:13.714252 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:47:13.714209 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-3d312-85f4c6c6c5-f8b9n" event={"ID":"151758fa-1374-4c8c-9bf8-8a6e6278238c","Type":"ContainerStarted","Data":"fcd44462fe4f6db8e0fc9bca2e8eeea7cc245225dd3f74d92fe8dbb4d742a66e"} Apr 16 22:47:13.714252 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:47:13.714245 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-3d312-85f4c6c6c5-f8b9n" event={"ID":"151758fa-1374-4c8c-9bf8-8a6e6278238c","Type":"ContainerStarted","Data":"73b926bf487c2ebf2aafb1fd65d7cbfde887504932d446351b35466110564b98"} Apr 16 22:47:13.714478 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:47:13.714348 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-3d312-85f4c6c6c5-f8b9n" Apr 16 22:47:13.730153 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:47:13.730109 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-3d312-85f4c6c6c5-f8b9n" podStartSLOduration=1.730097141 podStartE2EDuration="1.730097141s" podCreationTimestamp="2026-04-16 22:47:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:47:13.728112447 +0000 UTC m=+2007.273853877" watchObservedRunningTime="2026-04-16 22:47:13.730097141 +0000 UTC m=+2007.275838567" Apr 16 22:47:19.723053 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:47:19.723026 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-3d312-85f4c6c6c5-f8b9n" Apr 16 22:55:27.060098 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:55:27.060063 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-3d312-85f4c6c6c5-f8b9n"] Apr 16 22:55:27.062641 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:55:27.060371 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-3d312-85f4c6c6c5-f8b9n" podUID="151758fa-1374-4c8c-9bf8-8a6e6278238c" containerName="splitter-graph-3d312" containerID="cri-o://fcd44462fe4f6db8e0fc9bca2e8eeea7cc245225dd3f74d92fe8dbb4d742a66e" gracePeriod=30 Apr 16 22:55:29.721716 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:55:29.721671 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-3d312-85f4c6c6c5-f8b9n" podUID="151758fa-1374-4c8c-9bf8-8a6e6278238c" containerName="splitter-graph-3d312" probeResult="failure" output="HTTP probe failed with 
statuscode: 503" Apr 16 22:55:34.721705 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:55:34.721613 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-3d312-85f4c6c6c5-f8b9n" podUID="151758fa-1374-4c8c-9bf8-8a6e6278238c" containerName="splitter-graph-3d312" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:55:39.721969 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:55:39.721929 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-3d312-85f4c6c6c5-f8b9n" podUID="151758fa-1374-4c8c-9bf8-8a6e6278238c" containerName="splitter-graph-3d312" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:55:39.722350 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:55:39.722037 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-3d312-85f4c6c6c5-f8b9n" Apr 16 22:55:44.721296 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:55:44.721253 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-3d312-85f4c6c6c5-f8b9n" podUID="151758fa-1374-4c8c-9bf8-8a6e6278238c" containerName="splitter-graph-3d312" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:55:49.721307 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:55:49.721261 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-3d312-85f4c6c6c5-f8b9n" podUID="151758fa-1374-4c8c-9bf8-8a6e6278238c" containerName="splitter-graph-3d312" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:55:54.721924 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:55:54.721884 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-3d312-85f4c6c6c5-f8b9n" podUID="151758fa-1374-4c8c-9bf8-8a6e6278238c" containerName="splitter-graph-3d312" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:55:57.215573 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:55:57.215544 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-3d312-85f4c6c6c5-f8b9n" Apr 16 22:55:57.337323 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:55:57.337228 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/151758fa-1374-4c8c-9bf8-8a6e6278238c-proxy-tls\") pod \"151758fa-1374-4c8c-9bf8-8a6e6278238c\" (UID: \"151758fa-1374-4c8c-9bf8-8a6e6278238c\") " Apr 16 22:55:57.337323 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:55:57.337298 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/151758fa-1374-4c8c-9bf8-8a6e6278238c-openshift-service-ca-bundle\") pod \"151758fa-1374-4c8c-9bf8-8a6e6278238c\" (UID: \"151758fa-1374-4c8c-9bf8-8a6e6278238c\") " Apr 16 22:55:57.337703 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:55:57.337676 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/151758fa-1374-4c8c-9bf8-8a6e6278238c-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "151758fa-1374-4c8c-9bf8-8a6e6278238c" (UID: "151758fa-1374-4c8c-9bf8-8a6e6278238c"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:55:57.339435 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:55:57.339392 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/151758fa-1374-4c8c-9bf8-8a6e6278238c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "151758fa-1374-4c8c-9bf8-8a6e6278238c" (UID: "151758fa-1374-4c8c-9bf8-8a6e6278238c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:55:57.438024 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:55:57.437982 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/151758fa-1374-4c8c-9bf8-8a6e6278238c-proxy-tls\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:55:57.438024 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:55:57.438011 2565 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/151758fa-1374-4c8c-9bf8-8a6e6278238c-openshift-service-ca-bundle\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 22:55:57.455403 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:55:57.455364 2565 generic.go:358] "Generic (PLEG): container finished" podID="151758fa-1374-4c8c-9bf8-8a6e6278238c" containerID="fcd44462fe4f6db8e0fc9bca2e8eeea7cc245225dd3f74d92fe8dbb4d742a66e" exitCode=0 Apr 16 22:55:57.455588 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:55:57.455455 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-3d312-85f4c6c6c5-f8b9n" Apr 16 22:55:57.455588 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:55:57.455453 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-3d312-85f4c6c6c5-f8b9n" event={"ID":"151758fa-1374-4c8c-9bf8-8a6e6278238c","Type":"ContainerDied","Data":"fcd44462fe4f6db8e0fc9bca2e8eeea7cc245225dd3f74d92fe8dbb4d742a66e"} Apr 16 22:55:57.455588 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:55:57.455497 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-3d312-85f4c6c6c5-f8b9n" event={"ID":"151758fa-1374-4c8c-9bf8-8a6e6278238c","Type":"ContainerDied","Data":"73b926bf487c2ebf2aafb1fd65d7cbfde887504932d446351b35466110564b98"} Apr 16 22:55:57.455588 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:55:57.455514 2565 scope.go:117] "RemoveContainer" containerID="fcd44462fe4f6db8e0fc9bca2e8eeea7cc245225dd3f74d92fe8dbb4d742a66e" Apr 16 22:55:57.464188 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:55:57.464168 2565 scope.go:117] "RemoveContainer" containerID="fcd44462fe4f6db8e0fc9bca2e8eeea7cc245225dd3f74d92fe8dbb4d742a66e" Apr 16 22:55:57.464491 ip-10-0-141-169 kubenswrapper[2565]: E0416 22:55:57.464471 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcd44462fe4f6db8e0fc9bca2e8eeea7cc245225dd3f74d92fe8dbb4d742a66e\": container with ID starting with fcd44462fe4f6db8e0fc9bca2e8eeea7cc245225dd3f74d92fe8dbb4d742a66e not found: ID does not exist" containerID="fcd44462fe4f6db8e0fc9bca2e8eeea7cc245225dd3f74d92fe8dbb4d742a66e" Apr 16 22:55:57.464552 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:55:57.464498 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcd44462fe4f6db8e0fc9bca2e8eeea7cc245225dd3f74d92fe8dbb4d742a66e"} err="failed to get container status 
\"fcd44462fe4f6db8e0fc9bca2e8eeea7cc245225dd3f74d92fe8dbb4d742a66e\": rpc error: code = NotFound desc = could not find container \"fcd44462fe4f6db8e0fc9bca2e8eeea7cc245225dd3f74d92fe8dbb4d742a66e\": container with ID starting with fcd44462fe4f6db8e0fc9bca2e8eeea7cc245225dd3f74d92fe8dbb4d742a66e not found: ID does not exist" Apr 16 22:55:57.475097 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:55:57.475067 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-3d312-85f4c6c6c5-f8b9n"] Apr 16 22:55:57.478742 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:55:57.478717 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-3d312-85f4c6c6c5-f8b9n"] Apr 16 22:55:59.018645 ip-10-0-141-169 kubenswrapper[2565]: I0416 22:55:59.018608 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="151758fa-1374-4c8c-9bf8-8a6e6278238c" path="/var/lib/kubelet/pods/151758fa-1374-4c8c-9bf8-8a6e6278238c/volumes" Apr 16 23:02:47.259754 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:02:47.259722 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-a9d3d-89447d64c-pq54k"] Apr 16 23:02:47.262092 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:02:47.259944 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-a9d3d-89447d64c-pq54k" podUID="4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507" containerName="switch-graph-a9d3d" containerID="cri-o://5586ff1a62bf6e1717c79ecea2b583eafff2c806b541cfcdb94e1c568c3e6b30" gracePeriod=30 Apr 16 23:02:50.575845 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:02:50.575807 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-a9d3d-89447d64c-pq54k" podUID="4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507" containerName="switch-graph-a9d3d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 23:02:55.575522 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:02:55.575477 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-a9d3d-89447d64c-pq54k" podUID="4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507" containerName="switch-graph-a9d3d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 23:03:00.575734 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:00.575642 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-a9d3d-89447d64c-pq54k" podUID="4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507" containerName="switch-graph-a9d3d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 23:03:00.576165 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:00.575756 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-a9d3d-89447d64c-pq54k" Apr 16 23:03:01.741170 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:01.741131 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-a9d3d-89447d64c-pq54k_4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507/switch-graph-a9d3d/0.log" Apr 16 23:03:02.537010 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:02.536976 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-a9d3d-89447d64c-pq54k_4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507/switch-graph-a9d3d/0.log" Apr 16 23:03:03.374875 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:03.374848 2565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_switch-graph-a9d3d-89447d64c-pq54k_4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507/switch-graph-a9d3d/0.log" Apr 16 23:03:04.156329 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:04.156300 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-a9d3d-89447d64c-pq54k_4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507/switch-graph-a9d3d/0.log" Apr 16 23:03:04.909241 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:04.909211 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-a9d3d-89447d64c-pq54k_4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507/switch-graph-a9d3d/0.log" Apr 16 23:03:05.575519 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:05.575475 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-a9d3d-89447d64c-pq54k" podUID="4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507" containerName="switch-graph-a9d3d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 23:03:05.688560 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:05.688526 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-a9d3d-89447d64c-pq54k_4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507/switch-graph-a9d3d/0.log" Apr 16 23:03:06.451273 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:06.451243 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-a9d3d-89447d64c-pq54k_4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507/switch-graph-a9d3d/0.log" Apr 16 23:03:07.254451 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:07.254420 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-a9d3d-89447d64c-pq54k_4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507/switch-graph-a9d3d/0.log" Apr 16 23:03:08.009427 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:08.009387 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-a9d3d-89447d64c-pq54k_4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507/switch-graph-a9d3d/0.log" Apr 16 23:03:08.784463 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:08.784434 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-a9d3d-89447d64c-pq54k_4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507/switch-graph-a9d3d/0.log" Apr 16 23:03:09.592771 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:09.592740 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-a9d3d-89447d64c-pq54k_4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507/switch-graph-a9d3d/0.log" Apr 16 23:03:10.416545 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:10.416515 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-a9d3d-89447d64c-pq54k_4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507/switch-graph-a9d3d/0.log" Apr 16 23:03:10.575739 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:10.575702 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-a9d3d-89447d64c-pq54k" podUID="4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507" containerName="switch-graph-a9d3d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 23:03:15.224694 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:15.224662 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-dtz9c_d29e3c30-fd45-44b0-9689-dac93ee43db6/global-pull-secret-syncer/0.log" Apr 16 23:03:15.305205 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:15.305173 2565 
log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-dt42x_db66a2b0-751f-4d43-8765-6ecfb4ef22eb/konnectivity-agent/0.log" Apr 16 23:03:15.399826 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:15.399792 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-169.ec2.internal_a0fd34b83b170bf49f4fc19c252e8bbc/haproxy/0.log" Apr 16 23:03:15.576101 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:15.576008 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-a9d3d-89447d64c-pq54k" podUID="4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507" containerName="switch-graph-a9d3d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 23:03:17.396106 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:17.396083 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-a9d3d-89447d64c-pq54k" Apr 16 23:03:17.397329 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:17.397308 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507-proxy-tls\") pod \"4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507\" (UID: \"4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507\") " Apr 16 23:03:17.397458 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:17.397365 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507-openshift-service-ca-bundle\") pod \"4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507\" (UID: \"4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507\") " Apr 16 23:03:17.397690 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:17.397670 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507" (UID: "4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:03:17.399361 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:17.399339 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507" (UID: "4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:03:17.498002 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:17.497963 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507-proxy-tls\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 23:03:17.498002 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:17.497993 2565 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507-openshift-service-ca-bundle\") on node \"ip-10-0-141-169.ec2.internal\" DevicePath \"\"" Apr 16 23:03:17.899994 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:17.899956 2565 generic.go:358] "Generic (PLEG): container finished" podID="4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507" containerID="5586ff1a62bf6e1717c79ecea2b583eafff2c806b541cfcdb94e1c568c3e6b30" exitCode=0 Apr 16 23:03:17.900148 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:17.900022 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-a9d3d-89447d64c-pq54k" Apr 16 23:03:17.900148 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:17.900025 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-a9d3d-89447d64c-pq54k" event={"ID":"4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507","Type":"ContainerDied","Data":"5586ff1a62bf6e1717c79ecea2b583eafff2c806b541cfcdb94e1c568c3e6b30"} Apr 16 23:03:17.900148 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:17.900056 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-a9d3d-89447d64c-pq54k" event={"ID":"4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507","Type":"ContainerDied","Data":"542edea5837b66b92200ca237176a3d602f91666da414bcd7e44c2a97024b25e"} Apr 16 23:03:17.900148 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:17.900072 2565 scope.go:117] "RemoveContainer" containerID="5586ff1a62bf6e1717c79ecea2b583eafff2c806b541cfcdb94e1c568c3e6b30" Apr 16 23:03:17.914946 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:17.914924 2565 scope.go:117] "RemoveContainer" containerID="5586ff1a62bf6e1717c79ecea2b583eafff2c806b541cfcdb94e1c568c3e6b30" Apr 16 23:03:17.915209 ip-10-0-141-169 kubenswrapper[2565]: E0416 23:03:17.915190 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5586ff1a62bf6e1717c79ecea2b583eafff2c806b541cfcdb94e1c568c3e6b30\": container with ID starting with 5586ff1a62bf6e1717c79ecea2b583eafff2c806b541cfcdb94e1c568c3e6b30 not found: ID does not exist" containerID="5586ff1a62bf6e1717c79ecea2b583eafff2c806b541cfcdb94e1c568c3e6b30" Apr 16 23:03:17.915258 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:17.915218 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5586ff1a62bf6e1717c79ecea2b583eafff2c806b541cfcdb94e1c568c3e6b30"} err="failed to get container status \"5586ff1a62bf6e1717c79ecea2b583eafff2c806b541cfcdb94e1c568c3e6b30\": rpc error: code = NotFound desc = could not find container \"5586ff1a62bf6e1717c79ecea2b583eafff2c806b541cfcdb94e1c568c3e6b30\": container with ID starting with 5586ff1a62bf6e1717c79ecea2b583eafff2c806b541cfcdb94e1c568c3e6b30 not found: ID does not exist" Apr 16 23:03:17.923735 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:17.923710 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/switch-graph-a9d3d-89447d64c-pq54k"] Apr 16 23:03:17.927731 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:17.927711 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-a9d3d-89447d64c-pq54k"] Apr 16 23:03:18.996600 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:18.996570 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5563b85d-55cd-4297-b5e3-6cdfeaed90f0/alertmanager/0.log" Apr 16 23:03:19.019163 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:19.019128 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507" path="/var/lib/kubelet/pods/4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507/volumes" Apr 16 23:03:19.020344 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:19.020321 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5563b85d-55cd-4297-b5e3-6cdfeaed90f0/config-reloader/0.log" Apr 16 23:03:19.042275 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:19.042247 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5563b85d-55cd-4297-b5e3-6cdfeaed90f0/kube-rbac-proxy-web/0.log" Apr 16 23:03:19.064081 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:19.064055 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5563b85d-55cd-4297-b5e3-6cdfeaed90f0/kube-rbac-proxy/0.log" Apr 16 23:03:19.086996 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:19.086969 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5563b85d-55cd-4297-b5e3-6cdfeaed90f0/kube-rbac-proxy-metric/0.log" Apr 16 23:03:19.111057 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:19.110993 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5563b85d-55cd-4297-b5e3-6cdfeaed90f0/prom-label-proxy/0.log" Apr 16 23:03:19.135782 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:19.135759 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5563b85d-55cd-4297-b5e3-6cdfeaed90f0/init-config-reloader/0.log" Apr 16 23:03:19.178212 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:19.178183 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-p52zz_5a0b0ab0-57d0-4640-93b0-5859a8af3aa5/cluster-monitoring-operator/0.log" Apr 16 23:03:19.199299 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:19.199269 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-xvw5s_36aa52bd-69d4-437f-9e73-2f05a7ae660e/kube-state-metrics/0.log" Apr 16 23:03:19.219491 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:19.219463 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-xvw5s_36aa52bd-69d4-437f-9e73-2f05a7ae660e/kube-rbac-proxy-main/0.log" Apr 16 23:03:19.244361 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:19.244336 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-xvw5s_36aa52bd-69d4-437f-9e73-2f05a7ae660e/kube-rbac-proxy-self/0.log" Apr 16 23:03:19.298290 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:19.298263 2565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-kf5c5_d72fa396-b4b8-4dc3-995e-8f06c575edb7/monitoring-plugin/0.log" Apr 16 23:03:19.404875 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:19.404801 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ntpqd_062d7229-a013-4380-ab02-82ecd7a903da/node-exporter/0.log" Apr 16 23:03:19.425719 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:19.425691 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ntpqd_062d7229-a013-4380-ab02-82ecd7a903da/kube-rbac-proxy/0.log" Apr 16 23:03:19.445661 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:19.445637 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ntpqd_062d7229-a013-4380-ab02-82ecd7a903da/init-textfile/0.log" Apr 16 23:03:19.787738 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:19.787709 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-56clp_e7dc2dad-9fd4-4d53-a31b-c7979e4fd34a/prometheus-operator/0.log" Apr 16 23:03:19.812937 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:19.812911 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-56clp_e7dc2dad-9fd4-4d53-a31b-c7979e4fd34a/kube-rbac-proxy/0.log" Apr 16 23:03:19.868100 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:19.868074 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-794b697c49-nqs59_680e304d-24fc-4033-a171-3051d17c3af2/telemeter-client/0.log" Apr 16 23:03:19.890087 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:19.890062 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-794b697c49-nqs59_680e304d-24fc-4033-a171-3051d17c3af2/reload/0.log" Apr 16 23:03:19.910097 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:19.910073 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-794b697c49-nqs59_680e304d-24fc-4033-a171-3051d17c3af2/kube-rbac-proxy/0.log" Apr 16 23:03:19.940668 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:19.940638 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-85f799bc7d-kk5vv_3f43fe85-9d27-4e45-b966-2c0ce4388b28/thanos-query/0.log" Apr 16 23:03:19.961768 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:19.961742 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-85f799bc7d-kk5vv_3f43fe85-9d27-4e45-b966-2c0ce4388b28/kube-rbac-proxy-web/0.log" Apr 16 23:03:19.983543 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:19.983516 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-85f799bc7d-kk5vv_3f43fe85-9d27-4e45-b966-2c0ce4388b28/kube-rbac-proxy/0.log" Apr 16 23:03:20.009948 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:20.009923 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-85f799bc7d-kk5vv_3f43fe85-9d27-4e45-b966-2c0ce4388b28/prom-label-proxy/0.log" Apr 16 23:03:20.039426 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:20.039336 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-85f799bc7d-kk5vv_3f43fe85-9d27-4e45-b966-2c0ce4388b28/kube-rbac-proxy-rules/0.log" Apr 16 23:03:20.063330 ip-10-0-141-169 kubenswrapper[2565]: I0416 
23:03:20.063304 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-85f799bc7d-kk5vv_3f43fe85-9d27-4e45-b966-2c0ce4388b28/kube-rbac-proxy-metrics/0.log" Apr 16 23:03:22.561087 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:22.561053 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vzm9p/perf-node-gather-daemonset-7j7bp"] Apr 16 23:03:22.561581 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:22.561557 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507" containerName="switch-graph-a9d3d" Apr 16 23:03:22.561581 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:22.561577 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507" containerName="switch-graph-a9d3d" Apr 16 23:03:22.561706 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:22.561601 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="151758fa-1374-4c8c-9bf8-8a6e6278238c" containerName="splitter-graph-3d312" Apr 16 23:03:22.561706 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:22.561611 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="151758fa-1374-4c8c-9bf8-8a6e6278238c" containerName="splitter-graph-3d312" Apr 16 23:03:22.561800 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:22.561711 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="4dd8d588-6ada-4f33-9a7a-7e8d2bd1e507" containerName="switch-graph-a9d3d" Apr 16 23:03:22.561800 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:22.561731 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="151758fa-1374-4c8c-9bf8-8a6e6278238c" containerName="splitter-graph-3d312" Apr 16 23:03:22.564870 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:22.564851 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-7j7bp" Apr 16 23:03:22.567352 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:22.567331 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vzm9p\"/\"kube-root-ca.crt\"" Apr 16 23:03:22.567469 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:22.567333 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-vzm9p\"/\"default-dockercfg-bgrbr\"" Apr 16 23:03:22.568015 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:22.568002 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vzm9p\"/\"openshift-service-ca.crt\"" Apr 16 23:03:22.571829 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:22.571573 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vzm9p/perf-node-gather-daemonset-7j7bp"] Apr 16 23:03:22.639583 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:22.639541 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2fb8336a-8abd-4343-8c7f-a4b59b60e18f-sys\") pod \"perf-node-gather-daemonset-7j7bp\" (UID: \"2fb8336a-8abd-4343-8c7f-a4b59b60e18f\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-7j7bp" Apr 16 23:03:22.639747 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:22.639647 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2fb8336a-8abd-4343-8c7f-a4b59b60e18f-proc\") pod \"perf-node-gather-daemonset-7j7bp\" (UID: \"2fb8336a-8abd-4343-8c7f-a4b59b60e18f\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-7j7bp" Apr 16 23:03:22.639747 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:22.639670 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2fb8336a-8abd-4343-8c7f-a4b59b60e18f-lib-modules\") pod \"perf-node-gather-daemonset-7j7bp\" (UID: \"2fb8336a-8abd-4343-8c7f-a4b59b60e18f\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-7j7bp" Apr 16 23:03:22.639747 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:22.639694 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2fb8336a-8abd-4343-8c7f-a4b59b60e18f-podres\") pod \"perf-node-gather-daemonset-7j7bp\" (UID: \"2fb8336a-8abd-4343-8c7f-a4b59b60e18f\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-7j7bp" Apr 16 23:03:22.639747 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:22.639718 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j96qr\" (UniqueName: \"kubernetes.io/projected/2fb8336a-8abd-4343-8c7f-a4b59b60e18f-kube-api-access-j96qr\") pod \"perf-node-gather-daemonset-7j7bp\" (UID: \"2fb8336a-8abd-4343-8c7f-a4b59b60e18f\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-7j7bp" Apr 16 23:03:22.740373 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:22.740334 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2fb8336a-8abd-4343-8c7f-a4b59b60e18f-proc\") pod \"perf-node-gather-daemonset-7j7bp\" (UID: \"2fb8336a-8abd-4343-8c7f-a4b59b60e18f\") " 
pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-7j7bp" Apr 16 23:03:22.740373 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:22.740373 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2fb8336a-8abd-4343-8c7f-a4b59b60e18f-lib-modules\") pod \"perf-node-gather-daemonset-7j7bp\" (UID: \"2fb8336a-8abd-4343-8c7f-a4b59b60e18f\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-7j7bp" Apr 16 23:03:22.740565 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:22.740395 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2fb8336a-8abd-4343-8c7f-a4b59b60e18f-podres\") pod \"perf-node-gather-daemonset-7j7bp\" (UID: \"2fb8336a-8abd-4343-8c7f-a4b59b60e18f\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-7j7bp" Apr 16 23:03:22.740565 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:22.740480 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2fb8336a-8abd-4343-8c7f-a4b59b60e18f-proc\") pod \"perf-node-gather-daemonset-7j7bp\" (UID: \"2fb8336a-8abd-4343-8c7f-a4b59b60e18f\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-7j7bp" Apr 16 23:03:22.740565 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:22.740498 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j96qr\" (UniqueName: \"kubernetes.io/projected/2fb8336a-8abd-4343-8c7f-a4b59b60e18f-kube-api-access-j96qr\") pod \"perf-node-gather-daemonset-7j7bp\" (UID: \"2fb8336a-8abd-4343-8c7f-a4b59b60e18f\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-7j7bp" Apr 16 23:03:22.740565 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:22.740540 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2fb8336a-8abd-4343-8c7f-a4b59b60e18f-podres\") pod \"perf-node-gather-daemonset-7j7bp\" (UID: \"2fb8336a-8abd-4343-8c7f-a4b59b60e18f\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-7j7bp" Apr 16 23:03:22.740565 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:22.740550 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2fb8336a-8abd-4343-8c7f-a4b59b60e18f-lib-modules\") pod \"perf-node-gather-daemonset-7j7bp\" (UID: \"2fb8336a-8abd-4343-8c7f-a4b59b60e18f\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-7j7bp" Apr 16 23:03:22.740724 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:22.740583 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2fb8336a-8abd-4343-8c7f-a4b59b60e18f-sys\") pod \"perf-node-gather-daemonset-7j7bp\" (UID: \"2fb8336a-8abd-4343-8c7f-a4b59b60e18f\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-7j7bp" Apr 16 23:03:22.740724 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:22.740653 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2fb8336a-8abd-4343-8c7f-a4b59b60e18f-sys\") pod \"perf-node-gather-daemonset-7j7bp\" (UID: \"2fb8336a-8abd-4343-8c7f-a4b59b60e18f\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-7j7bp" Apr 16 23:03:22.748683 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:22.748666 2565 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-j96qr\" (UniqueName: \"kubernetes.io/projected/2fb8336a-8abd-4343-8c7f-a4b59b60e18f-kube-api-access-j96qr\") pod \"perf-node-gather-daemonset-7j7bp\" (UID: \"2fb8336a-8abd-4343-8c7f-a4b59b60e18f\") " pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-7j7bp" Apr 16 23:03:22.875443 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:22.875337 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-7j7bp" Apr 16 23:03:22.996750 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:22.996721 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vzm9p/perf-node-gather-daemonset-7j7bp"] Apr 16 23:03:22.996826 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:22.996805 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-86xzj_94b66058-cae9-48ec-b576-71611c7b606e/dns/0.log" Apr 16 23:03:22.998865 ip-10-0-141-169 kubenswrapper[2565]: W0416 23:03:22.998825 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2fb8336a_8abd_4343_8c7f_a4b59b60e18f.slice/crio-d2f567b02da4082e18f2351a654581be656b65b14a8543a1c410478a0ecd5430 WatchSource:0}: Error finding container d2f567b02da4082e18f2351a654581be656b65b14a8543a1c410478a0ecd5430: Status 404 returned error can't find the container with id d2f567b02da4082e18f2351a654581be656b65b14a8543a1c410478a0ecd5430 Apr 16 23:03:23.000507 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:23.000484 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 23:03:23.024383 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:23.024346 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-86xzj_94b66058-cae9-48ec-b576-71611c7b606e/kube-rbac-proxy/0.log" Apr 16 23:03:23.181229 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:23.181196 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-tsbd9_969daa2e-581d-4248-8104-2e50544de6b9/dns-node-resolver/0.log" Apr 16 23:03:23.613656 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:23.613628 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-wpfh2_2327dcdf-f40d-43bf-905a-1404d6e339f7/node-ca/0.log" Apr 16 23:03:23.920450 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:23.920400 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-7j7bp" event={"ID":"2fb8336a-8abd-4343-8c7f-a4b59b60e18f","Type":"ContainerStarted","Data":"f78561734f0bc9b68b3b9f4ea4a4670b9801afc9ed6030f3fc61891161251cb2"} Apr 16 23:03:23.920618 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:23.920458 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-7j7bp" event={"ID":"2fb8336a-8abd-4343-8c7f-a4b59b60e18f","Type":"ContainerStarted","Data":"d2f567b02da4082e18f2351a654581be656b65b14a8543a1c410478a0ecd5430"} Apr 16 23:03:23.920618 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:23.920552 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-7j7bp" Apr 16 23:03:23.935255 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:23.935212 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-7j7bp" 
podStartSLOduration=1.935197109 podStartE2EDuration="1.935197109s" podCreationTimestamp="2026-04-16 23:03:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:03:23.933872991 +0000 UTC m=+2977.479614417" watchObservedRunningTime="2026-04-16 23:03:23.935197109 +0000 UTC m=+2977.480938527" Apr 16 23:03:24.299373 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:24.299298 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-68f7bcf4cd-4j647_f1f1b4a3-4e97-42c1-8e7a-224c24e6fa07/router/0.log" Apr 16 23:03:24.657345 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:24.657273 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-gxxg4_e4a447af-7f68-4189-bc97-af653fe8ba76/serve-healthcheck-canary/0.log" Apr 16 23:03:25.232206 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:25.232172 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-tdvzz_0de9bb94-5ef7-41cb-8fa8-5ddbc1aafa5a/kube-rbac-proxy/0.log" Apr 16 23:03:25.252865 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:25.252841 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-tdvzz_0de9bb94-5ef7-41cb-8fa8-5ddbc1aafa5a/exporter/0.log" Apr 16 23:03:25.272831 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:25.272805 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-tdvzz_0de9bb94-5ef7-41cb-8fa8-5ddbc1aafa5a/extractor/0.log" Apr 16 23:03:29.933881 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:29.933848 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-vzm9p/perf-node-gather-daemonset-7j7bp" Apr 16 23:03:32.828840 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:32.828811 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8xzc6_122992fa-4992-414a-8573-d77e9afd6b29/kube-multus-additional-cni-plugins/0.log" Apr 16 23:03:32.849331 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:32.849302 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8xzc6_122992fa-4992-414a-8573-d77e9afd6b29/egress-router-binary-copy/0.log" Apr 16 23:03:32.869125 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:32.869100 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8xzc6_122992fa-4992-414a-8573-d77e9afd6b29/cni-plugins/0.log" Apr 16 23:03:32.891113 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:32.891079 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8xzc6_122992fa-4992-414a-8573-d77e9afd6b29/bond-cni-plugin/0.log" Apr 16 23:03:32.912104 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:32.912080 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8xzc6_122992fa-4992-414a-8573-d77e9afd6b29/routeoverride-cni/0.log" Apr 16 23:03:32.933348 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:32.933316 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8xzc6_122992fa-4992-414a-8573-d77e9afd6b29/whereabouts-cni-bincopy/0.log" Apr 16 23:03:32.954318 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:32.954291 
2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8xzc6_122992fa-4992-414a-8573-d77e9afd6b29/whereabouts-cni/0.log" Apr 16 23:03:33.202864 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:33.202828 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sgsmj_33e6dd78-7ff6-4e30-aa7d-6f6c4ac533d9/kube-multus/0.log" Apr 16 23:03:33.367808 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:33.367772 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qz5vc_e06e94b1-2063-48f2-b8a7-0d0e4193f064/network-metrics-daemon/0.log" Apr 16 23:03:33.387483 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:33.387454 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qz5vc_e06e94b1-2063-48f2-b8a7-0d0e4193f064/kube-rbac-proxy/0.log" Apr 16 23:03:34.456873 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:34.456840 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tp6wl_71427d26-ce68-4714-9505-292c288a5fdf/ovn-controller/0.log" Apr 16 23:03:34.487063 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:34.487031 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tp6wl_71427d26-ce68-4714-9505-292c288a5fdf/ovn-acl-logging/0.log" Apr 16 23:03:34.508998 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:34.508971 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tp6wl_71427d26-ce68-4714-9505-292c288a5fdf/kube-rbac-proxy-node/0.log" Apr 16 23:03:34.532040 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:34.532010 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tp6wl_71427d26-ce68-4714-9505-292c288a5fdf/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 23:03:34.561741 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:34.561713 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tp6wl_71427d26-ce68-4714-9505-292c288a5fdf/northd/0.log" Apr 16 23:03:34.581363 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:34.581336 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tp6wl_71427d26-ce68-4714-9505-292c288a5fdf/nbdb/0.log" Apr 16 23:03:34.601026 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:34.601005 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tp6wl_71427d26-ce68-4714-9505-292c288a5fdf/sbdb/0.log" Apr 16 23:03:34.693600 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:34.693563 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tp6wl_71427d26-ce68-4714-9505-292c288a5fdf/ovnkube-controller/0.log" Apr 16 23:03:35.965329 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:35.965302 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-bdcch_0b5f6846-8363-4956-b563-df34509912b0/network-check-target-container/0.log" Apr 16 23:03:36.801898 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:36.801865 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-n5bvs_3a701b0d-e8cf-4242-a773-67ba88c764b5/iptables-alerter/0.log" Apr 16 23:03:37.388248 ip-10-0-141-169 kubenswrapper[2565]: I0416 23:03:37.388220 2565 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-dg4lb_e1bdc782-5278-4724-89af-c7bf4325aea4/tuned/0.log"