Apr 22 17:32:12.880530 ip-10-0-132-165 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 17:32:12.880544 ip-10-0-132-165 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 17:32:12.880553 ip-10-0-132-165 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 17:32:12.880862 ip-10-0-132-165 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 17:32:23.106398 ip-10-0-132-165 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 17:32:23.106420 ip-10-0-132-165 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 46a9fbd15bba4bcdbd1f2de46661b190 --
Apr 22 17:34:51.123152 ip-10-0-132-165 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 17:34:51.590510 ip-10-0-132-165 kubenswrapper[2539]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:34:51.590510 ip-10-0-132-165 kubenswrapper[2539]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 17:34:51.590510 ip-10-0-132-165 kubenswrapper[2539]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:34:51.590510 ip-10-0-132-165 kubenswrapper[2539]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 17:34:51.590510 ip-10-0-132-165 kubenswrapper[2539]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:34:51.593277 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.593206 2539 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 17:34:51.600722 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600707 2539 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:34:51.600722 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600722 2539 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:34:51.600787 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600726 2539 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:34:51.600787 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600732 2539 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:34:51.600787 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600736 2539 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:34:51.600787 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600739 2539 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:34:51.600787 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600742 2539 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:34:51.600787 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600745 2539 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:34:51.600787 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600748 2539 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:34:51.600787 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600751 2539 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:34:51.600787 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600754 2539 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:34:51.600787 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600756 2539 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:34:51.600787 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600759 2539 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:34:51.600787 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600762 2539 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:34:51.600787 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600765 2539 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:34:51.600787 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600768 2539 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:34:51.600787 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600771 2539 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:34:51.600787 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600773 2539 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:34:51.600787 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600776 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:34:51.600787 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600778 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:34:51.600787 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600782 2539 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:34:51.600787 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600784 2539 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:34:51.601270 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600787 2539 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:34:51.601270 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600789 2539 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:34:51.601270 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600792 2539 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:34:51.601270 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600795 2539 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:34:51.601270 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600798 2539 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:34:51.601270 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600800 2539 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:34:51.601270 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600803 2539 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:34:51.601270 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600806 2539 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:34:51.601270 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600808 2539 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:34:51.601270 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600811 2539 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:34:51.601270 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600813 2539 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:34:51.601270 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600816 2539 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:34:51.601270 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600818 2539 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:34:51.601270 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600821 2539 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:34:51.601270 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600824 2539 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:34:51.601270 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600827 2539 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:34:51.601270 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600830 2539 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:34:51.601270 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600833 2539 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:34:51.601270 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600835 2539 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:34:51.601270 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600838 2539 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:34:51.601747 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600854 2539 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:34:51.601747 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600858 2539 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:34:51.601747 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600861 2539 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:34:51.601747 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600865 2539 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:34:51.601747 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600868 2539 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:34:51.601747 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600873 2539 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:34:51.601747 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600876 2539 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:34:51.601747 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600879 2539 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:34:51.601747 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600881 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:34:51.601747 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600884 2539 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:34:51.601747 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600887 2539 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:34:51.601747 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600889 2539 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:34:51.601747 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600892 2539 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:34:51.601747 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600895 2539 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:34:51.601747 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600907 2539 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:34:51.601747 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600910 2539 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:34:51.601747 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600913 2539 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:34:51.601747 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600918 2539 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:34:51.601747 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600921 2539 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:34:51.602258 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600924 2539 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:34:51.602258 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600926 2539 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:34:51.602258 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600929 2539 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:34:51.602258 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600931 2539 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:34:51.602258 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600934 2539 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:34:51.602258 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600937 2539 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:34:51.602258 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600940 2539 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:34:51.602258 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600942 2539 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:34:51.602258 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600946 2539 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:34:51.602258 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600948 2539 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:34:51.602258 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600951 2539 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:34:51.602258 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600954 2539 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:34:51.602258 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600957 2539 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:34:51.602258 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600959 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:34:51.602258 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600962 2539 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:34:51.602258 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600964 2539 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:34:51.602258 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600967 2539 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:34:51.602258 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600969 2539 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:34:51.602258 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600972 2539 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:34:51.602258 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600976 2539 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:34:51.602757 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600978 2539 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:34:51.602757 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600981 2539 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:34:51.602757 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600984 2539 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:34:51.602757 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600986 2539 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:34:51.602757 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.600989 2539 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:34:51.602757 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601360 2539 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:34:51.602757 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601366 2539 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:34:51.602757 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601369 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:34:51.602757 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601372 2539 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:34:51.602757 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601375 2539 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:34:51.602757 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601378 2539 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:34:51.602757 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601380 2539 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:34:51.602757 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601384 2539 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:34:51.602757 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601386 2539 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:34:51.602757 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601389 2539 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:34:51.602757 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601391 2539 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:34:51.602757 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601394 2539 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:34:51.602757 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601396 2539 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:34:51.602757 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601399 2539 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:34:51.603236 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601401 2539 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:34:51.603236 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601405 2539 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:34:51.603236 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601408 2539 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:34:51.603236 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601410 2539 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:34:51.603236 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601413 2539 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:34:51.603236 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601415 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:34:51.603236 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601418 2539 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:34:51.603236 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601420 2539 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:34:51.603236 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601422 2539 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:34:51.603236 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601425 2539 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:34:51.603236 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601428 2539 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:34:51.603236 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601430 2539 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:34:51.603236 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601433 2539 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:34:51.603236 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601436 2539 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:34:51.603236 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601440 2539 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:34:51.603236 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601443 2539 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:34:51.603236 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601446 2539 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:34:51.603236 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601449 2539 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:34:51.603236 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601452 2539 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:34:51.603709 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601456 2539 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:34:51.603709 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601458 2539 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:34:51.603709 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601461 2539 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:34:51.603709 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601464 2539 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:34:51.603709 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601466 2539 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:34:51.603709 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601469 2539 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:34:51.603709 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601471 2539 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:34:51.603709 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601474 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:34:51.603709 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601476 2539 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:34:51.603709 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601479 2539 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:34:51.603709 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601482 2539 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:34:51.603709 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601486 2539 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:34:51.603709 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601489 2539 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:34:51.603709 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601491 2539 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:34:51.603709 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601494 2539 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:34:51.603709 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601497 2539 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:34:51.603709 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601499 2539 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:34:51.603709 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601502 2539 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:34:51.603709 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601504 2539 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:34:51.603709 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601507 2539 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:34:51.604226 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601510 2539 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:34:51.604226 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601512 2539 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:34:51.604226 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601515 2539 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:34:51.604226 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601517 2539 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:34:51.604226 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601520 2539 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:34:51.604226 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601523 2539 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:34:51.604226 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601526 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:34:51.604226 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601528 2539 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:34:51.604226 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601530 2539 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:34:51.604226 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601533 2539 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:34:51.604226 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601535 2539 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:34:51.604226 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601537 2539 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:34:51.604226 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601540 2539 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:34:51.604226 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601543 2539 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:34:51.604226 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601545 2539 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:34:51.604226 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601548 2539 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:34:51.604226 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601550 2539 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:34:51.604226 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601553 2539 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:34:51.604226 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601555 2539 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:34:51.604226 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601558 2539 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:34:51.604714 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601561 2539 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:34:51.604714 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601563 2539 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:34:51.604714 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601566 2539 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:34:51.604714 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601568 2539 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:34:51.604714 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601570 2539 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:34:51.604714 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601573 2539 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:34:51.604714 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601575 2539 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:34:51.604714 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601578 2539 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:34:51.604714 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601581 2539 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:34:51.604714 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601583 2539 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:34:51.604714 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601585 2539 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:34:51.604714 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601588 2539 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:34:51.604714 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.601590 2539 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:34:51.604714 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602368 2539 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 17:34:51.604714 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602377 2539 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 17:34:51.604714 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602384 2539 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 17:34:51.604714 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602389 2539 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 17:34:51.604714 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602393 2539 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 17:34:51.604714 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602397 2539 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 17:34:51.604714 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602401 2539 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 17:34:51.604714 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602414 2539 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 17:34:51.605283 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602418 2539 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 17:34:51.605283 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602421 2539 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 17:34:51.605283 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602424 2539 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 17:34:51.605283 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602428 2539 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 17:34:51.605283 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602431 2539 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 17:34:51.605283 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602434 2539 flags.go:64] FLAG: --cgroup-root=""
Apr 22 17:34:51.605283 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602437 2539 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 17:34:51.605283 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602440 2539 flags.go:64] FLAG: --client-ca-file=""
Apr 22 17:34:51.605283 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602443 2539 flags.go:64] FLAG: --cloud-config=""
Apr 22 17:34:51.605283 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602446 2539 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 17:34:51.605283 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602449 2539 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 17:34:51.605283 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602453 2539 flags.go:64] FLAG: --cluster-domain=""
Apr 22 17:34:51.605283 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602456 2539 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 17:34:51.605283 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602459 2539 flags.go:64] FLAG: --config-dir=""
Apr 22 17:34:51.605283 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602461 2539 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 17:34:51.605283 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602465 2539 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 17:34:51.605283 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602469 2539 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 17:34:51.605283 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602472 2539 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 17:34:51.605283 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602475 2539 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 17:34:51.605283 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602478 2539 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 17:34:51.605283 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602482 2539 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 17:34:51.605283 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602485 2539 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 17:34:51.605283 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602488 2539 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 17:34:51.605283 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602492 2539 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 17:34:51.605283 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602495 2539 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 17:34:51.605911 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602499 2539 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 17:34:51.605911 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602502 2539 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 17:34:51.605911 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602505 2539 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 17:34:51.605911 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602508 2539 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 17:34:51.605911 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602511 2539 flags.go:64] FLAG: --enable-server="true"
Apr 22 17:34:51.605911 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602514 2539 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 17:34:51.605911 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602519 2539 flags.go:64] FLAG: --event-burst="100"
Apr 22 17:34:51.605911 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602522 2539 flags.go:64] FLAG: --event-qps="50"
Apr 22 17:34:51.605911 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602525 2539 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 17:34:51.605911 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602528 2539 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 17:34:51.605911 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602531 2539 flags.go:64] FLAG: --eviction-hard=""
Apr 22 17:34:51.605911 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602535 2539 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 17:34:51.605911 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602539 2539 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 17:34:51.605911 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602542 2539 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 17:34:51.605911 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602545 2539 flags.go:64] FLAG: --eviction-soft=""
Apr 22 17:34:51.605911 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602548 2539 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 17:34:51.605911 ip-10-0-132-165
kubenswrapper[2539]: I0422 17:34:51.602551 2539 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 22 17:34:51.605911 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602554 2539 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 22 17:34:51.605911 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602557 2539 flags.go:64] FLAG: --experimental-mounter-path="" Apr 22 17:34:51.605911 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602560 2539 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 17:34:51.605911 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602563 2539 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 17:34:51.605911 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602565 2539 flags.go:64] FLAG: --feature-gates="" Apr 22 17:34:51.605911 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602569 2539 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 17:34:51.605911 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602572 2539 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 17:34:51.605911 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602575 2539 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 17:34:51.606530 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602578 2539 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 17:34:51.606530 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602581 2539 flags.go:64] FLAG: --healthz-port="10248" Apr 22 17:34:51.606530 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602584 2539 flags.go:64] FLAG: --help="false" Apr 22 17:34:51.606530 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602587 2539 flags.go:64] FLAG: --hostname-override="ip-10-0-132-165.ec2.internal" Apr 22 17:34:51.606530 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602591 2539 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 17:34:51.606530 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602594 2539 flags.go:64] FLAG: 
--http-check-frequency="20s" Apr 22 17:34:51.606530 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602599 2539 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 17:34:51.606530 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602602 2539 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 17:34:51.606530 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602605 2539 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 17:34:51.606530 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602608 2539 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 17:34:51.606530 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602611 2539 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 17:34:51.606530 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602614 2539 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 17:34:51.606530 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602617 2539 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 17:34:51.606530 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602620 2539 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 17:34:51.606530 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602623 2539 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 17:34:51.606530 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602626 2539 flags.go:64] FLAG: --kube-reserved="" Apr 22 17:34:51.606530 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602629 2539 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 17:34:51.606530 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602632 2539 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 17:34:51.606530 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602635 2539 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 17:34:51.606530 ip-10-0-132-165 
kubenswrapper[2539]: I0422 17:34:51.602638 2539 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 17:34:51.606530 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602641 2539 flags.go:64] FLAG: --lock-file="" Apr 22 17:34:51.606530 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602644 2539 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 17:34:51.606530 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602647 2539 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 17:34:51.606530 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602650 2539 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 17:34:51.607115 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602655 2539 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 17:34:51.607115 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602658 2539 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 17:34:51.607115 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602661 2539 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 17:34:51.607115 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602664 2539 flags.go:64] FLAG: --logging-format="text" Apr 22 17:34:51.607115 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602666 2539 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 17:34:51.607115 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602670 2539 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 17:34:51.607115 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602673 2539 flags.go:64] FLAG: --manifest-url="" Apr 22 17:34:51.607115 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602675 2539 flags.go:64] FLAG: --manifest-url-header="" Apr 22 17:34:51.607115 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602679 2539 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 17:34:51.607115 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602682 2539 flags.go:64] FLAG: 
--max-open-files="1000000" Apr 22 17:34:51.607115 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602686 2539 flags.go:64] FLAG: --max-pods="110" Apr 22 17:34:51.607115 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602689 2539 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 17:34:51.607115 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602698 2539 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 17:34:51.607115 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602701 2539 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 17:34:51.607115 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602706 2539 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 17:34:51.607115 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602709 2539 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 17:34:51.607115 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602712 2539 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 17:34:51.607115 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602715 2539 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 17:34:51.607115 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602727 2539 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 17:34:51.607115 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602731 2539 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 17:34:51.607115 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602734 2539 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 17:34:51.607115 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602737 2539 flags.go:64] FLAG: --pod-cidr="" Apr 22 17:34:51.607115 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602740 2539 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 17:34:51.607682 ip-10-0-132-165 
kubenswrapper[2539]: I0422 17:34:51.602746 2539 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 17:34:51.607682 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602748 2539 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 17:34:51.607682 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602752 2539 flags.go:64] FLAG: --pods-per-core="0" Apr 22 17:34:51.607682 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602754 2539 flags.go:64] FLAG: --port="10250" Apr 22 17:34:51.607682 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602757 2539 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 17:34:51.607682 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602760 2539 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-04819980097529350" Apr 22 17:34:51.607682 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602763 2539 flags.go:64] FLAG: --qos-reserved="" Apr 22 17:34:51.607682 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602767 2539 flags.go:64] FLAG: --read-only-port="10255" Apr 22 17:34:51.607682 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602770 2539 flags.go:64] FLAG: --register-node="true" Apr 22 17:34:51.607682 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602773 2539 flags.go:64] FLAG: --register-schedulable="true" Apr 22 17:34:51.607682 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602776 2539 flags.go:64] FLAG: --register-with-taints="" Apr 22 17:34:51.607682 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602784 2539 flags.go:64] FLAG: --registry-burst="10" Apr 22 17:34:51.607682 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602787 2539 flags.go:64] FLAG: --registry-qps="5" Apr 22 17:34:51.607682 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602790 2539 flags.go:64] FLAG: --reserved-cpus="" Apr 22 17:34:51.607682 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602793 2539 flags.go:64] FLAG: --reserved-memory="" Apr 22 17:34:51.607682 ip-10-0-132-165 kubenswrapper[2539]: I0422 
17:34:51.602796 2539 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 17:34:51.607682 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602799 2539 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 17:34:51.607682 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602802 2539 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 17:34:51.607682 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602805 2539 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 17:34:51.607682 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602808 2539 flags.go:64] FLAG: --runonce="false" Apr 22 17:34:51.607682 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602811 2539 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 17:34:51.607682 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602815 2539 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 17:34:51.607682 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602818 2539 flags.go:64] FLAG: --seccomp-default="false" Apr 22 17:34:51.607682 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602821 2539 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 17:34:51.607682 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602826 2539 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 17:34:51.607682 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602829 2539 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 17:34:51.608313 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602832 2539 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 17:34:51.608313 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602835 2539 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 17:34:51.608313 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602838 2539 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 17:34:51.608313 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602841 2539 flags.go:64] FLAG: 
--storage-driver-table="stats" Apr 22 17:34:51.608313 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602844 2539 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 17:34:51.608313 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602846 2539 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 17:34:51.608313 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602849 2539 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 17:34:51.608313 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602852 2539 flags.go:64] FLAG: --system-cgroups="" Apr 22 17:34:51.608313 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602855 2539 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 17:34:51.608313 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602861 2539 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 17:34:51.608313 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602863 2539 flags.go:64] FLAG: --tls-cert-file="" Apr 22 17:34:51.608313 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602866 2539 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 17:34:51.608313 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602869 2539 flags.go:64] FLAG: --tls-min-version="" Apr 22 17:34:51.608313 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602872 2539 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 17:34:51.608313 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602875 2539 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 17:34:51.608313 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602878 2539 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 17:34:51.608313 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602881 2539 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 17:34:51.608313 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602884 2539 flags.go:64] FLAG: --v="2" Apr 22 17:34:51.608313 ip-10-0-132-165 kubenswrapper[2539]: I0422 
17:34:51.602888 2539 flags.go:64] FLAG: --version="false" Apr 22 17:34:51.608313 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602892 2539 flags.go:64] FLAG: --vmodule="" Apr 22 17:34:51.608313 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602896 2539 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 17:34:51.608313 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.602910 2539 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 17:34:51.608313 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603001 2539 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 17:34:51.608313 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603004 2539 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 17:34:51.608888 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603009 2539 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 17:34:51.608888 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603012 2539 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 17:34:51.608888 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603015 2539 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 17:34:51.608888 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603017 2539 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 17:34:51.608888 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603020 2539 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 17:34:51.608888 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603023 2539 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 17:34:51.608888 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603027 2539 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 17:34:51.608888 ip-10-0-132-165 
kubenswrapper[2539]: W0422 17:34:51.603029 2539 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 17:34:51.608888 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603032 2539 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 17:34:51.608888 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603034 2539 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 17:34:51.608888 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603037 2539 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 17:34:51.608888 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603040 2539 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 17:34:51.608888 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603042 2539 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 17:34:51.608888 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603045 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 17:34:51.608888 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603047 2539 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 17:34:51.608888 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603050 2539 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 17:34:51.608888 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603053 2539 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 17:34:51.608888 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603055 2539 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 17:34:51.608888 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603058 2539 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 17:34:51.609466 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603060 2539 feature_gate.go:328] unrecognized 
feature gate: MultiArchInstallAzure Apr 22 17:34:51.609466 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603063 2539 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 17:34:51.609466 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603066 2539 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 17:34:51.609466 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603068 2539 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 17:34:51.609466 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603071 2539 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 17:34:51.609466 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603073 2539 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 17:34:51.609466 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603077 2539 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 17:34:51.609466 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603081 2539 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:34:51.609466 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603084 2539 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:34:51.609466 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603087 2539 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:34:51.609466 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603089 2539 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:34:51.609466 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603091 2539 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:34:51.609466 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603094 2539 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:34:51.609466 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603098 2539 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:34:51.609466 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603100 2539 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:34:51.609466 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603103 2539 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:34:51.609466 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603105 2539 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:34:51.609466 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603108 2539 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:34:51.609466 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603111 2539 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:34:51.610241 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603115 2539 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:34:51.610241 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603118 2539 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:34:51.610241 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603120 2539 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:34:51.610241 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603123 2539 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:34:51.610241 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603126 2539 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:34:51.610241 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603128 2539 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:34:51.610241 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603131 2539 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:34:51.610241 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603133 2539 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:34:51.610241 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603136 2539 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:34:51.610241 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603138 2539 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:34:51.610241 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603141 2539 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:34:51.610241 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603143 2539 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:34:51.610241 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603146 2539 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:34:51.610241 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603149 2539 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:34:51.610241 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603151 2539 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:34:51.610241 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603154 2539 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:34:51.610241 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603156 2539 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:34:51.610241 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603159 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:34:51.610241 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603161 2539 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:34:51.610241 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603164 2539 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:34:51.610241 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603166 2539 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:34:51.610795 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603169 2539 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:34:51.610795 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603171 2539 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:34:51.610795 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603174 2539 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:34:51.610795 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603176 2539 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:34:51.610795 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603179 2539 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:34:51.610795 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603185 2539 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:34:51.610795 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603188 2539 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:34:51.610795 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603192 2539 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:34:51.610795 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603195 2539 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:34:51.610795 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603197 2539 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:34:51.610795 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603200 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:34:51.610795 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603204 2539 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:34:51.610795 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603207 2539 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:34:51.610795 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603210 2539 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:34:51.610795 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603212 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:34:51.610795 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603215 2539 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:34:51.610795 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603217 2539 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:34:51.610795 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603220 2539 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:34:51.610795 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603222 2539 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:34:51.610795 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603225 2539 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:34:51.611411 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603228 2539 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:34:51.611411 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603230 2539 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:34:51.611411 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603233 2539 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:34:51.611411 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603235 2539 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:34:51.611411 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.603238 2539 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:34:51.611411 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.604078 2539 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 17:34:51.612460 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.612442 2539 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 17:34:51.612499 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.612461 2539 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 17:34:51.612527 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612509 2539 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:34:51.612527 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612515 2539 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:34:51.612527 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612520 2539 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:34:51.612527 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612524 2539 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:34:51.612527 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612527 2539 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:34:51.612650 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612530 2539 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:34:51.612650 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612533 2539 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:34:51.612650 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612537 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:34:51.612650 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612540 2539 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:34:51.612650 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612543 2539 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:34:51.612650 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612546 2539 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:34:51.612650 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612549
2539 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 17:34:51.612650 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612551 2539 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 17:34:51.612650 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612555 2539 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 17:34:51.612650 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612558 2539 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 17:34:51.612650 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612560 2539 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 17:34:51.612650 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612563 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 17:34:51.612650 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612566 2539 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 17:34:51.612650 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612568 2539 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 17:34:51.612650 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612571 2539 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 17:34:51.612650 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612574 2539 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 17:34:51.612650 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612576 2539 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 17:34:51.612650 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612579 2539 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 17:34:51.612650 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612582 2539 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 17:34:51.612650 
ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612584 2539 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 17:34:51.613165 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612587 2539 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 17:34:51.613165 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612590 2539 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 17:34:51.613165 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612592 2539 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 17:34:51.613165 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612595 2539 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 17:34:51.613165 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612597 2539 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 17:34:51.613165 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612600 2539 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 17:34:51.613165 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612602 2539 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 17:34:51.613165 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612605 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 17:34:51.613165 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612607 2539 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 17:34:51.613165 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612610 2539 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 17:34:51.613165 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612613 2539 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 17:34:51.613165 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612615 2539 
feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 17:34:51.613165 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612618 2539 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 17:34:51.613165 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612621 2539 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 17:34:51.613165 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612624 2539 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 17:34:51.613165 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612627 2539 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 17:34:51.613165 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612630 2539 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 17:34:51.613165 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612632 2539 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 17:34:51.613165 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612635 2539 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 17:34:51.613165 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612638 2539 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 17:34:51.613649 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612640 2539 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 17:34:51.613649 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612643 2539 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 17:34:51.613649 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612645 2539 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 17:34:51.613649 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612648 2539 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 17:34:51.613649 ip-10-0-132-165 
kubenswrapper[2539]: W0422 17:34:51.612651 2539 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 17:34:51.613649 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612653 2539 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 17:34:51.613649 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612656 2539 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 17:34:51.613649 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612659 2539 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 17:34:51.613649 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612661 2539 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 17:34:51.613649 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612665 2539 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 17:34:51.613649 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612668 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 17:34:51.613649 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612670 2539 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 17:34:51.613649 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612673 2539 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 17:34:51.613649 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612677 2539 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 17:34:51.613649 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612680 2539 feature_gate.go:328] unrecognized feature gate: Example Apr 22 17:34:51.613649 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612683 2539 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 17:34:51.613649 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612686 2539 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 17:34:51.613649 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612688 2539 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 17:34:51.613649 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612691 2539 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 17:34:51.613649 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612693 2539 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 17:34:51.614150 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612696 2539 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 17:34:51.614150 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612699 2539 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 17:34:51.614150 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612701 2539 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 17:34:51.614150 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612704 2539 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 17:34:51.614150 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612706 2539 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 17:34:51.614150 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612710 2539 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 17:34:51.614150 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612712 2539 feature_gate.go:328] unrecognized feature 
gate: NetworkDiagnosticsConfig Apr 22 17:34:51.614150 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612715 2539 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 17:34:51.614150 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612718 2539 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 17:34:51.614150 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612720 2539 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 17:34:51.614150 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612723 2539 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 17:34:51.614150 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612726 2539 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 17:34:51.614150 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612729 2539 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 17:34:51.614150 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612731 2539 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 17:34:51.614150 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612734 2539 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 17:34:51.614150 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612736 2539 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 17:34:51.614150 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612739 2539 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 17:34:51.614150 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612741 2539 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 17:34:51.614150 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612744 2539 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 17:34:51.614150 ip-10-0-132-165 
kubenswrapper[2539]: W0422 17:34:51.612746 2539 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 17:34:51.614659 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612749 2539 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 17:34:51.614659 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.612754 2539 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 17:34:51.614659 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612849 2539 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 17:34:51.614659 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612854 2539 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 17:34:51.614659 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612857 2539 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 17:34:51.614659 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612860 2539 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 17:34:51.614659 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612863 2539 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 17:34:51.614659 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612866 2539 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 17:34:51.614659 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612868 2539 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 17:34:51.614659 
ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612871 2539 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 17:34:51.614659 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612874 2539 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 17:34:51.614659 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612877 2539 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 17:34:51.614659 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612879 2539 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 17:34:51.614659 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612882 2539 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 17:34:51.614659 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612885 2539 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 17:34:51.615035 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612887 2539 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 17:34:51.615035 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612890 2539 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 17:34:51.615035 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612893 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 17:34:51.615035 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612896 2539 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 17:34:51.615035 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612917 2539 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 17:34:51.615035 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612922 2539 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 17:34:51.615035 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612925 2539 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 17:34:51.615035 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612928 2539 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 17:34:51.615035 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612931 2539 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 17:34:51.615035 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612934 2539 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 17:34:51.615035 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612937 2539 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 17:34:51.615035 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612939 2539 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 17:34:51.615035 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612942 2539 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 17:34:51.615035 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612944 2539 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 17:34:51.615035 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612947 2539 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 17:34:51.615035 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612949 2539 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 17:34:51.615035 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612952 2539 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 17:34:51.615035 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612954 2539 feature_gate.go:328] unrecognized feature gate: 
AzureClusterHostedDNSInstall Apr 22 17:34:51.615035 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612957 2539 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 17:34:51.615520 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612959 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 17:34:51.615520 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612962 2539 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 17:34:51.615520 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612964 2539 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 17:34:51.615520 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612967 2539 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 17:34:51.615520 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612969 2539 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 17:34:51.615520 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612972 2539 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 17:34:51.615520 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612974 2539 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 17:34:51.615520 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612977 2539 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 17:34:51.615520 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612979 2539 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 17:34:51.615520 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612981 2539 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 17:34:51.615520 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612984 2539 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 17:34:51.615520 ip-10-0-132-165 kubenswrapper[2539]: W0422 
17:34:51.612986 2539 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 17:34:51.615520 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612989 2539 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 17:34:51.615520 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612992 2539 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 17:34:51.615520 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612994 2539 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 17:34:51.615520 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612997 2539 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 17:34:51.615520 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.612999 2539 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 17:34:51.615520 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.613002 2539 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 17:34:51.615520 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.613004 2539 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 17:34:51.615520 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.613008 2539 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 17:34:51.616047 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.613011 2539 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 17:34:51.616047 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.613014 2539 feature_gate.go:328] unrecognized feature gate: Example Apr 22 17:34:51.616047 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.613017 2539 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 17:34:51.616047 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.613020 2539 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 17:34:51.616047 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.613023 2539 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 17:34:51.616047 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.613026 2539 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 17:34:51.616047 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.613028 2539 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 17:34:51.616047 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.613031 2539 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 17:34:51.616047 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.613033 2539 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 17:34:51.616047 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.613036 2539 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 17:34:51.616047 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.613038 2539 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 17:34:51.616047 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.613041 2539 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 17:34:51.616047 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.613044 
2539 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 17:34:51.616047 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.613046 2539 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 17:34:51.616047 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.613049 2539 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 17:34:51.616047 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.613051 2539 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 17:34:51.616047 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.613053 2539 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 17:34:51.616047 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.613056 2539 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 17:34:51.616047 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.613059 2539 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 17:34:51.616047 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.613061 2539 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 17:34:51.616535 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.613064 2539 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 17:34:51.616535 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.613066 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 17:34:51.616535 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.613069 2539 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 17:34:51.616535 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.613071 2539 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 17:34:51.616535 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.613073 2539 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 17:34:51.616535 
ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.613076 2539 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 17:34:51.616535 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.613078 2539 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 17:34:51.616535 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.613081 2539 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 17:34:51.616535 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.613084 2539 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 17:34:51.616535 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.613086 2539 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 17:34:51.616535 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.613089 2539 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 17:34:51.616535 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.613091 2539 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 17:34:51.616535 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.613094 2539 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 17:34:51.616535 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:51.613097 2539 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 17:34:51.616535 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.613102 2539 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 
17:34:51.616535 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.613915 2539 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 17:34:51.618041 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.618028 2539 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 17:34:51.619038 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.619026 2539 server.go:1019] "Starting client certificate rotation" Apr 22 17:34:51.619144 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.619126 2539 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 17:34:51.619181 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.619165 2539 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 17:34:51.647050 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.647033 2539 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 17:34:51.649944 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.649925 2539 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 17:34:51.663493 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.663470 2539 log.go:25] "Validated CRI v1 runtime API" Apr 22 17:34:51.669134 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.669120 2539 log.go:25] "Validated CRI v1 image API" Apr 22 17:34:51.672462 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.672445 2539 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 17:34:51.678214 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.678186 2539 fs.go:135] Filesystem UUIDs: map[63279240-817c-49cd-8977-6333fa5da3fd:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 af9899fa-43a1-494c-8ef8-cb8300bdeadc:/dev/nvme0n1p3] Apr 22 
17:34:51.678300 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.678208 2539 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 17:34:51.678995 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.678976 2539 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 17:34:51.684069 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.683952 2539 manager.go:217] Machine: {Timestamp:2026-04-22 17:34:51.682009754 +0000 UTC m=+0.438094329 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098921 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2fd8625a1c2616602fb7090dbe917e SystemUUID:ec2fd862-5a1c-2616-602f-b7090dbe917e BootID:46a9fbd1-5bba-4bcd-bd1f-2de46661b190 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} 
{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:71:51:f3:45:d3 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:71:51:f3:45:d3 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:5a:18:62:2a:56:47 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 17:34:51.684069 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.684057 2539 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 22 17:34:51.684229 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.684148 2539 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 17:34:51.685181 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.685154 2539 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 17:34:51.685343 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.685183 2539 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-132-165.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 17:34:51.685428 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.685356 2539 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 17:34:51.685428 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.685369 2539 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 17:34:51.685428 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.685387 2539 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 17:34:51.685428 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.685401 2539 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 17:34:51.686865 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.686851 2539 state_mem.go:36] "Initialized new in-memory state store" Apr 22 17:34:51.687044 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.687031 2539 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 17:34:51.689445 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.689433 2539 kubelet.go:491] "Attempting to sync node with API server" Apr 22 17:34:51.689511 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.689450 2539 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 17:34:51.689511 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.689466 2539 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 17:34:51.689511 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.689477 2539 kubelet.go:397] "Adding apiserver pod source" Apr 22 17:34:51.689511 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.689497 2539 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 22 17:34:51.690631 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.690617 2539 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 17:34:51.690706 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.690641 2539 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 17:34:51.694328 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.694312 2539 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 17:34:51.695830 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.695818 2539 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 17:34:51.698998 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.698979 2539 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 17:34:51.699068 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.699009 2539 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 17:34:51.699068 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.699022 2539 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 17:34:51.699068 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.699033 2539 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 17:34:51.699068 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.699045 2539 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 17:34:51.699068 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.699056 2539 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 17:34:51.699068 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.699067 2539 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 22 17:34:51.699223 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.699078 2539 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 17:34:51.699223 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.699091 2539 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 17:34:51.699223 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.699102 2539 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 17:34:51.699223 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.699130 2539 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 17:34:51.699223 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.699145 2539 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 17:34:51.700933 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.700921 2539 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 17:34:51.700966 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.700934 2539 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 17:34:51.703281 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.703260 2539 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-132-165.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 17:34:51.703955 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:51.703930 2539 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-132-165.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 17:34:51.703955 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:51.703932 2539 
reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 17:34:51.704448 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.704436 2539 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 17:34:51.704485 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.704473 2539 server.go:1295] "Started kubelet" Apr 22 17:34:51.704580 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.704558 2539 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 17:34:51.704875 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.704729 2539 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 17:34:51.704972 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.704938 2539 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9dtlc" Apr 22 17:34:51.705009 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.704983 2539 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 17:34:51.705240 ip-10-0-132-165 systemd[1]: Started Kubernetes Kubelet. 
Apr 22 17:34:51.706134 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.706038 2539 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 17:34:51.706540 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.706528 2539 server.go:317] "Adding debug handlers to kubelet server" Apr 22 17:34:51.712521 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.712503 2539 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 17:34:51.713084 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.713065 2539 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 17:34:51.714948 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.714136 2539 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 17:34:51.714948 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.714162 2539 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 17:34:51.714948 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.714171 2539 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 17:34:51.714948 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.714335 2539 reconstruct.go:97] "Volume reconstruction finished" Apr 22 17:34:51.714948 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.714343 2539 reconciler.go:26] "Reconciler: start to sync state" Apr 22 17:34:51.715210 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.715076 2539 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9dtlc" Apr 22 17:34:51.715657 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.715643 2539 factory.go:55] Registering systemd factory Apr 22 17:34:51.715727 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.715667 2539 factory.go:223] Registration of the systemd container factory successfully Apr 22 17:34:51.715917 ip-10-0-132-165 kubenswrapper[2539]: E0422 
17:34:51.715877 2539 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-165.ec2.internal\" not found" Apr 22 17:34:51.716017 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:51.715926 2539 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 17:34:51.716017 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.715945 2539 factory.go:153] Registering CRI-O factory Apr 22 17:34:51.716017 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.715961 2539 factory.go:223] Registration of the crio container factory successfully Apr 22 17:34:51.716017 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.716013 2539 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 17:34:51.716181 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.716040 2539 factory.go:103] Registering Raw factory Apr 22 17:34:51.716181 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.716057 2539 manager.go:1196] Started watching for new ooms in manager Apr 22 17:34:51.716644 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.716632 2539 manager.go:319] Starting recovery of all containers Apr 22 17:34:51.717308 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:51.712094 2539 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-165.ec2.internal.18a8be4d2eca9463 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-165.ec2.internal,UID:ip-10-0-132-165.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-132-165.ec2.internal,},FirstTimestamp:2026-04-22 17:34:51.704448099 +0000 UTC m=+0.460532662,LastTimestamp:2026-04-22 17:34:51.704448099 +0000 UTC m=+0.460532662,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-165.ec2.internal,}" Apr 22 17:34:51.724388 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.724365 2539 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:34:51.727851 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:51.727724 2539 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-132-165.ec2.internal\" not found" node="ip-10-0-132-165.ec2.internal" Apr 22 17:34:51.727851 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.727793 2539 manager.go:324] Recovery completed Apr 22 17:34:51.729028 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:51.729006 2539 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 22 17:34:51.732265 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.732254 2539 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:51.737597 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.737579 2539 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-165.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:34:51.737667 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.737604 2539 kubelet_node_status.go:736] 
"Recording event message for node" node="ip-10-0-132-165.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:51.737667 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.737617 2539 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-165.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:51.738100 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.738083 2539 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 17:34:51.738100 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.738098 2539 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 17:34:51.738210 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.738113 2539 state_mem.go:36] "Initialized new in-memory state store" Apr 22 17:34:51.740758 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.740747 2539 policy_none.go:49] "None policy: Start" Apr 22 17:34:51.740794 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.740763 2539 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 17:34:51.740794 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.740774 2539 state_mem.go:35] "Initializing new in-memory state store" Apr 22 17:34:51.782385 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.778447 2539 manager.go:341] "Starting Device Plugin manager" Apr 22 17:34:51.782385 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:51.778477 2539 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 17:34:51.782385 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.778488 2539 server.go:85] "Starting device plugin registration server" Apr 22 17:34:51.782385 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.778699 2539 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 17:34:51.782385 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.778709 2539 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 
monitorPeriod="10s" Apr 22 17:34:51.782385 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.778864 2539 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 17:34:51.782385 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.778987 2539 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 17:34:51.782385 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.778996 2539 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 17:34:51.782385 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:51.779438 2539 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 17:34:51.782385 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:51.779474 2539 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-165.ec2.internal\" not found" Apr 22 17:34:51.839870 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.839845 2539 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 17:34:51.841022 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.840978 2539 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 17:34:51.841022 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.840999 2539 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 17:34:51.841022 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.841016 2539 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 22 17:34:51.841022 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.841022 2539 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 17:34:51.841211 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:51.841050 2539 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 17:34:51.844507 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.844490 2539 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:34:51.879539 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.879523 2539 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:51.880308 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.880280 2539 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-165.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:34:51.880308 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.880305 2539 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-165.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:51.880418 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.880315 2539 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-165.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:51.880418 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.880339 2539 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-165.ec2.internal" Apr 22 17:34:51.888270 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.888251 2539 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-165.ec2.internal" Apr 22 17:34:51.888318 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:51.888280 2539 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-132-165.ec2.internal\": node \"ip-10-0-132-165.ec2.internal\" not found" Apr 22 
17:34:51.900796 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:51.900775 2539 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-165.ec2.internal\" not found" Apr 22 17:34:51.941911 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.941860 2539 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-165.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-132-165.ec2.internal"] Apr 22 17:34:51.941973 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.941966 2539 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:51.944579 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.944558 2539 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-165.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:34:51.944646 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.944585 2539 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-165.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:51.944646 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.944597 2539 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-165.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:51.945743 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.945731 2539 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:51.945894 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.945881 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-165.ec2.internal" Apr 22 17:34:51.945945 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.945922 2539 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:51.946806 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.946791 2539 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-165.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:34:51.946806 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.946800 2539 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-165.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:34:51.946933 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.946818 2539 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-165.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:51.946933 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.946819 2539 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-165.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:51.946933 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.946833 2539 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-165.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:51.946933 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.946840 2539 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-165.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:51.948222 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.948210 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-165.ec2.internal" Apr 22 17:34:51.948264 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.948231 2539 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:51.949218 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.949206 2539 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-165.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:34:51.949282 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.949229 2539 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-165.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:51.949282 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:51.949241 2539 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-165.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:51.962418 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:51.962391 2539 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-165.ec2.internal\" not found" node="ip-10-0-132-165.ec2.internal" Apr 22 17:34:51.965956 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:51.965943 2539 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-165.ec2.internal\" not found" node="ip-10-0-132-165.ec2.internal" Apr 22 17:34:52.001178 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:52.001158 2539 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-165.ec2.internal\" not found" Apr 22 17:34:52.015823 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.015802 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/40399284665a2832f575e85e3fe44faa-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-132-165.ec2.internal\" (UID: \"40399284665a2832f575e85e3fe44faa\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-165.ec2.internal" Apr 22 17:34:52.015886 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.015826 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/40399284665a2832f575e85e3fe44faa-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-165.ec2.internal\" (UID: \"40399284665a2832f575e85e3fe44faa\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-165.ec2.internal" Apr 22 17:34:52.015886 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.015843 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/136192d3e6d2f42c0abb843d3e675799-config\") pod \"kube-apiserver-proxy-ip-10-0-132-165.ec2.internal\" (UID: \"136192d3e6d2f42c0abb843d3e675799\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-165.ec2.internal" Apr 22 17:34:52.101694 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:52.101646 2539 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-165.ec2.internal\" not found" Apr 22 17:34:52.116825 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.116803 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/40399284665a2832f575e85e3fe44faa-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-165.ec2.internal\" (UID: \"40399284665a2832f575e85e3fe44faa\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-165.ec2.internal" Apr 22 17:34:52.116877 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.116869 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/136192d3e6d2f42c0abb843d3e675799-config\") pod \"kube-apiserver-proxy-ip-10-0-132-165.ec2.internal\" (UID: \"136192d3e6d2f42c0abb843d3e675799\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-165.ec2.internal"
Apr 22 17:34:52.116935 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.116884 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/136192d3e6d2f42c0abb843d3e675799-config\") pod \"kube-apiserver-proxy-ip-10-0-132-165.ec2.internal\" (UID: \"136192d3e6d2f42c0abb843d3e675799\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-165.ec2.internal"
Apr 22 17:34:52.116935 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.116925 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/40399284665a2832f575e85e3fe44faa-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-165.ec2.internal\" (UID: \"40399284665a2832f575e85e3fe44faa\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-165.ec2.internal"
Apr 22 17:34:52.117002 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.116969 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/40399284665a2832f575e85e3fe44faa-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-165.ec2.internal\" (UID: \"40399284665a2832f575e85e3fe44faa\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-165.ec2.internal"
Apr 22 17:34:52.117002 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.116932 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/40399284665a2832f575e85e3fe44faa-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-165.ec2.internal\" (UID: \"40399284665a2832f575e85e3fe44faa\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-165.ec2.internal"
Apr 22 17:34:52.201875 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:52.201853 2539 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-165.ec2.internal\" not found"
Apr 22 17:34:52.264375 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.264347 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-165.ec2.internal"
Apr 22 17:34:52.268812 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.268791 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-165.ec2.internal"
Apr 22 17:34:52.302848 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:52.302822 2539 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-165.ec2.internal\" not found"
Apr 22 17:34:52.403395 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:52.403349 2539 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-165.ec2.internal\" not found"
Apr 22 17:34:52.503891 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:52.503871 2539 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-165.ec2.internal\" not found"
Apr 22 17:34:52.575183 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.575163 2539 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:34:52.613611 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.613595 2539 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-165.ec2.internal"
Apr 22 17:34:52.619570 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.619558 2539 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 17:34:52.619659 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.619645 2539 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 17:34:52.619704 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.619687 2539 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 17:34:52.619739 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.619693 2539 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 17:34:52.619739 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:52.619694 2539 kubelet.go:3342] "Failed creating a mirror pod" err="Post \"https://a19817953bb2f4218b67c61442f5c0d4-71ffac5d2d8d3f3d.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/openshift-machine-config-operator/pods\": read tcp 10.0.132.165:51616->32.196.19.251:6443: use of closed network connection" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-165.ec2.internal"
Apr 22 17:34:52.619739 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.619718 2539 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-165.ec2.internal"
Apr 22 17:34:52.637395 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.637374 2539 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 17:34:52.689843 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.689824 2539 apiserver.go:52] "Watching apiserver"
Apr 22 17:34:52.696239 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.696219 2539 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 17:34:52.698091 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.698070 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-d4jd9","openshift-multus/network-metrics-daemon-8bxsz","openshift-cluster-node-tuning-operator/tuned-ljqnc","openshift-dns/node-resolver-dm5wr","openshift-network-diagnostics/network-check-target-rghzd","openshift-network-operator/iptables-alerter-b6m8h","openshift-ovn-kubernetes/ovnkube-node-dtf75","kube-system/konnectivity-agent-h2ghv","kube-system/kube-apiserver-proxy-ip-10-0-132-165.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ssfr9","openshift-image-registry/node-ca-gxs6f","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-165.ec2.internal","openshift-multus/multus-additional-cni-plugins-6gk5m"]
Apr 22 17:34:52.699586 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.699565 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:34:52.700577 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.700546 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bxsz"
Apr 22 17:34:52.700663 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:52.700624 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bxsz" podUID="a15342ff-f78f-4d33-aed1-0e9c86dbdb15"
Apr 22 17:34:52.701616 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.701600 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ljqnc"
Apr 22 17:34:52.701936 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.701916 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 17:34:52.702014 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.701920 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-8n2vv\""
Apr 22 17:34:52.702067 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.702019 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 17:34:52.702229 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.702212 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 17:34:52.702290 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.702217 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 17:34:52.702345 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.702328 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 17:34:52.702434 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.702398 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 17:34:52.702896 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.702881 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-dm5wr"
Apr 22 17:34:52.703582 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.703562 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-fnvgp\""
Apr 22 17:34:52.703666 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.703610 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 17:34:52.703855 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.703839 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 17:34:52.704388 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.704371 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rghzd"
Apr 22 17:34:52.704472 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:52.704443 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rghzd" podUID="4c704c4d-f399-4811-9b83-dc2a18d55074"
Apr 22 17:34:52.704813 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.704795 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 17:34:52.704888 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.704868 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-ptgzc\""
Apr 22 17:34:52.704961 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.704868 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 17:34:52.707036 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.706129 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-b6m8h"
Apr 22 17:34:52.708467 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.708446 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 17:34:52.708467 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.708467 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 17:34:52.709205 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.709187 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-h2ghv"
Apr 22 17:34:52.710004 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.709987 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-ntf6c\""
Apr 22 17:34:52.710075 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.710022 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 17:34:52.710606 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.710587 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-gxs6f"
Apr 22 17:34:52.711700 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.711681 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-d4jd9"
Apr 22 17:34:52.712104 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.712068 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 17:34:52.712204 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.712159 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-t4d8l\""
Apr 22 17:34:52.712449 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.712369 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 17:34:52.712817 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.712799 2539 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 17:34:52.713020 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.713003 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ssfr9"
Apr 22 17:34:52.713747 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.713731 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-xphvg\""
Apr 22 17:34:52.714096 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.714073 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 17:34:52.714181 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.714078 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 17:34:52.714181 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.714140 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 17:34:52.714181 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.714111 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6gk5m"
Apr 22 17:34:52.714181 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.714176 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 17:34:52.714369 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.714178 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 17:34:52.714509 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.714494 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-58pbr\""
Apr 22 17:34:52.714564 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.714515 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 17:34:52.714955 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.714940 2539 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 17:34:52.715366 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.715351 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 17:34:52.715422 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.715414 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 17:34:52.715655 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.715642 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-9vmk9\""
Apr 22 17:34:52.718129 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.718100 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 17:34:52.718214 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.718134 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 17:34:52.718214 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.718179 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 17:34:52.718298 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.718213 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 17:34:52.718298 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.718214 2539 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 17:29:51 +0000 UTC" deadline="2027-11-17 10:41:25.067753845 +0000 UTC"
Apr 22 17:34:52.718298 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.718237 2539 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13769h6m32.34951977s"
Apr 22 17:34:52.719318 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.719299 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-9pgcf\""
Apr 22 17:34:52.720559 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.720542 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ba45db53-69e8-40c3-8090-98739842b87e-serviceca\") pod \"node-ca-gxs6f\" (UID: \"ba45db53-69e8-40c3-8090-98739842b87e\") " pod="openshift-image-registry/node-ca-gxs6f"
Apr 22 17:34:52.720621 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.720565 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv85b\" (UniqueName: \"kubernetes.io/projected/a69dead2-c622-4a13-a5b4-5367b68c10a8-kube-api-access-lv85b\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:34:52.720621 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.720582 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bd0fea28-dd43-4ccb-b04e-eec2e0821997-etc-sysconfig\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc"
Apr 22 17:34:52.720621 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.720600 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bd0fea28-dd43-4ccb-b04e-eec2e0821997-etc-sysctl-d\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc"
Apr 22 17:34:52.720725 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.720621 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-host-var-lib-cni-multus\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9"
Apr 22 17:34:52.720725 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.720655 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f0f702a5-e6ca-4251-9925-a6fc437042f8-cni-binary-copy\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9"
Apr 22 17:34:52.720725 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.720706 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-host-var-lib-cni-bin\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9"
Apr 22 17:34:52.720843 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.720728 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/24bafee9-4454-456f-b2aa-04131f945624-cni-binary-copy\") pod \"multus-additional-cni-plugins-6gk5m\" (UID: \"24bafee9-4454-456f-b2aa-04131f945624\") " pod="openshift-multus/multus-additional-cni-plugins-6gk5m"
Apr 22 17:34:52.720843 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.720755 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/011dd50c-8a4c-425d-bb7b-86836dfc52f7-hosts-file\") pod \"node-resolver-dm5wr\" (UID: \"011dd50c-8a4c-425d-bb7b-86836dfc52f7\") " pod="openshift-dns/node-resolver-dm5wr"
Apr 22 17:34:52.720843 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.720776 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-var-lib-openvswitch\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:34:52.720843 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.720792 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-node-log\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:34:52.720843 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.720814 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-log-socket\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:34:52.720843 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.720837 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-host-run-netns\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9"
Apr 22 17:34:52.721075 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.720858 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-etc-kubernetes\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9"
Apr 22 17:34:52.721075 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.720874 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbwnd\" (UniqueName: \"kubernetes.io/projected/011dd50c-8a4c-425d-bb7b-86836dfc52f7-kube-api-access-pbwnd\") pod \"node-resolver-dm5wr\" (UID: \"011dd50c-8a4c-425d-bb7b-86836dfc52f7\") " pod="openshift-dns/node-resolver-dm5wr"
Apr 22 17:34:52.721075 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.720888 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6e284c23-9cc3-44a6-ad18-85df6586fcd5-registration-dir\") pod \"aws-ebs-csi-driver-node-ssfr9\" (UID: \"6e284c23-9cc3-44a6-ad18-85df6586fcd5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ssfr9"
Apr 22 17:34:52.721075 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.720918 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6e284c23-9cc3-44a6-ad18-85df6586fcd5-device-dir\") pod \"aws-ebs-csi-driver-node-ssfr9\" (UID: \"6e284c23-9cc3-44a6-ad18-85df6586fcd5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ssfr9"
Apr 22 17:34:52.721075 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.720940 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-host-run-ovn-kubernetes\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:34:52.721075 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.720955 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-run-ovn\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:34:52.721075 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.720968 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd0fea28-dd43-4ccb-b04e-eec2e0821997-etc-kubernetes\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc"
Apr 22 17:34:52.721075 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.720991 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a15342ff-f78f-4d33-aed1-0e9c86dbdb15-metrics-certs\") pod \"network-metrics-daemon-8bxsz\" (UID: \"a15342ff-f78f-4d33-aed1-0e9c86dbdb15\") " pod="openshift-multus/network-metrics-daemon-8bxsz"
Apr 22 17:34:52.721075 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721014 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bd0fea28-dd43-4ccb-b04e-eec2e0821997-lib-modules\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc"
Apr 22 17:34:52.721075 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721033 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bd0fea28-dd43-4ccb-b04e-eec2e0821997-etc-tuned\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc"
Apr 22 17:34:52.721075 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721048 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/24bafee9-4454-456f-b2aa-04131f945624-cnibin\") pod \"multus-additional-cni-plugins-6gk5m\" (UID: \"24bafee9-4454-456f-b2aa-04131f945624\") " pod="openshift-multus/multus-additional-cni-plugins-6gk5m"
Apr 22 17:34:52.721075 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721062 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e284c23-9cc3-44a6-ad18-85df6586fcd5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ssfr9\" (UID: \"6e284c23-9cc3-44a6-ad18-85df6586fcd5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ssfr9"
Apr 22 17:34:52.721491 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721082 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:34:52.721491 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721097 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd0fea28-dd43-4ccb-b04e-eec2e0821997-var-lib-kubelet\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc"
Apr 22 17:34:52.721491 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721112 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd0fea28-dd43-4ccb-b04e-eec2e0821997-host\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc"
Apr 22 17:34:52.721491 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721127 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-multus-socket-dir-parent\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9"
Apr 22 17:34:52.721491 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721144 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-host-run-k8s-cni-cncf-io\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9"
Apr 22 17:34:52.721491 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721158 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-host-slash\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:34:52.721491 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721170 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-host-cni-bin\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:34:52.721491 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721183 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bd0fea28-dd43-4ccb-b04e-eec2e0821997-etc-modprobe-d\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc"
Apr 22 17:34:52.721491 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721195 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-cnibin\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9"
Apr 22 17:34:52.721491 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721208 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f0f702a5-e6ca-4251-9925-a6fc437042f8-multus-daemon-config\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9"
Apr 22 17:34:52.721491 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721236 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/011dd50c-8a4c-425d-bb7b-86836dfc52f7-tmp-dir\") pod \"node-resolver-dm5wr\" (UID: \"011dd50c-8a4c-425d-bb7b-86836dfc52f7\") " pod="openshift-dns/node-resolver-dm5wr"
Apr 22 17:34:52.721491 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721262 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bnnd\" (UniqueName: \"kubernetes.io/projected/4c704c4d-f399-4811-9b83-dc2a18d55074-kube-api-access-2bnnd\") pod \"network-check-target-rghzd\" (UID: \"4c704c4d-f399-4811-9b83-dc2a18d55074\") " pod="openshift-network-diagnostics/network-check-target-rghzd"
Apr 22 17:34:52.721491 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721277 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6e284c23-9cc3-44a6-ad18-85df6586fcd5-etc-selinux\") pod \"aws-ebs-csi-driver-node-ssfr9\" (UID: \"6e284c23-9cc3-44a6-ad18-85df6586fcd5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ssfr9"
Apr 22 17:34:52.721491 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721293 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bd0fea28-dd43-4ccb-b04e-eec2e0821997-tmp\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc"
Apr 22 17:34:52.721491 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721321 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/aa88e93d-3982-4f87-8dc7-cef2900ab3a3-agent-certs\") pod \"konnectivity-agent-h2ghv\" (UID: \"aa88e93d-3982-4f87-8dc7-cef2900ab3a3\") " pod="kube-system/konnectivity-agent-h2ghv"
Apr 22 17:34:52.721491 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721360 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/24bafee9-4454-456f-b2aa-04131f945624-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6gk5m\" (UID: \"24bafee9-4454-456f-b2aa-04131f945624\") " pod="openshift-multus/multus-additional-cni-plugins-6gk5m"
Apr 22 17:34:52.722140 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721398 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-os-release\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9"
Apr 22 17:34:52.722140 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721431 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/24bafee9-4454-456f-b2aa-04131f945624-system-cni-dir\") pod \"multus-additional-cni-plugins-6gk5m\" (UID: \"24bafee9-4454-456f-b2aa-04131f945624\") " pod="openshift-multus/multus-additional-cni-plugins-6gk5m"
Apr 22 17:34:52.722140 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721452 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-run-systemd\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:34:52.722140 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721468 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a69dead2-c622-4a13-a5b4-5367b68c10a8-env-overrides\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:34:52.722140 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721492 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a69dead2-c622-4a13-a5b4-5367b68c10a8-ovn-node-metrics-cert\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:34:52.722140 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721512 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-host-run-multus-certs\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9"
Apr 22 17:34:52.722140 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721530 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/24bafee9-4454-456f-b2aa-04131f945624-os-release\") pod \"multus-additional-cni-plugins-6gk5m\" (UID: \"24bafee9-4454-456f-b2aa-04131f945624\") "
pod="openshift-multus/multus-additional-cni-plugins-6gk5m" Apr 22 17:34:52.722140 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721545 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ba45db53-69e8-40c3-8090-98739842b87e-host\") pod \"node-ca-gxs6f\" (UID: \"ba45db53-69e8-40c3-8090-98739842b87e\") " pod="openshift-image-registry/node-ca-gxs6f" Apr 22 17:34:52.722140 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721568 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpn7c\" (UniqueName: \"kubernetes.io/projected/6e284c23-9cc3-44a6-ad18-85df6586fcd5-kube-api-access-qpn7c\") pod \"aws-ebs-csi-driver-node-ssfr9\" (UID: \"6e284c23-9cc3-44a6-ad18-85df6586fcd5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ssfr9" Apr 22 17:34:52.722140 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721586 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-multus-cni-dir\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9" Apr 22 17:34:52.722140 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721622 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-multus-conf-dir\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9" Apr 22 17:34:52.722140 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721646 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqpnt\" (UniqueName: 
\"kubernetes.io/projected/f0f702a5-e6ca-4251-9925-a6fc437042f8-kube-api-access-gqpnt\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9" Apr 22 17:34:52.722140 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721672 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjtw9\" (UniqueName: \"kubernetes.io/projected/ba45db53-69e8-40c3-8090-98739842b87e-kube-api-access-rjtw9\") pod \"node-ca-gxs6f\" (UID: \"ba45db53-69e8-40c3-8090-98739842b87e\") " pod="openshift-image-registry/node-ca-gxs6f" Apr 22 17:34:52.722140 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721703 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6e284c23-9cc3-44a6-ad18-85df6586fcd5-socket-dir\") pod \"aws-ebs-csi-driver-node-ssfr9\" (UID: \"6e284c23-9cc3-44a6-ad18-85df6586fcd5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ssfr9" Apr 22 17:34:52.722140 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721724 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6e284c23-9cc3-44a6-ad18-85df6586fcd5-sys-fs\") pod \"aws-ebs-csi-driver-node-ssfr9\" (UID: \"6e284c23-9cc3-44a6-ad18-85df6586fcd5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ssfr9" Apr 22 17:34:52.722140 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721739 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-run-openvswitch\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" Apr 22 17:34:52.722835 ip-10-0-132-165 kubenswrapper[2539]: I0422 
17:34:52.721755 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l52c\" (UniqueName: \"kubernetes.io/projected/bd0fea28-dd43-4ccb-b04e-eec2e0821997-kube-api-access-9l52c\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc" Apr 22 17:34:52.722835 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721775 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-host-var-lib-kubelet\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9" Apr 22 17:34:52.722835 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721790 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcp2l\" (UniqueName: \"kubernetes.io/projected/24bafee9-4454-456f-b2aa-04131f945624-kube-api-access-mcp2l\") pod \"multus-additional-cni-plugins-6gk5m\" (UID: \"24bafee9-4454-456f-b2aa-04131f945624\") " pod="openshift-multus/multus-additional-cni-plugins-6gk5m" Apr 22 17:34:52.722835 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721807 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-host-kubelet\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" Apr 22 17:34:52.722835 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721821 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-etc-openvswitch\") pod \"ovnkube-node-dtf75\" (UID: 
\"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" Apr 22 17:34:52.722835 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721837 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/42d2a92c-09c8-4e3a-aa4a-19642356de88-host-slash\") pod \"iptables-alerter-b6m8h\" (UID: \"42d2a92c-09c8-4e3a-aa4a-19642356de88\") " pod="openshift-network-operator/iptables-alerter-b6m8h" Apr 22 17:34:52.722835 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721878 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/aa88e93d-3982-4f87-8dc7-cef2900ab3a3-konnectivity-ca\") pod \"konnectivity-agent-h2ghv\" (UID: \"aa88e93d-3982-4f87-8dc7-cef2900ab3a3\") " pod="kube-system/konnectivity-agent-h2ghv" Apr 22 17:34:52.722835 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721926 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrdhg\" (UniqueName: \"kubernetes.io/projected/a15342ff-f78f-4d33-aed1-0e9c86dbdb15-kube-api-access-jrdhg\") pod \"network-metrics-daemon-8bxsz\" (UID: \"a15342ff-f78f-4d33-aed1-0e9c86dbdb15\") " pod="openshift-multus/network-metrics-daemon-8bxsz" Apr 22 17:34:52.722835 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721942 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bd0fea28-dd43-4ccb-b04e-eec2e0821997-etc-sysctl-conf\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc" Apr 22 17:34:52.722835 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721957 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bd0fea28-dd43-4ccb-b04e-eec2e0821997-etc-systemd\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc" Apr 22 17:34:52.722835 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721971 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bd0fea28-dd43-4ccb-b04e-eec2e0821997-sys\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc" Apr 22 17:34:52.722835 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.721990 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2jp9\" (UniqueName: \"kubernetes.io/projected/42d2a92c-09c8-4e3a-aa4a-19642356de88-kube-api-access-v2jp9\") pod \"iptables-alerter-b6m8h\" (UID: \"42d2a92c-09c8-4e3a-aa4a-19642356de88\") " pod="openshift-network-operator/iptables-alerter-b6m8h" Apr 22 17:34:52.722835 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.722031 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/24bafee9-4454-456f-b2aa-04131f945624-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6gk5m\" (UID: \"24bafee9-4454-456f-b2aa-04131f945624\") " pod="openshift-multus/multus-additional-cni-plugins-6gk5m" Apr 22 17:34:52.722835 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.722067 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-systemd-units\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" Apr 22 17:34:52.722835 ip-10-0-132-165 
kubenswrapper[2539]: I0422 17:34:52.722092 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-host-cni-netd\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" Apr 22 17:34:52.722835 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.722117 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a69dead2-c622-4a13-a5b4-5367b68c10a8-ovnkube-config\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" Apr 22 17:34:52.723334 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.722139 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a69dead2-c622-4a13-a5b4-5367b68c10a8-ovnkube-script-lib\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" Apr 22 17:34:52.723334 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.722163 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bd0fea28-dd43-4ccb-b04e-eec2e0821997-run\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc" Apr 22 17:34:52.723334 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.722188 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/42d2a92c-09c8-4e3a-aa4a-19642356de88-iptables-alerter-script\") pod \"iptables-alerter-b6m8h\" (UID: 
\"42d2a92c-09c8-4e3a-aa4a-19642356de88\") " pod="openshift-network-operator/iptables-alerter-b6m8h" Apr 22 17:34:52.723334 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.722213 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-host-run-netns\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" Apr 22 17:34:52.723334 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.722241 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-system-cni-dir\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9" Apr 22 17:34:52.723334 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.722263 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-hostroot\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9" Apr 22 17:34:52.723334 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.722288 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/24bafee9-4454-456f-b2aa-04131f945624-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6gk5m\" (UID: \"24bafee9-4454-456f-b2aa-04131f945624\") " pod="openshift-multus/multus-additional-cni-plugins-6gk5m" Apr 22 17:34:52.744606 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.744589 2539 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" 
reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 17:34:52.767612 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:52.767584 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod136192d3e6d2f42c0abb843d3e675799.slice/crio-ba28fda747985d797524d98085734036e36d20fae2b7009123b6f1fc917a24cc WatchSource:0}: Error finding container ba28fda747985d797524d98085734036e36d20fae2b7009123b6f1fc917a24cc: Status 404 returned error can't find the container with id ba28fda747985d797524d98085734036e36d20fae2b7009123b6f1fc917a24cc Apr 22 17:34:52.767831 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:52.767805 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40399284665a2832f575e85e3fe44faa.slice/crio-8b5af78cdda281b882f510b916336ac3b9cb7bd797c6712ca710defec5dea9a4 WatchSource:0}: Error finding container 8b5af78cdda281b882f510b916336ac3b9cb7bd797c6712ca710defec5dea9a4: Status 404 returned error can't find the container with id 8b5af78cdda281b882f510b916336ac3b9cb7bd797c6712ca710defec5dea9a4 Apr 22 17:34:52.769326 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.769305 2539 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-wt8dn" Apr 22 17:34:52.773539 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.773522 2539 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 17:34:52.791505 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.791483 2539 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-wt8dn" Apr 22 17:34:52.822673 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.822648 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/24bafee9-4454-456f-b2aa-04131f945624-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6gk5m\" (UID: \"24bafee9-4454-456f-b2aa-04131f945624\") " pod="openshift-multus/multus-additional-cni-plugins-6gk5m" Apr 22 17:34:52.822766 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.822681 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-os-release\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9" Apr 22 17:34:52.822766 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.822705 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/24bafee9-4454-456f-b2aa-04131f945624-system-cni-dir\") pod \"multus-additional-cni-plugins-6gk5m\" (UID: \"24bafee9-4454-456f-b2aa-04131f945624\") " pod="openshift-multus/multus-additional-cni-plugins-6gk5m" Apr 22 17:34:52.822766 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.822730 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-run-systemd\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" Apr 22 17:34:52.822937 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.822773 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/24bafee9-4454-456f-b2aa-04131f945624-system-cni-dir\") pod \"multus-additional-cni-plugins-6gk5m\" (UID: \"24bafee9-4454-456f-b2aa-04131f945624\") " pod="openshift-multus/multus-additional-cni-plugins-6gk5m" Apr 22 17:34:52.822937 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.822775 2539 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-run-systemd\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" Apr 22 17:34:52.822937 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.822779 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-os-release\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9" Apr 22 17:34:52.822937 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.822793 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a69dead2-c622-4a13-a5b4-5367b68c10a8-env-overrides\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" Apr 22 17:34:52.822937 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.822825 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a69dead2-c622-4a13-a5b4-5367b68c10a8-ovn-node-metrics-cert\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" Apr 22 17:34:52.822937 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.822841 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-host-run-multus-certs\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9" Apr 22 17:34:52.822937 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.822856 2539 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/24bafee9-4454-456f-b2aa-04131f945624-os-release\") pod \"multus-additional-cni-plugins-6gk5m\" (UID: \"24bafee9-4454-456f-b2aa-04131f945624\") " pod="openshift-multus/multus-additional-cni-plugins-6gk5m" Apr 22 17:34:52.822937 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.822870 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ba45db53-69e8-40c3-8090-98739842b87e-host\") pod \"node-ca-gxs6f\" (UID: \"ba45db53-69e8-40c3-8090-98739842b87e\") " pod="openshift-image-registry/node-ca-gxs6f" Apr 22 17:34:52.822937 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.822886 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpn7c\" (UniqueName: \"kubernetes.io/projected/6e284c23-9cc3-44a6-ad18-85df6586fcd5-kube-api-access-qpn7c\") pod \"aws-ebs-csi-driver-node-ssfr9\" (UID: \"6e284c23-9cc3-44a6-ad18-85df6586fcd5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ssfr9" Apr 22 17:34:52.823354 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.822945 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-host-run-multus-certs\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9" Apr 22 17:34:52.823354 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.822946 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-multus-cni-dir\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9" Apr 22 17:34:52.823354 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.822979 2539 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ba45db53-69e8-40c3-8090-98739842b87e-host\") pod \"node-ca-gxs6f\" (UID: \"ba45db53-69e8-40c3-8090-98739842b87e\") " pod="openshift-image-registry/node-ca-gxs6f" Apr 22 17:34:52.823354 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.822996 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-multus-conf-dir\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9" Apr 22 17:34:52.823354 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823003 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-multus-cni-dir\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9" Apr 22 17:34:52.823354 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.822993 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/24bafee9-4454-456f-b2aa-04131f945624-os-release\") pod \"multus-additional-cni-plugins-6gk5m\" (UID: \"24bafee9-4454-456f-b2aa-04131f945624\") " pod="openshift-multus/multus-additional-cni-plugins-6gk5m" Apr 22 17:34:52.823354 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823023 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqpnt\" (UniqueName: \"kubernetes.io/projected/f0f702a5-e6ca-4251-9925-a6fc437042f8-kube-api-access-gqpnt\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9" Apr 22 17:34:52.823354 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823032 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-multus-conf-dir\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9" Apr 22 17:34:52.823354 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823053 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rjtw9\" (UniqueName: \"kubernetes.io/projected/ba45db53-69e8-40c3-8090-98739842b87e-kube-api-access-rjtw9\") pod \"node-ca-gxs6f\" (UID: \"ba45db53-69e8-40c3-8090-98739842b87e\") " pod="openshift-image-registry/node-ca-gxs6f" Apr 22 17:34:52.823354 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823081 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6e284c23-9cc3-44a6-ad18-85df6586fcd5-socket-dir\") pod \"aws-ebs-csi-driver-node-ssfr9\" (UID: \"6e284c23-9cc3-44a6-ad18-85df6586fcd5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ssfr9" Apr 22 17:34:52.823354 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823105 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6e284c23-9cc3-44a6-ad18-85df6586fcd5-sys-fs\") pod \"aws-ebs-csi-driver-node-ssfr9\" (UID: \"6e284c23-9cc3-44a6-ad18-85df6586fcd5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ssfr9" Apr 22 17:34:52.823354 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823131 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-run-openvswitch\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" Apr 22 17:34:52.823354 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823154 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9l52c\" (UniqueName: \"kubernetes.io/projected/bd0fea28-dd43-4ccb-b04e-eec2e0821997-kube-api-access-9l52c\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc" Apr 22 17:34:52.823354 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823178 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-host-var-lib-kubelet\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9" Apr 22 17:34:52.823354 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823204 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mcp2l\" (UniqueName: \"kubernetes.io/projected/24bafee9-4454-456f-b2aa-04131f945624-kube-api-access-mcp2l\") pod \"multus-additional-cni-plugins-6gk5m\" (UID: \"24bafee9-4454-456f-b2aa-04131f945624\") " pod="openshift-multus/multus-additional-cni-plugins-6gk5m" Apr 22 17:34:52.823354 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823215 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6e284c23-9cc3-44a6-ad18-85df6586fcd5-sys-fs\") pod \"aws-ebs-csi-driver-node-ssfr9\" (UID: \"6e284c23-9cc3-44a6-ad18-85df6586fcd5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ssfr9" Apr 22 17:34:52.823354 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823231 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-host-kubelet\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" Apr 22 17:34:52.824092 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823248 2539 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/24bafee9-4454-456f-b2aa-04131f945624-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6gk5m\" (UID: \"24bafee9-4454-456f-b2aa-04131f945624\") " pod="openshift-multus/multus-additional-cni-plugins-6gk5m" Apr 22 17:34:52.824092 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823276 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a69dead2-c622-4a13-a5b4-5367b68c10a8-env-overrides\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" Apr 22 17:34:52.824092 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823284 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-host-var-lib-kubelet\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9" Apr 22 17:34:52.824092 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823295 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-etc-openvswitch\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" Apr 22 17:34:52.824092 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823254 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-etc-openvswitch\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" Apr 22 17:34:52.824092 ip-10-0-132-165 
kubenswrapper[2539]: I0422 17:34:52.823325 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6e284c23-9cc3-44a6-ad18-85df6586fcd5-socket-dir\") pod \"aws-ebs-csi-driver-node-ssfr9\" (UID: \"6e284c23-9cc3-44a6-ad18-85df6586fcd5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ssfr9" Apr 22 17:34:52.824092 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823333 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-run-openvswitch\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" Apr 22 17:34:52.824092 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823355 2539 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 17:34:52.824092 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823364 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/42d2a92c-09c8-4e3a-aa4a-19642356de88-host-slash\") pod \"iptables-alerter-b6m8h\" (UID: \"42d2a92c-09c8-4e3a-aa4a-19642356de88\") " pod="openshift-network-operator/iptables-alerter-b6m8h" Apr 22 17:34:52.824092 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823383 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/aa88e93d-3982-4f87-8dc7-cef2900ab3a3-konnectivity-ca\") pod \"konnectivity-agent-h2ghv\" (UID: \"aa88e93d-3982-4f87-8dc7-cef2900ab3a3\") " pod="kube-system/konnectivity-agent-h2ghv" Apr 22 17:34:52.824092 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823407 2539 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jrdhg\" (UniqueName: \"kubernetes.io/projected/a15342ff-f78f-4d33-aed1-0e9c86dbdb15-kube-api-access-jrdhg\") pod \"network-metrics-daemon-8bxsz\" (UID: \"a15342ff-f78f-4d33-aed1-0e9c86dbdb15\") " pod="openshift-multus/network-metrics-daemon-8bxsz"
Apr 22 17:34:52.824092 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823409 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-host-kubelet\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:34:52.824092 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823424 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/42d2a92c-09c8-4e3a-aa4a-19642356de88-host-slash\") pod \"iptables-alerter-b6m8h\" (UID: \"42d2a92c-09c8-4e3a-aa4a-19642356de88\") " pod="openshift-network-operator/iptables-alerter-b6m8h"
Apr 22 17:34:52.824092 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823516 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bd0fea28-dd43-4ccb-b04e-eec2e0821997-etc-sysctl-conf\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc"
Apr 22 17:34:52.824092 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823586 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bd0fea28-dd43-4ccb-b04e-eec2e0821997-etc-systemd\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc"
Apr 22 17:34:52.824092 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823627 2539
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bd0fea28-dd43-4ccb-b04e-eec2e0821997-etc-sysctl-conf\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc"
Apr 22 17:34:52.824092 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823640 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bd0fea28-dd43-4ccb-b04e-eec2e0821997-sys\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc"
Apr 22 17:34:52.824092 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823655 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bd0fea28-dd43-4ccb-b04e-eec2e0821997-etc-systemd\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc"
Apr 22 17:34:52.824967 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823670 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2jp9\" (UniqueName: \"kubernetes.io/projected/42d2a92c-09c8-4e3a-aa4a-19642356de88-kube-api-access-v2jp9\") pod \"iptables-alerter-b6m8h\" (UID: \"42d2a92c-09c8-4e3a-aa4a-19642356de88\") " pod="openshift-network-operator/iptables-alerter-b6m8h"
Apr 22 17:34:52.824967 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823678 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bd0fea28-dd43-4ccb-b04e-eec2e0821997-sys\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc"
Apr 22 17:34:52.824967 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823695 2539 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/24bafee9-4454-456f-b2aa-04131f945624-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6gk5m\" (UID: \"24bafee9-4454-456f-b2aa-04131f945624\") " pod="openshift-multus/multus-additional-cni-plugins-6gk5m"
Apr 22 17:34:52.824967 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823720 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-systemd-units\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:34:52.824967 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823759 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-host-cni-netd\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:34:52.824967 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823784 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a69dead2-c622-4a13-a5b4-5367b68c10a8-ovnkube-config\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:34:52.824967 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823804 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-systemd-units\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:34:52.824967 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823809
2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a69dead2-c622-4a13-a5b4-5367b68c10a8-ovnkube-script-lib\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:34:52.824967 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823848 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bd0fea28-dd43-4ccb-b04e-eec2e0821997-run\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc"
Apr 22 17:34:52.824967 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823851 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/24bafee9-4454-456f-b2aa-04131f945624-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6gk5m\" (UID: \"24bafee9-4454-456f-b2aa-04131f945624\") " pod="openshift-multus/multus-additional-cni-plugins-6gk5m"
Apr 22 17:34:52.824967 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823819 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-host-cni-netd\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:34:52.824967 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823878 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/aa88e93d-3982-4f87-8dc7-cef2900ab3a3-konnectivity-ca\") pod \"konnectivity-agent-h2ghv\" (UID: \"aa88e93d-3982-4f87-8dc7-cef2900ab3a3\") " pod="kube-system/konnectivity-agent-h2ghv"
Apr 22 17:34:52.824967 ip-10-0-132-165 kubenswrapper[2539]: I0422
17:34:52.823867 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/42d2a92c-09c8-4e3a-aa4a-19642356de88-iptables-alerter-script\") pod \"iptables-alerter-b6m8h\" (UID: \"42d2a92c-09c8-4e3a-aa4a-19642356de88\") " pod="openshift-network-operator/iptables-alerter-b6m8h"
Apr 22 17:34:52.824967 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823937 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bd0fea28-dd43-4ccb-b04e-eec2e0821997-run\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc"
Apr 22 17:34:52.824967 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823959 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-host-run-netns\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:34:52.824967 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.823986 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-system-cni-dir\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9"
Apr 22 17:34:52.824967 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824008 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-hostroot\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9"
Apr 22 17:34:52.824967 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824031 2539
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-host-run-netns\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:34:52.825737 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824057 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-system-cni-dir\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9"
Apr 22 17:34:52.825737 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824033 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/24bafee9-4454-456f-b2aa-04131f945624-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6gk5m\" (UID: \"24bafee9-4454-456f-b2aa-04131f945624\") " pod="openshift-multus/multus-additional-cni-plugins-6gk5m"
Apr 22 17:34:52.825737 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824076 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-hostroot\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9"
Apr 22 17:34:52.825737 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824088 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ba45db53-69e8-40c3-8090-98739842b87e-serviceca\") pod \"node-ca-gxs6f\" (UID: \"ba45db53-69e8-40c3-8090-98739842b87e\") " pod="openshift-image-registry/node-ca-gxs6f"
Apr 22 17:34:52.825737 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824113 2539 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"kube-api-access-lv85b\" (UniqueName: \"kubernetes.io/projected/a69dead2-c622-4a13-a5b4-5367b68c10a8-kube-api-access-lv85b\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:34:52.825737 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824137 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bd0fea28-dd43-4ccb-b04e-eec2e0821997-etc-sysconfig\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc"
Apr 22 17:34:52.825737 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824203 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bd0fea28-dd43-4ccb-b04e-eec2e0821997-etc-sysconfig\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc"
Apr 22 17:34:52.825737 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824347 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a69dead2-c622-4a13-a5b4-5367b68c10a8-ovnkube-config\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:34:52.825737 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824384 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a69dead2-c622-4a13-a5b4-5367b68c10a8-ovnkube-script-lib\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:34:52.825737 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824391 2539
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bd0fea28-dd43-4ccb-b04e-eec2e0821997-etc-sysctl-d\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc"
Apr 22 17:34:52.825737 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824421 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/42d2a92c-09c8-4e3a-aa4a-19642356de88-iptables-alerter-script\") pod \"iptables-alerter-b6m8h\" (UID: \"42d2a92c-09c8-4e3a-aa4a-19642356de88\") " pod="openshift-network-operator/iptables-alerter-b6m8h"
Apr 22 17:34:52.825737 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824440 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-host-var-lib-cni-multus\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9"
Apr 22 17:34:52.825737 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824464 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bd0fea28-dd43-4ccb-b04e-eec2e0821997-etc-sysctl-d\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc"
Apr 22 17:34:52.825737 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824466 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f0f702a5-e6ca-4251-9925-a6fc437042f8-cni-binary-copy\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9"
Apr 22 17:34:52.825737 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824475 2539
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-host-var-lib-cni-multus\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9"
Apr 22 17:34:52.825737 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824500 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-host-var-lib-cni-bin\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9"
Apr 22 17:34:52.825737 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824524 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/24bafee9-4454-456f-b2aa-04131f945624-cni-binary-copy\") pod \"multus-additional-cni-plugins-6gk5m\" (UID: \"24bafee9-4454-456f-b2aa-04131f945624\") " pod="openshift-multus/multus-additional-cni-plugins-6gk5m"
Apr 22 17:34:52.825737 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824545 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/011dd50c-8a4c-425d-bb7b-86836dfc52f7-hosts-file\") pod \"node-resolver-dm5wr\" (UID: \"011dd50c-8a4c-425d-bb7b-86836dfc52f7\") " pod="openshift-dns/node-resolver-dm5wr"
Apr 22 17:34:52.826424 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824555 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ba45db53-69e8-40c3-8090-98739842b87e-serviceca\") pod \"node-ca-gxs6f\" (UID: \"ba45db53-69e8-40c3-8090-98739842b87e\") " pod="openshift-image-registry/node-ca-gxs6f"
Apr 22 17:34:52.826424 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824566 2539
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-var-lib-openvswitch\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:34:52.826424 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824553 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-host-var-lib-cni-bin\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9"
Apr 22 17:34:52.826424 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824590 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-node-log\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:34:52.826424 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824612 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-log-socket\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:34:52.826424 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824620 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/011dd50c-8a4c-425d-bb7b-86836dfc52f7-hosts-file\") pod \"node-resolver-dm5wr\" (UID: \"011dd50c-8a4c-425d-bb7b-86836dfc52f7\") " pod="openshift-dns/node-resolver-dm5wr"
Apr 22 17:34:52.826424 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824634 2539 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/24bafee9-4454-456f-b2aa-04131f945624-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6gk5m\" (UID: \"24bafee9-4454-456f-b2aa-04131f945624\") " pod="openshift-multus/multus-additional-cni-plugins-6gk5m"
Apr 22 17:34:52.826424 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824643 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-node-log\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:34:52.826424 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824638 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-host-run-netns\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9"
Apr 22 17:34:52.826424 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824613 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-var-lib-openvswitch\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:34:52.826424 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824665 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-log-socket\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:34:52.826424 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824682 2539 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-host-run-netns\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9"
Apr 22 17:34:52.826424 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824728 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-etc-kubernetes\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9"
Apr 22 17:34:52.826424 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824757 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pbwnd\" (UniqueName: \"kubernetes.io/projected/011dd50c-8a4c-425d-bb7b-86836dfc52f7-kube-api-access-pbwnd\") pod \"node-resolver-dm5wr\" (UID: \"011dd50c-8a4c-425d-bb7b-86836dfc52f7\") " pod="openshift-dns/node-resolver-dm5wr"
Apr 22 17:34:52.826424 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824783 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6e284c23-9cc3-44a6-ad18-85df6586fcd5-registration-dir\") pod \"aws-ebs-csi-driver-node-ssfr9\" (UID: \"6e284c23-9cc3-44a6-ad18-85df6586fcd5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ssfr9"
Apr 22 17:34:52.826424 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824828 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-etc-kubernetes\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9"
Apr 22 17:34:52.826424 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824884 2539 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6e284c23-9cc3-44a6-ad18-85df6586fcd5-device-dir\") pod \"aws-ebs-csi-driver-node-ssfr9\" (UID: \"6e284c23-9cc3-44a6-ad18-85df6586fcd5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ssfr9"
Apr 22 17:34:52.826424 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824938 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-host-run-ovn-kubernetes\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:34:52.826928 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824963 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6e284c23-9cc3-44a6-ad18-85df6586fcd5-registration-dir\") pod \"aws-ebs-csi-driver-node-ssfr9\" (UID: \"6e284c23-9cc3-44a6-ad18-85df6586fcd5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ssfr9"
Apr 22 17:34:52.826928 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824970 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6e284c23-9cc3-44a6-ad18-85df6586fcd5-device-dir\") pod \"aws-ebs-csi-driver-node-ssfr9\" (UID: \"6e284c23-9cc3-44a6-ad18-85df6586fcd5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ssfr9"
Apr 22 17:34:52.826928 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.824978 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-run-ovn\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:34:52.826928 ip-10-0-132-165
kubenswrapper[2539]: I0422 17:34:52.825003 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f0f702a5-e6ca-4251-9925-a6fc437042f8-cni-binary-copy\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9"
Apr 22 17:34:52.826928 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825015 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-host-run-ovn-kubernetes\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:34:52.826928 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825022 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd0fea28-dd43-4ccb-b04e-eec2e0821997-etc-kubernetes\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc"
Apr 22 17:34:52.826928 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825045 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a15342ff-f78f-4d33-aed1-0e9c86dbdb15-metrics-certs\") pod \"network-metrics-daemon-8bxsz\" (UID: \"a15342ff-f78f-4d33-aed1-0e9c86dbdb15\") " pod="openshift-multus/network-metrics-daemon-8bxsz"
Apr 22 17:34:52.826928 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825056 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd0fea28-dd43-4ccb-b04e-eec2e0821997-etc-kubernetes\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc"
Apr 22 17:34:52.826928 ip-10-0-132-165
kubenswrapper[2539]: I0422 17:34:52.825063 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bd0fea28-dd43-4ccb-b04e-eec2e0821997-lib-modules\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc"
Apr 22 17:34:52.826928 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825066 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-run-ovn\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:34:52.826928 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825087 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bd0fea28-dd43-4ccb-b04e-eec2e0821997-etc-tuned\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc"
Apr 22 17:34:52.826928 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:52.825123 2539 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:34:52.826928 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825128 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/24bafee9-4454-456f-b2aa-04131f945624-cnibin\") pod \"multus-additional-cni-plugins-6gk5m\" (UID: \"24bafee9-4454-456f-b2aa-04131f945624\") " pod="openshift-multus/multus-additional-cni-plugins-6gk5m"
Apr 22 17:34:52.826928 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825163 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName:
\"kubernetes.io/host-path/6e284c23-9cc3-44a6-ad18-85df6586fcd5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ssfr9\" (UID: \"6e284c23-9cc3-44a6-ad18-85df6586fcd5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ssfr9"
Apr 22 17:34:52.826928 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:52.825179 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a15342ff-f78f-4d33-aed1-0e9c86dbdb15-metrics-certs podName:a15342ff-f78f-4d33-aed1-0e9c86dbdb15 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:53.325155824 +0000 UTC m=+2.081240380 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a15342ff-f78f-4d33-aed1-0e9c86dbdb15-metrics-certs") pod "network-metrics-daemon-8bxsz" (UID: "a15342ff-f78f-4d33-aed1-0e9c86dbdb15") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:34:52.826928 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825175 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bd0fea28-dd43-4ccb-b04e-eec2e0821997-lib-modules\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc"
Apr 22 17:34:52.826928 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825219 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e284c23-9cc3-44a6-ad18-85df6586fcd5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ssfr9\" (UID: \"6e284c23-9cc3-44a6-ad18-85df6586fcd5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ssfr9"
Apr 22 17:34:52.827382 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825224 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/24bafee9-4454-456f-b2aa-04131f945624-cnibin\") pod
\"multus-additional-cni-plugins-6gk5m\" (UID: \"24bafee9-4454-456f-b2aa-04131f945624\") " pod="openshift-multus/multus-additional-cni-plugins-6gk5m"
Apr 22 17:34:52.827382 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825248 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:34:52.827382 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825289 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd0fea28-dd43-4ccb-b04e-eec2e0821997-var-lib-kubelet\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc"
Apr 22 17:34:52.827382 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825322 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd0fea28-dd43-4ccb-b04e-eec2e0821997-host\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc"
Apr 22 17:34:52.827382 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825347 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:34:52.827382 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825361 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for
volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-multus-socket-dir-parent\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9" Apr 22 17:34:52.827382 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825369 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd0fea28-dd43-4ccb-b04e-eec2e0821997-var-lib-kubelet\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc" Apr 22 17:34:52.827382 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825408 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd0fea28-dd43-4ccb-b04e-eec2e0821997-host\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc" Apr 22 17:34:52.827382 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825442 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-host-run-k8s-cni-cncf-io\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9" Apr 22 17:34:52.827382 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825449 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-multus-socket-dir-parent\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9" Apr 22 17:34:52.827382 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825470 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-host-slash\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" Apr 22 17:34:52.827382 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825488 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-host-run-k8s-cni-cncf-io\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9" Apr 22 17:34:52.827382 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825496 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-host-cni-bin\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" Apr 22 17:34:52.827382 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825518 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-host-slash\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" Apr 22 17:34:52.827382 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825521 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bd0fea28-dd43-4ccb-b04e-eec2e0821997-etc-modprobe-d\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc" Apr 22 17:34:52.827382 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825525 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/24bafee9-4454-456f-b2aa-04131f945624-cni-binary-copy\") pod \"multus-additional-cni-plugins-6gk5m\" (UID: \"24bafee9-4454-456f-b2aa-04131f945624\") " pod="openshift-multus/multus-additional-cni-plugins-6gk5m" Apr 22 17:34:52.827382 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825546 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-cnibin\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9" Apr 22 17:34:52.827859 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825575 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a69dead2-c622-4a13-a5b4-5367b68c10a8-host-cni-bin\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" Apr 22 17:34:52.827859 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825599 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f0f702a5-e6ca-4251-9925-a6fc437042f8-multus-daemon-config\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9" Apr 22 17:34:52.827859 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825629 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bd0fea28-dd43-4ccb-b04e-eec2e0821997-etc-modprobe-d\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc" Apr 22 17:34:52.827859 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825632 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/011dd50c-8a4c-425d-bb7b-86836dfc52f7-tmp-dir\") pod \"node-resolver-dm5wr\" (UID: \"011dd50c-8a4c-425d-bb7b-86836dfc52f7\") " pod="openshift-dns/node-resolver-dm5wr" Apr 22 17:34:52.827859 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825602 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f0f702a5-e6ca-4251-9925-a6fc437042f8-cnibin\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9" Apr 22 17:34:52.827859 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825670 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bnnd\" (UniqueName: \"kubernetes.io/projected/4c704c4d-f399-4811-9b83-dc2a18d55074-kube-api-access-2bnnd\") pod \"network-check-target-rghzd\" (UID: \"4c704c4d-f399-4811-9b83-dc2a18d55074\") " pod="openshift-network-diagnostics/network-check-target-rghzd" Apr 22 17:34:52.827859 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825697 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6e284c23-9cc3-44a6-ad18-85df6586fcd5-etc-selinux\") pod \"aws-ebs-csi-driver-node-ssfr9\" (UID: \"6e284c23-9cc3-44a6-ad18-85df6586fcd5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ssfr9" Apr 22 17:34:52.827859 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825719 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bd0fea28-dd43-4ccb-b04e-eec2e0821997-tmp\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc" Apr 22 17:34:52.827859 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825743 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/aa88e93d-3982-4f87-8dc7-cef2900ab3a3-agent-certs\") pod \"konnectivity-agent-h2ghv\" (UID: \"aa88e93d-3982-4f87-8dc7-cef2900ab3a3\") " pod="kube-system/konnectivity-agent-h2ghv" Apr 22 17:34:52.827859 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825884 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/011dd50c-8a4c-425d-bb7b-86836dfc52f7-tmp-dir\") pod \"node-resolver-dm5wr\" (UID: \"011dd50c-8a4c-425d-bb7b-86836dfc52f7\") " pod="openshift-dns/node-resolver-dm5wr" Apr 22 17:34:52.827859 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.825981 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6e284c23-9cc3-44a6-ad18-85df6586fcd5-etc-selinux\") pod \"aws-ebs-csi-driver-node-ssfr9\" (UID: \"6e284c23-9cc3-44a6-ad18-85df6586fcd5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ssfr9" Apr 22 17:34:52.827859 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.826125 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f0f702a5-e6ca-4251-9925-a6fc437042f8-multus-daemon-config\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9" Apr 22 17:34:52.827859 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.826793 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a69dead2-c622-4a13-a5b4-5367b68c10a8-ovn-node-metrics-cert\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" Apr 22 17:34:52.827859 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.827474 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/bd0fea28-dd43-4ccb-b04e-eec2e0821997-etc-tuned\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc" Apr 22 17:34:52.827859 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.827844 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/aa88e93d-3982-4f87-8dc7-cef2900ab3a3-agent-certs\") pod \"konnectivity-agent-h2ghv\" (UID: \"aa88e93d-3982-4f87-8dc7-cef2900ab3a3\") " pod="kube-system/konnectivity-agent-h2ghv" Apr 22 17:34:52.828331 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.827951 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bd0fea28-dd43-4ccb-b04e-eec2e0821997-tmp\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc" Apr 22 17:34:52.843929 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.843896 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqpnt\" (UniqueName: \"kubernetes.io/projected/f0f702a5-e6ca-4251-9925-a6fc437042f8-kube-api-access-gqpnt\") pod \"multus-d4jd9\" (UID: \"f0f702a5-e6ca-4251-9925-a6fc437042f8\") " pod="openshift-multus/multus-d4jd9" Apr 22 17:34:52.844192 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.844161 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-165.ec2.internal" event={"ID":"136192d3e6d2f42c0abb843d3e675799","Type":"ContainerStarted","Data":"ba28fda747985d797524d98085734036e36d20fae2b7009123b6f1fc917a24cc"} Apr 22 17:34:52.844971 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.844951 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-165.ec2.internal" 
event={"ID":"40399284665a2832f575e85e3fe44faa","Type":"ContainerStarted","Data":"8b5af78cdda281b882f510b916336ac3b9cb7bd797c6712ca710defec5dea9a4"} Apr 22 17:34:52.847627 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.847611 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrdhg\" (UniqueName: \"kubernetes.io/projected/a15342ff-f78f-4d33-aed1-0e9c86dbdb15-kube-api-access-jrdhg\") pod \"network-metrics-daemon-8bxsz\" (UID: \"a15342ff-f78f-4d33-aed1-0e9c86dbdb15\") " pod="openshift-multus/network-metrics-daemon-8bxsz" Apr 22 17:34:52.852531 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:52.852512 2539 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:34:52.852624 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:52.852534 2539 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:34:52.852624 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:52.852547 2539 projected.go:194] Error preparing data for projected volume kube-api-access-2bnnd for pod openshift-network-diagnostics/network-check-target-rghzd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:52.852624 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:52.852614 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4c704c4d-f399-4811-9b83-dc2a18d55074-kube-api-access-2bnnd podName:4c704c4d-f399-4811-9b83-dc2a18d55074 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:53.352594595 +0000 UTC m=+2.108679164 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-2bnnd" (UniqueName: "kubernetes.io/projected/4c704c4d-f399-4811-9b83-dc2a18d55074-kube-api-access-2bnnd") pod "network-check-target-rghzd" (UID: "4c704c4d-f399-4811-9b83-dc2a18d55074") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:52.853396 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.853378 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpn7c\" (UniqueName: \"kubernetes.io/projected/6e284c23-9cc3-44a6-ad18-85df6586fcd5-kube-api-access-qpn7c\") pod \"aws-ebs-csi-driver-node-ssfr9\" (UID: \"6e284c23-9cc3-44a6-ad18-85df6586fcd5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ssfr9" Apr 22 17:34:52.853486 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.853410 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2jp9\" (UniqueName: \"kubernetes.io/projected/42d2a92c-09c8-4e3a-aa4a-19642356de88-kube-api-access-v2jp9\") pod \"iptables-alerter-b6m8h\" (UID: \"42d2a92c-09c8-4e3a-aa4a-19642356de88\") " pod="openshift-network-operator/iptables-alerter-b6m8h" Apr 22 17:34:52.854173 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.854154 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbwnd\" (UniqueName: \"kubernetes.io/projected/011dd50c-8a4c-425d-bb7b-86836dfc52f7-kube-api-access-pbwnd\") pod \"node-resolver-dm5wr\" (UID: \"011dd50c-8a4c-425d-bb7b-86836dfc52f7\") " pod="openshift-dns/node-resolver-dm5wr" Apr 22 17:34:52.854238 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.854172 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjtw9\" (UniqueName: \"kubernetes.io/projected/ba45db53-69e8-40c3-8090-98739842b87e-kube-api-access-rjtw9\") pod \"node-ca-gxs6f\" (UID: 
\"ba45db53-69e8-40c3-8090-98739842b87e\") " pod="openshift-image-registry/node-ca-gxs6f" Apr 22 17:34:52.857615 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.857598 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcp2l\" (UniqueName: \"kubernetes.io/projected/24bafee9-4454-456f-b2aa-04131f945624-kube-api-access-mcp2l\") pod \"multus-additional-cni-plugins-6gk5m\" (UID: \"24bafee9-4454-456f-b2aa-04131f945624\") " pod="openshift-multus/multus-additional-cni-plugins-6gk5m" Apr 22 17:34:52.859166 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.859151 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv85b\" (UniqueName: \"kubernetes.io/projected/a69dead2-c622-4a13-a5b4-5367b68c10a8-kube-api-access-lv85b\") pod \"ovnkube-node-dtf75\" (UID: \"a69dead2-c622-4a13-a5b4-5367b68c10a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" Apr 22 17:34:52.859411 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.859397 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l52c\" (UniqueName: \"kubernetes.io/projected/bd0fea28-dd43-4ccb-b04e-eec2e0821997-kube-api-access-9l52c\") pod \"tuned-ljqnc\" (UID: \"bd0fea28-dd43-4ccb-b04e-eec2e0821997\") " pod="openshift-cluster-node-tuning-operator/tuned-ljqnc" Apr 22 17:34:52.878181 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.878161 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-5qr7d"] Apr 22 17:34:52.878437 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.878424 2539 predicate.go:212] "Predicate failed on Pod" pod="kube-system/global-pull-secret-syncer-5qr7d" err="Predicate NodeAffinity failed: node(s) didn't match Pod's node affinity/selector" Apr 22 17:34:52.878488 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.878437 2539 kubelet.go:2420] "Pod admission denied" podUID="8909ab65-d100-4702-8a4a-29958f11d0fc" 
pod="kube-system/global-pull-secret-syncer-5qr7d" reason="NodeAffinity" message="Predicate NodeAffinity failed: node(s) didn't match Pod's node affinity/selector" Apr 22 17:34:52.926823 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.926775 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8909ab65-d100-4702-8a4a-29958f11d0fc-original-pull-secret\") pod \"global-pull-secret-syncer-5qr7d\" (UID: \"8909ab65-d100-4702-8a4a-29958f11d0fc\") " pod="kube-system/global-pull-secret-syncer-5qr7d" Apr 22 17:34:52.926893 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.926822 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8909ab65-d100-4702-8a4a-29958f11d0fc-dbus\") pod \"global-pull-secret-syncer-5qr7d\" (UID: \"8909ab65-d100-4702-8a4a-29958f11d0fc\") " pod="kube-system/global-pull-secret-syncer-5qr7d" Apr 22 17:34:52.926893 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:52.926860 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8909ab65-d100-4702-8a4a-29958f11d0fc-kubelet-config\") pod \"global-pull-secret-syncer-5qr7d\" (UID: \"8909ab65-d100-4702-8a4a-29958f11d0fc\") " pod="kube-system/global-pull-secret-syncer-5qr7d" Apr 22 17:34:53.025169 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:53.025146 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" Apr 22 17:34:53.027784 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:53.027769 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8909ab65-d100-4702-8a4a-29958f11d0fc-dbus\") pod \"global-pull-secret-syncer-5qr7d\" (UID: \"8909ab65-d100-4702-8a4a-29958f11d0fc\") " pod="kube-system/global-pull-secret-syncer-5qr7d" Apr 22 17:34:53.027859 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:53.027799 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8909ab65-d100-4702-8a4a-29958f11d0fc-kubelet-config\") pod \"global-pull-secret-syncer-5qr7d\" (UID: \"8909ab65-d100-4702-8a4a-29958f11d0fc\") " pod="kube-system/global-pull-secret-syncer-5qr7d" Apr 22 17:34:53.027859 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:53.027826 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8909ab65-d100-4702-8a4a-29958f11d0fc-original-pull-secret\") pod \"global-pull-secret-syncer-5qr7d\" (UID: \"8909ab65-d100-4702-8a4a-29958f11d0fc\") " pod="kube-system/global-pull-secret-syncer-5qr7d" Apr 22 17:34:53.027974 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:53.027876 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8909ab65-d100-4702-8a4a-29958f11d0fc-kubelet-config\") pod \"global-pull-secret-syncer-5qr7d\" (UID: \"8909ab65-d100-4702-8a4a-29958f11d0fc\") " pod="kube-system/global-pull-secret-syncer-5qr7d" Apr 22 17:34:53.027974 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:53.027895 2539 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 17:34:53.027974 ip-10-0-132-165 kubenswrapper[2539]: 
I0422 17:34:53.027939 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8909ab65-d100-4702-8a4a-29958f11d0fc-dbus\") pod \"global-pull-secret-syncer-5qr7d\" (UID: \"8909ab65-d100-4702-8a4a-29958f11d0fc\") " pod="kube-system/global-pull-secret-syncer-5qr7d" Apr 22 17:34:53.027974 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:53.027947 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8909ab65-d100-4702-8a4a-29958f11d0fc-original-pull-secret podName:8909ab65-d100-4702-8a4a-29958f11d0fc nodeName:}" failed. No retries permitted until 2026-04-22 17:34:53.527936997 +0000 UTC m=+2.284021551 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8909ab65-d100-4702-8a4a-29958f11d0fc-original-pull-secret") pod "global-pull-secret-syncer-5qr7d" (UID: "8909ab65-d100-4702-8a4a-29958f11d0fc") : object "kube-system"/"original-pull-secret" not registered Apr 22 17:34:53.031011 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:53.030988 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda69dead2_c622_4a13_a5b4_5367b68c10a8.slice/crio-e5e66db23b39348cbe03e8fd90a027af2ccf7f6775e909fc6e5439033332e97e WatchSource:0}: Error finding container e5e66db23b39348cbe03e8fd90a027af2ccf7f6775e909fc6e5439033332e97e: Status 404 returned error can't find the container with id e5e66db23b39348cbe03e8fd90a027af2ccf7f6775e909fc6e5439033332e97e Apr 22 17:34:53.038986 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:53.038969 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ljqnc" Apr 22 17:34:53.044536 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:53.044519 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd0fea28_dd43_4ccb_b04e_eec2e0821997.slice/crio-519ca898e48cae7ec24eeb2f6ce471e1f6d7114008c63b05445e897709fece6c WatchSource:0}: Error finding container 519ca898e48cae7ec24eeb2f6ce471e1f6d7114008c63b05445e897709fece6c: Status 404 returned error can't find the container with id 519ca898e48cae7ec24eeb2f6ce471e1f6d7114008c63b05445e897709fece6c Apr 22 17:34:53.066536 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:53.066515 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-dm5wr" Apr 22 17:34:53.071469 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:53.071448 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod011dd50c_8a4c_425d_bb7b_86836dfc52f7.slice/crio-a5dc7e0ed5c93c5fdbfbf37d83acc5ae62db027967e3d3bf5ceb7478c24f5a1a WatchSource:0}: Error finding container a5dc7e0ed5c93c5fdbfbf37d83acc5ae62db027967e3d3bf5ceb7478c24f5a1a: Status 404 returned error can't find the container with id a5dc7e0ed5c93c5fdbfbf37d83acc5ae62db027967e3d3bf5ceb7478c24f5a1a Apr 22 17:34:53.074160 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:53.074147 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-b6m8h" Apr 22 17:34:53.079290 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:53.079273 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-h2ghv" Apr 22 17:34:53.080142 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:53.080125 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42d2a92c_09c8_4e3a_aa4a_19642356de88.slice/crio-92d9086d789c972471f5856fdc23ce3bb980369190a0c7d761f88e223fc5327d WatchSource:0}: Error finding container 92d9086d789c972471f5856fdc23ce3bb980369190a0c7d761f88e223fc5327d: Status 404 returned error can't find the container with id 92d9086d789c972471f5856fdc23ce3bb980369190a0c7d761f88e223fc5327d Apr 22 17:34:53.085259 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:53.085240 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa88e93d_3982_4f87_8dc7_cef2900ab3a3.slice/crio-cba54d914f289db6b4fe55146d0c7dece0689c979fb3bfb86f1f51057e0a4211 WatchSource:0}: Error finding container cba54d914f289db6b4fe55146d0c7dece0689c979fb3bfb86f1f51057e0a4211: Status 404 returned error can't find the container with id cba54d914f289db6b4fe55146d0c7dece0689c979fb3bfb86f1f51057e0a4211 Apr 22 17:34:53.085805 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:53.085786 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-gxs6f" Apr 22 17:34:53.091950 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:53.091931 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba45db53_69e8_40c3_8090_98739842b87e.slice/crio-1eff74fd04457726a70cb0223c530744108c320623438d3b992af184eebb34af WatchSource:0}: Error finding container 1eff74fd04457726a70cb0223c530744108c320623438d3b992af184eebb34af: Status 404 returned error can't find the container with id 1eff74fd04457726a70cb0223c530744108c320623438d3b992af184eebb34af Apr 22 17:34:53.092045 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:53.092029 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-d4jd9" Apr 22 17:34:53.097987 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:53.097973 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ssfr9" Apr 22 17:34:53.098087 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:53.098027 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0f702a5_e6ca_4251_9925_a6fc437042f8.slice/crio-3bb8558870b5caf654916f00fcdf71744dd333cdf901ce182233f5b5afcc9fed WatchSource:0}: Error finding container 3bb8558870b5caf654916f00fcdf71744dd333cdf901ce182233f5b5afcc9fed: Status 404 returned error can't find the container with id 3bb8558870b5caf654916f00fcdf71744dd333cdf901ce182233f5b5afcc9fed Apr 22 17:34:53.102123 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:53.102110 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6gk5m" Apr 22 17:34:53.105726 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:53.105709 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e284c23_9cc3_44a6_ad18_85df6586fcd5.slice/crio-47d94bbabd778d01d0b6870e83a15e7ddeb4511c645ecd951048e3a049f441a1 WatchSource:0}: Error finding container 47d94bbabd778d01d0b6870e83a15e7ddeb4511c645ecd951048e3a049f441a1: Status 404 returned error can't find the container with id 47d94bbabd778d01d0b6870e83a15e7ddeb4511c645ecd951048e3a049f441a1 Apr 22 17:34:53.109281 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:34:53.109263 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24bafee9_4454_456f_b2aa_04131f945624.slice/crio-057da9dbfa075c119250481a1bed24a2ec7a059feedf79a8c8992f44f98ad3cf WatchSource:0}: Error finding container 057da9dbfa075c119250481a1bed24a2ec7a059feedf79a8c8992f44f98ad3cf: Status 404 returned error can't find the container with id 057da9dbfa075c119250481a1bed24a2ec7a059feedf79a8c8992f44f98ad3cf Apr 22 17:34:53.141589 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:53.141571 2539 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:34:53.329970 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:53.329403 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a15342ff-f78f-4d33-aed1-0e9c86dbdb15-metrics-certs\") pod \"network-metrics-daemon-8bxsz\" (UID: \"a15342ff-f78f-4d33-aed1-0e9c86dbdb15\") " pod="openshift-multus/network-metrics-daemon-8bxsz" Apr 22 17:34:53.329970 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:53.329574 2539 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:34:53.329970 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:53.329632 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a15342ff-f78f-4d33-aed1-0e9c86dbdb15-metrics-certs podName:a15342ff-f78f-4d33-aed1-0e9c86dbdb15 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:54.329614872 +0000 UTC m=+3.085699425 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a15342ff-f78f-4d33-aed1-0e9c86dbdb15-metrics-certs") pod "network-metrics-daemon-8bxsz" (UID: "a15342ff-f78f-4d33-aed1-0e9c86dbdb15") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:34:53.430509 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:53.429826 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bnnd\" (UniqueName: \"kubernetes.io/projected/4c704c4d-f399-4811-9b83-dc2a18d55074-kube-api-access-2bnnd\") pod \"network-check-target-rghzd\" (UID: \"4c704c4d-f399-4811-9b83-dc2a18d55074\") " pod="openshift-network-diagnostics/network-check-target-rghzd"
Apr 22 17:34:53.430509 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:53.430044 2539 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:34:53.430509 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:53.430066 2539 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:34:53.430509 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:53.430078 2539 projected.go:194] Error preparing data for projected volume kube-api-access-2bnnd for pod openshift-network-diagnostics/network-check-target-rghzd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:34:53.430509 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:53.430133 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4c704c4d-f399-4811-9b83-dc2a18d55074-kube-api-access-2bnnd podName:4c704c4d-f399-4811-9b83-dc2a18d55074 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:54.430113269 +0000 UTC m=+3.186197840 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-2bnnd" (UniqueName: "kubernetes.io/projected/4c704c4d-f399-4811-9b83-dc2a18d55074-kube-api-access-2bnnd") pod "network-check-target-rghzd" (UID: "4c704c4d-f399-4811-9b83-dc2a18d55074") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:34:53.443838 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:53.443743 2539 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:34:53.530513 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:53.530483 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8909ab65-d100-4702-8a4a-29958f11d0fc-original-pull-secret\") pod \"global-pull-secret-syncer-5qr7d\" (UID: \"8909ab65-d100-4702-8a4a-29958f11d0fc\") " pod="kube-system/global-pull-secret-syncer-5qr7d"
Apr 22 17:34:53.530691 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:53.530654 2539 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 17:34:53.530759 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:53.530710 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8909ab65-d100-4702-8a4a-29958f11d0fc-original-pull-secret podName:8909ab65-d100-4702-8a4a-29958f11d0fc nodeName:}" failed. No retries permitted until 2026-04-22 17:34:54.530691805 +0000 UTC m=+3.286776370 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8909ab65-d100-4702-8a4a-29958f11d0fc-original-pull-secret") pod "global-pull-secret-syncer-5qr7d" (UID: "8909ab65-d100-4702-8a4a-29958f11d0fc") : object "kube-system"/"original-pull-secret" not registered
Apr 22 17:34:53.792824 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:53.792740 2539 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 17:29:52 +0000 UTC" deadline="2027-10-28 10:43:17.279400438 +0000 UTC"
Apr 22 17:34:53.792824 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:53.792778 2539 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13289h8m23.486627053s"
Apr 22 17:34:53.843810 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:53.843782 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rghzd"
Apr 22 17:34:53.843981 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:53.843917 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rghzd" podUID="4c704c4d-f399-4811-9b83-dc2a18d55074"
Apr 22 17:34:53.858990 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:53.858962 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6gk5m" event={"ID":"24bafee9-4454-456f-b2aa-04131f945624","Type":"ContainerStarted","Data":"057da9dbfa075c119250481a1bed24a2ec7a059feedf79a8c8992f44f98ad3cf"}
Apr 22 17:34:53.875848 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:53.875802 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d4jd9" event={"ID":"f0f702a5-e6ca-4251-9925-a6fc437042f8","Type":"ContainerStarted","Data":"3bb8558870b5caf654916f00fcdf71744dd333cdf901ce182233f5b5afcc9fed"}
Apr 22 17:34:53.894053 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:53.893984 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-h2ghv" event={"ID":"aa88e93d-3982-4f87-8dc7-cef2900ab3a3","Type":"ContainerStarted","Data":"cba54d914f289db6b4fe55146d0c7dece0689c979fb3bfb86f1f51057e0a4211"}
Apr 22 17:34:53.914478 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:53.914429 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-dm5wr" event={"ID":"011dd50c-8a4c-425d-bb7b-86836dfc52f7","Type":"ContainerStarted","Data":"a5dc7e0ed5c93c5fdbfbf37d83acc5ae62db027967e3d3bf5ceb7478c24f5a1a"}
Apr 22 17:34:53.922922 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:53.922869 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" event={"ID":"a69dead2-c622-4a13-a5b4-5367b68c10a8","Type":"ContainerStarted","Data":"e5e66db23b39348cbe03e8fd90a027af2ccf7f6775e909fc6e5439033332e97e"}
Apr 22 17:34:53.926181 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:53.926157 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ssfr9" event={"ID":"6e284c23-9cc3-44a6-ad18-85df6586fcd5","Type":"ContainerStarted","Data":"47d94bbabd778d01d0b6870e83a15e7ddeb4511c645ecd951048e3a049f441a1"}
Apr 22 17:34:53.933681 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:53.933660 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8909ab65-d100-4702-8a4a-29958f11d0fc-dbus\") pod \"8909ab65-d100-4702-8a4a-29958f11d0fc\" (UID: \"8909ab65-d100-4702-8a4a-29958f11d0fc\") "
Apr 22 17:34:53.933768 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:53.933705 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8909ab65-d100-4702-8a4a-29958f11d0fc-kubelet-config\") pod \"8909ab65-d100-4702-8a4a-29958f11d0fc\" (UID: \"8909ab65-d100-4702-8a4a-29958f11d0fc\") "
Apr 22 17:34:53.933947 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:53.933927 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8909ab65-d100-4702-8a4a-29958f11d0fc-kubelet-config" (OuterVolumeSpecName: "kubelet-config") pod "8909ab65-d100-4702-8a4a-29958f11d0fc" (UID: "8909ab65-d100-4702-8a4a-29958f11d0fc"). InnerVolumeSpecName "kubelet-config". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Apr 22 17:34:53.934019 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:53.933969 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8909ab65-d100-4702-8a4a-29958f11d0fc-dbus" (OuterVolumeSpecName: "dbus") pod "8909ab65-d100-4702-8a4a-29958f11d0fc" (UID: "8909ab65-d100-4702-8a4a-29958f11d0fc"). InnerVolumeSpecName "dbus". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Apr 22 17:34:53.948770 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:53.948689 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gxs6f" event={"ID":"ba45db53-69e8-40c3-8090-98739842b87e","Type":"ContainerStarted","Data":"1eff74fd04457726a70cb0223c530744108c320623438d3b992af184eebb34af"}
Apr 22 17:34:53.963877 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:53.963844 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-b6m8h" event={"ID":"42d2a92c-09c8-4e3a-aa4a-19642356de88","Type":"ContainerStarted","Data":"92d9086d789c972471f5856fdc23ce3bb980369190a0c7d761f88e223fc5327d"}
Apr 22 17:34:53.978054 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:53.978027 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ljqnc" event={"ID":"bd0fea28-dd43-4ccb-b04e-eec2e0821997","Type":"ContainerStarted","Data":"519ca898e48cae7ec24eeb2f6ce471e1f6d7114008c63b05445e897709fece6c"}
Apr 22 17:34:54.009357 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:54.009327 2539 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:34:54.034755 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:54.034731 2539 reconciler_common.go:299] "Volume detached for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8909ab65-d100-4702-8a4a-29958f11d0fc-dbus\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\""
Apr 22 17:34:54.034928 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:54.034777 2539 reconciler_common.go:299] "Volume detached for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8909ab65-d100-4702-8a4a-29958f11d0fc-kubelet-config\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\""
Apr 22 17:34:54.336919 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:54.336870 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a15342ff-f78f-4d33-aed1-0e9c86dbdb15-metrics-certs\") pod \"network-metrics-daemon-8bxsz\" (UID: \"a15342ff-f78f-4d33-aed1-0e9c86dbdb15\") " pod="openshift-multus/network-metrics-daemon-8bxsz"
Apr 22 17:34:54.337091 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:54.337052 2539 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:34:54.337153 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:54.337113 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a15342ff-f78f-4d33-aed1-0e9c86dbdb15-metrics-certs podName:a15342ff-f78f-4d33-aed1-0e9c86dbdb15 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:56.337095283 +0000 UTC m=+5.093179836 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a15342ff-f78f-4d33-aed1-0e9c86dbdb15-metrics-certs") pod "network-metrics-daemon-8bxsz" (UID: "a15342ff-f78f-4d33-aed1-0e9c86dbdb15") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:34:54.377190 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:54.377167 2539 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:34:54.438288 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:54.438257 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bnnd\" (UniqueName: \"kubernetes.io/projected/4c704c4d-f399-4811-9b83-dc2a18d55074-kube-api-access-2bnnd\") pod \"network-check-target-rghzd\" (UID: \"4c704c4d-f399-4811-9b83-dc2a18d55074\") " pod="openshift-network-diagnostics/network-check-target-rghzd"
Apr 22 17:34:54.438457 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:54.438442 2539 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:34:54.438516 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:54.438466 2539 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:34:54.438516 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:54.438480 2539 projected.go:194] Error preparing data for projected volume kube-api-access-2bnnd for pod openshift-network-diagnostics/network-check-target-rghzd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:34:54.438621 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:54.438538 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4c704c4d-f399-4811-9b83-dc2a18d55074-kube-api-access-2bnnd podName:4c704c4d-f399-4811-9b83-dc2a18d55074 nodeName:}" failed. No retries permitted until 2026-04-22 17:34:56.438517762 +0000 UTC m=+5.194602333 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-2bnnd" (UniqueName: "kubernetes.io/projected/4c704c4d-f399-4811-9b83-dc2a18d55074-kube-api-access-2bnnd") pod "network-check-target-rghzd" (UID: "4c704c4d-f399-4811-9b83-dc2a18d55074") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:34:54.539136 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:54.539104 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8909ab65-d100-4702-8a4a-29958f11d0fc-original-pull-secret\") pod \"global-pull-secret-syncer-5qr7d\" (UID: \"8909ab65-d100-4702-8a4a-29958f11d0fc\") " pod="kube-system/global-pull-secret-syncer-5qr7d"
Apr 22 17:34:54.539307 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:54.539296 2539 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 17:34:54.539371 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:54.539354 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8909ab65-d100-4702-8a4a-29958f11d0fc-original-pull-secret podName:8909ab65-d100-4702-8a4a-29958f11d0fc nodeName:}" failed. No retries permitted until 2026-04-22 17:34:56.53933581 +0000 UTC m=+5.295420374 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8909ab65-d100-4702-8a4a-29958f11d0fc-original-pull-secret") pod "global-pull-secret-syncer-5qr7d" (UID: "8909ab65-d100-4702-8a4a-29958f11d0fc") : object "kube-system"/"original-pull-secret" not registered
Apr 22 17:34:54.793841 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:54.793725 2539 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 17:29:52 +0000 UTC" deadline="2028-02-04 02:53:24.36772705 +0000 UTC"
Apr 22 17:34:54.793841 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:54.793796 2539 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15657h18m29.573935568s"
Apr 22 17:34:54.841954 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:54.841893 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bxsz"
Apr 22 17:34:54.842121 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:54.842044 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bxsz" podUID="a15342ff-f78f-4d33-aed1-0e9c86dbdb15"
Apr 22 17:34:55.844754 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:55.844226 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rghzd"
Apr 22 17:34:55.844754 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:55.844346 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rghzd" podUID="4c704c4d-f399-4811-9b83-dc2a18d55074"
Apr 22 17:34:56.356100 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:56.356014 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a15342ff-f78f-4d33-aed1-0e9c86dbdb15-metrics-certs\") pod \"network-metrics-daemon-8bxsz\" (UID: \"a15342ff-f78f-4d33-aed1-0e9c86dbdb15\") " pod="openshift-multus/network-metrics-daemon-8bxsz"
Apr 22 17:34:56.356290 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:56.356154 2539 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:34:56.356290 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:56.356217 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a15342ff-f78f-4d33-aed1-0e9c86dbdb15-metrics-certs podName:a15342ff-f78f-4d33-aed1-0e9c86dbdb15 nodeName:}" failed. No retries permitted until 2026-04-22 17:35:00.356198182 +0000 UTC m=+9.112282734 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a15342ff-f78f-4d33-aed1-0e9c86dbdb15-metrics-certs") pod "network-metrics-daemon-8bxsz" (UID: "a15342ff-f78f-4d33-aed1-0e9c86dbdb15") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:34:56.456633 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:56.456600 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bnnd\" (UniqueName: \"kubernetes.io/projected/4c704c4d-f399-4811-9b83-dc2a18d55074-kube-api-access-2bnnd\") pod \"network-check-target-rghzd\" (UID: \"4c704c4d-f399-4811-9b83-dc2a18d55074\") " pod="openshift-network-diagnostics/network-check-target-rghzd"
Apr 22 17:34:56.456814 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:56.456786 2539 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:34:56.456874 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:56.456814 2539 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:34:56.456874 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:56.456829 2539 projected.go:194] Error preparing data for projected volume kube-api-access-2bnnd for pod openshift-network-diagnostics/network-check-target-rghzd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:34:56.457036 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:56.456888 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4c704c4d-f399-4811-9b83-dc2a18d55074-kube-api-access-2bnnd podName:4c704c4d-f399-4811-9b83-dc2a18d55074 nodeName:}" failed. No retries permitted until 2026-04-22 17:35:00.456868617 +0000 UTC m=+9.212953185 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-2bnnd" (UniqueName: "kubernetes.io/projected/4c704c4d-f399-4811-9b83-dc2a18d55074-kube-api-access-2bnnd") pod "network-check-target-rghzd" (UID: "4c704c4d-f399-4811-9b83-dc2a18d55074") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:34:56.557250 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:56.557214 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8909ab65-d100-4702-8a4a-29958f11d0fc-original-pull-secret\") pod \"global-pull-secret-syncer-5qr7d\" (UID: \"8909ab65-d100-4702-8a4a-29958f11d0fc\") " pod="kube-system/global-pull-secret-syncer-5qr7d"
Apr 22 17:34:56.557432 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:56.557403 2539 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 17:34:56.557506 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:56.557479 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8909ab65-d100-4702-8a4a-29958f11d0fc-original-pull-secret podName:8909ab65-d100-4702-8a4a-29958f11d0fc nodeName:}" failed. No retries permitted until 2026-04-22 17:35:00.557457954 +0000 UTC m=+9.313542520 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8909ab65-d100-4702-8a4a-29958f11d0fc-original-pull-secret") pod "global-pull-secret-syncer-5qr7d" (UID: "8909ab65-d100-4702-8a4a-29958f11d0fc") : object "kube-system"/"original-pull-secret" not registered
Apr 22 17:34:56.842602 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:56.842080 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bxsz"
Apr 22 17:34:56.842602 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:56.842238 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bxsz" podUID="a15342ff-f78f-4d33-aed1-0e9c86dbdb15"
Apr 22 17:34:57.841925 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:57.841795 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rghzd"
Apr 22 17:34:57.842354 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:57.841937 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rghzd" podUID="4c704c4d-f399-4811-9b83-dc2a18d55074"
Apr 22 17:34:58.842025 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:58.841992 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bxsz"
Apr 22 17:34:58.842482 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:58.842124 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bxsz" podUID="a15342ff-f78f-4d33-aed1-0e9c86dbdb15"
Apr 22 17:34:59.842416 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:34:59.842352 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rghzd"
Apr 22 17:34:59.842859 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:34:59.842563 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rghzd" podUID="4c704c4d-f399-4811-9b83-dc2a18d55074"
Apr 22 17:35:00.386623 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:00.386538 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a15342ff-f78f-4d33-aed1-0e9c86dbdb15-metrics-certs\") pod \"network-metrics-daemon-8bxsz\" (UID: \"a15342ff-f78f-4d33-aed1-0e9c86dbdb15\") " pod="openshift-multus/network-metrics-daemon-8bxsz"
Apr 22 17:35:00.386796 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:00.386730 2539 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:35:00.386796 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:00.386796 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a15342ff-f78f-4d33-aed1-0e9c86dbdb15-metrics-certs podName:a15342ff-f78f-4d33-aed1-0e9c86dbdb15 nodeName:}" failed. No retries permitted until 2026-04-22 17:35:08.386777875 +0000 UTC m=+17.142862440 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a15342ff-f78f-4d33-aed1-0e9c86dbdb15-metrics-certs") pod "network-metrics-daemon-8bxsz" (UID: "a15342ff-f78f-4d33-aed1-0e9c86dbdb15") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:35:00.488032 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:00.487953 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bnnd\" (UniqueName: \"kubernetes.io/projected/4c704c4d-f399-4811-9b83-dc2a18d55074-kube-api-access-2bnnd\") pod \"network-check-target-rghzd\" (UID: \"4c704c4d-f399-4811-9b83-dc2a18d55074\") " pod="openshift-network-diagnostics/network-check-target-rghzd"
Apr 22 17:35:00.488197 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:00.488170 2539 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:35:00.488197 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:00.488196 2539 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:35:00.488302 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:00.488209 2539 projected.go:194] Error preparing data for projected volume kube-api-access-2bnnd for pod openshift-network-diagnostics/network-check-target-rghzd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:35:00.488302 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:00.488283 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4c704c4d-f399-4811-9b83-dc2a18d55074-kube-api-access-2bnnd podName:4c704c4d-f399-4811-9b83-dc2a18d55074 nodeName:}" failed. No retries permitted until 2026-04-22 17:35:08.488263176 +0000 UTC m=+17.244347731 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-2bnnd" (UniqueName: "kubernetes.io/projected/4c704c4d-f399-4811-9b83-dc2a18d55074-kube-api-access-2bnnd") pod "network-check-target-rghzd" (UID: "4c704c4d-f399-4811-9b83-dc2a18d55074") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:35:00.589290 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:00.589256 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8909ab65-d100-4702-8a4a-29958f11d0fc-original-pull-secret\") pod \"global-pull-secret-syncer-5qr7d\" (UID: \"8909ab65-d100-4702-8a4a-29958f11d0fc\") " pod="kube-system/global-pull-secret-syncer-5qr7d"
Apr 22 17:35:00.595164 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:00.593480 2539 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 17:35:00.595164 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:00.593571 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8909ab65-d100-4702-8a4a-29958f11d0fc-original-pull-secret podName:8909ab65-d100-4702-8a4a-29958f11d0fc nodeName:}" failed. No retries permitted until 2026-04-22 17:35:08.593549359 +0000 UTC m=+17.349633924 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8909ab65-d100-4702-8a4a-29958f11d0fc-original-pull-secret") pod "global-pull-secret-syncer-5qr7d" (UID: "8909ab65-d100-4702-8a4a-29958f11d0fc") : object "kube-system"/"original-pull-secret" not registered
Apr 22 17:35:00.842078 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:00.841987 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bxsz"
Apr 22 17:35:00.842243 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:00.842131 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bxsz" podUID="a15342ff-f78f-4d33-aed1-0e9c86dbdb15"
Apr 22 17:35:01.843769 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:01.843738 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rghzd"
Apr 22 17:35:01.844256 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:01.843860 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rghzd" podUID="4c704c4d-f399-4811-9b83-dc2a18d55074"
Apr 22 17:35:01.862387 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:01.862354 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kube-system/global-pull-secret-syncer-5qr7d"]
Apr 22 17:35:01.862575 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:01.862471 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5qr7d"
Apr 22 17:35:01.865768 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:01.865535 2539 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kube-system/global-pull-secret-syncer-5qr7d"]
Apr 22 17:35:01.867488 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:01.867453 2539 status_manager.go:895] "Failed to get status for pod" podUID="8909ab65-d100-4702-8a4a-29958f11d0fc" pod="kube-system/global-pull-secret-syncer-5qr7d" err="pods \"global-pull-secret-syncer-5qr7d\" is forbidden: User \"system:node:ip-10-0-132-165.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'ip-10-0-132-165.ec2.internal' and this object"
Apr 22 17:35:01.893986 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:01.893958 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-mnmvf"]
Apr 22 17:35:01.897089 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:01.897060 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mnmvf"
Apr 22 17:35:01.897192 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:01.897145 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mnmvf" podUID="e5741809-3555-46d2-99ca-58fd55e0acdc"
Apr 22 17:35:01.899380 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:01.899350 2539 status_manager.go:895] "Failed to get status for pod" podUID="8909ab65-d100-4702-8a4a-29958f11d0fc" pod="kube-system/global-pull-secret-syncer-5qr7d" err="pods \"global-pull-secret-syncer-5qr7d\" is forbidden: User \"system:node:ip-10-0-132-165.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'ip-10-0-132-165.ec2.internal' and this object"
Apr 22 17:35:01.900615 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:01.900589 2539 reconciler_common.go:299] "Volume detached for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8909ab65-d100-4702-8a4a-29958f11d0fc-original-pull-secret\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\""
Apr 22 17:35:02.001918 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:02.001874 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e5741809-3555-46d2-99ca-58fd55e0acdc-kubelet-config\") pod \"global-pull-secret-syncer-mnmvf\" (UID: \"e5741809-3555-46d2-99ca-58fd55e0acdc\") " pod="kube-system/global-pull-secret-syncer-mnmvf"
Apr 22 17:35:02.002082 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:02.001930 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e5741809-3555-46d2-99ca-58fd55e0acdc-original-pull-secret\") pod \"global-pull-secret-syncer-mnmvf\" (UID: \"e5741809-3555-46d2-99ca-58fd55e0acdc\") " pod="kube-system/global-pull-secret-syncer-mnmvf"
Apr 22 17:35:02.002082 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:02.001979 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e5741809-3555-46d2-99ca-58fd55e0acdc-dbus\") pod \"global-pull-secret-syncer-mnmvf\" (UID: \"e5741809-3555-46d2-99ca-58fd55e0acdc\") " pod="kube-system/global-pull-secret-syncer-mnmvf"
Apr 22 17:35:02.102357 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:02.102274 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e5741809-3555-46d2-99ca-58fd55e0acdc-kubelet-config\") pod \"global-pull-secret-syncer-mnmvf\" (UID: \"e5741809-3555-46d2-99ca-58fd55e0acdc\") " pod="kube-system/global-pull-secret-syncer-mnmvf"
Apr 22 17:35:02.102357 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:02.102317 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e5741809-3555-46d2-99ca-58fd55e0acdc-original-pull-secret\") pod \"global-pull-secret-syncer-mnmvf\" (UID: \"e5741809-3555-46d2-99ca-58fd55e0acdc\") " pod="kube-system/global-pull-secret-syncer-mnmvf"
Apr 22 17:35:02.102582 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:02.102366 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e5741809-3555-46d2-99ca-58fd55e0acdc-dbus\") pod \"global-pull-secret-syncer-mnmvf\" (UID: \"e5741809-3555-46d2-99ca-58fd55e0acdc\") " pod="kube-system/global-pull-secret-syncer-mnmvf"
Apr 22 17:35:02.102582 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:02.102400 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e5741809-3555-46d2-99ca-58fd55e0acdc-kubelet-config\") pod \"global-pull-secret-syncer-mnmvf\" (UID: \"e5741809-3555-46d2-99ca-58fd55e0acdc\") " pod="kube-system/global-pull-secret-syncer-mnmvf"
Apr 22 17:35:02.102582 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:02.102468 2539 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 17:35:02.102582 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:02.102493 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e5741809-3555-46d2-99ca-58fd55e0acdc-dbus\") pod \"global-pull-secret-syncer-mnmvf\" (UID: \"e5741809-3555-46d2-99ca-58fd55e0acdc\") " pod="kube-system/global-pull-secret-syncer-mnmvf"
Apr 22 17:35:02.102582 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:02.102545 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5741809-3555-46d2-99ca-58fd55e0acdc-original-pull-secret podName:e5741809-3555-46d2-99ca-58fd55e0acdc nodeName:}" failed. No retries permitted until 2026-04-22 17:35:02.60252276 +0000 UTC m=+11.358607323 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e5741809-3555-46d2-99ca-58fd55e0acdc-original-pull-secret") pod "global-pull-secret-syncer-mnmvf" (UID: "e5741809-3555-46d2-99ca-58fd55e0acdc") : object "kube-system"/"original-pull-secret" not registered
Apr 22 17:35:02.605486 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:02.605454 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e5741809-3555-46d2-99ca-58fd55e0acdc-original-pull-secret\") pod \"global-pull-secret-syncer-mnmvf\" (UID: \"e5741809-3555-46d2-99ca-58fd55e0acdc\") " pod="kube-system/global-pull-secret-syncer-mnmvf"
Apr 22 17:35:02.605637 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:02.605551 2539 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 17:35:02.605637 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:02.605629 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5741809-3555-46d2-99ca-58fd55e0acdc-original-pull-secret podName:e5741809-3555-46d2-99ca-58fd55e0acdc nodeName:}" failed. No retries permitted until 2026-04-22 17:35:03.605607464 +0000 UTC m=+12.361692031 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e5741809-3555-46d2-99ca-58fd55e0acdc-original-pull-secret") pod "global-pull-secret-syncer-mnmvf" (UID: "e5741809-3555-46d2-99ca-58fd55e0acdc") : object "kube-system"/"original-pull-secret" not registered
Apr 22 17:35:02.841277 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:02.841241 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bxsz"
Apr 22 17:35:02.841443 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:02.841365 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-8bxsz" podUID="a15342ff-f78f-4d33-aed1-0e9c86dbdb15" Apr 22 17:35:03.612940 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:03.612884 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e5741809-3555-46d2-99ca-58fd55e0acdc-original-pull-secret\") pod \"global-pull-secret-syncer-mnmvf\" (UID: \"e5741809-3555-46d2-99ca-58fd55e0acdc\") " pod="kube-system/global-pull-secret-syncer-mnmvf" Apr 22 17:35:03.613363 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:03.613063 2539 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 17:35:03.613363 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:03.613146 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5741809-3555-46d2-99ca-58fd55e0acdc-original-pull-secret podName:e5741809-3555-46d2-99ca-58fd55e0acdc nodeName:}" failed. No retries permitted until 2026-04-22 17:35:05.613125968 +0000 UTC m=+14.369210532 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e5741809-3555-46d2-99ca-58fd55e0acdc-original-pull-secret") pod "global-pull-secret-syncer-mnmvf" (UID: "e5741809-3555-46d2-99ca-58fd55e0acdc") : object "kube-system"/"original-pull-secret" not registered Apr 22 17:35:03.841556 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:03.841520 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mnmvf" Apr 22 17:35:03.841730 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:03.841522 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rghzd" Apr 22 17:35:03.841730 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:03.841683 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mnmvf" podUID="e5741809-3555-46d2-99ca-58fd55e0acdc" Apr 22 17:35:03.841849 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:03.841740 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rghzd" podUID="4c704c4d-f399-4811-9b83-dc2a18d55074" Apr 22 17:35:04.841882 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:04.841848 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bxsz" Apr 22 17:35:04.842310 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:04.842000 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8bxsz" podUID="a15342ff-f78f-4d33-aed1-0e9c86dbdb15" Apr 22 17:35:05.626032 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:05.625997 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e5741809-3555-46d2-99ca-58fd55e0acdc-original-pull-secret\") pod \"global-pull-secret-syncer-mnmvf\" (UID: \"e5741809-3555-46d2-99ca-58fd55e0acdc\") " pod="kube-system/global-pull-secret-syncer-mnmvf" Apr 22 17:35:05.626221 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:05.626172 2539 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 17:35:05.626298 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:05.626257 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5741809-3555-46d2-99ca-58fd55e0acdc-original-pull-secret podName:e5741809-3555-46d2-99ca-58fd55e0acdc nodeName:}" failed. No retries permitted until 2026-04-22 17:35:09.626235872 +0000 UTC m=+18.382320444 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e5741809-3555-46d2-99ca-58fd55e0acdc-original-pull-secret") pod "global-pull-secret-syncer-mnmvf" (UID: "e5741809-3555-46d2-99ca-58fd55e0acdc") : object "kube-system"/"original-pull-secret" not registered Apr 22 17:35:05.841440 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:05.841404 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mnmvf" Apr 22 17:35:05.841618 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:05.841415 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rghzd" Apr 22 17:35:05.841618 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:05.841512 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mnmvf" podUID="e5741809-3555-46d2-99ca-58fd55e0acdc" Apr 22 17:35:05.841618 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:05.841601 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rghzd" podUID="4c704c4d-f399-4811-9b83-dc2a18d55074" Apr 22 17:35:06.841730 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:06.841654 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bxsz" Apr 22 17:35:06.842185 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:06.841765 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bxsz" podUID="a15342ff-f78f-4d33-aed1-0e9c86dbdb15" Apr 22 17:35:07.841843 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:07.841801 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-mnmvf" Apr 22 17:35:07.841843 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:07.841844 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rghzd" Apr 22 17:35:07.842317 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:07.841935 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mnmvf" podUID="e5741809-3555-46d2-99ca-58fd55e0acdc" Apr 22 17:35:07.842317 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:07.842055 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rghzd" podUID="4c704c4d-f399-4811-9b83-dc2a18d55074" Apr 22 17:35:08.448732 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:08.448682 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a15342ff-f78f-4d33-aed1-0e9c86dbdb15-metrics-certs\") pod \"network-metrics-daemon-8bxsz\" (UID: \"a15342ff-f78f-4d33-aed1-0e9c86dbdb15\") " pod="openshift-multus/network-metrics-daemon-8bxsz" Apr 22 17:35:08.448965 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:08.448851 2539 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:35:08.448965 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:08.448942 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a15342ff-f78f-4d33-aed1-0e9c86dbdb15-metrics-certs podName:a15342ff-f78f-4d33-aed1-0e9c86dbdb15 nodeName:}" failed. No retries permitted until 2026-04-22 17:35:24.448924342 +0000 UTC m=+33.205008897 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a15342ff-f78f-4d33-aed1-0e9c86dbdb15-metrics-certs") pod "network-metrics-daemon-8bxsz" (UID: "a15342ff-f78f-4d33-aed1-0e9c86dbdb15") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:35:08.549713 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:08.549676 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bnnd\" (UniqueName: \"kubernetes.io/projected/4c704c4d-f399-4811-9b83-dc2a18d55074-kube-api-access-2bnnd\") pod \"network-check-target-rghzd\" (UID: \"4c704c4d-f399-4811-9b83-dc2a18d55074\") " pod="openshift-network-diagnostics/network-check-target-rghzd" Apr 22 17:35:08.549923 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:08.549867 2539 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:35:08.549923 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:08.549896 2539 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:35:08.550059 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:08.549926 2539 projected.go:194] Error preparing data for projected volume kube-api-access-2bnnd for pod openshift-network-diagnostics/network-check-target-rghzd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:35:08.550059 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:08.549984 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4c704c4d-f399-4811-9b83-dc2a18d55074-kube-api-access-2bnnd podName:4c704c4d-f399-4811-9b83-dc2a18d55074 nodeName:}" failed. 
No retries permitted until 2026-04-22 17:35:24.549969588 +0000 UTC m=+33.306054140 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-2bnnd" (UniqueName: "kubernetes.io/projected/4c704c4d-f399-4811-9b83-dc2a18d55074-kube-api-access-2bnnd") pod "network-check-target-rghzd" (UID: "4c704c4d-f399-4811-9b83-dc2a18d55074") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:35:08.841959 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:08.841845 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bxsz" Apr 22 17:35:08.842422 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:08.842005 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8bxsz" podUID="a15342ff-f78f-4d33-aed1-0e9c86dbdb15" Apr 22 17:35:09.658430 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:09.658373 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e5741809-3555-46d2-99ca-58fd55e0acdc-original-pull-secret\") pod \"global-pull-secret-syncer-mnmvf\" (UID: \"e5741809-3555-46d2-99ca-58fd55e0acdc\") " pod="kube-system/global-pull-secret-syncer-mnmvf" Apr 22 17:35:09.658614 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:09.658538 2539 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 17:35:09.658660 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:09.658615 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5741809-3555-46d2-99ca-58fd55e0acdc-original-pull-secret podName:e5741809-3555-46d2-99ca-58fd55e0acdc nodeName:}" failed. No retries permitted until 2026-04-22 17:35:17.658596876 +0000 UTC m=+26.414681433 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e5741809-3555-46d2-99ca-58fd55e0acdc-original-pull-secret") pod "global-pull-secret-syncer-mnmvf" (UID: "e5741809-3555-46d2-99ca-58fd55e0acdc") : object "kube-system"/"original-pull-secret" not registered Apr 22 17:35:09.841570 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:09.841528 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rghzd" Apr 22 17:35:09.841734 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:09.841528 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-mnmvf" Apr 22 17:35:09.841734 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:09.841655 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rghzd" podUID="4c704c4d-f399-4811-9b83-dc2a18d55074" Apr 22 17:35:09.841734 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:09.841718 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mnmvf" podUID="e5741809-3555-46d2-99ca-58fd55e0acdc" Apr 22 17:35:10.841232 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:10.841208 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bxsz" Apr 22 17:35:10.841557 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:10.841339 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8bxsz" podUID="a15342ff-f78f-4d33-aed1-0e9c86dbdb15" Apr 22 17:35:11.019566 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:11.019533 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-165.ec2.internal" event={"ID":"136192d3e6d2f42c0abb843d3e675799","Type":"ContainerStarted","Data":"30e59c99879676a08ff8503523bae832166e56dd77040cfff3bd572d0f4ff205"} Apr 22 17:35:11.022476 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:11.022451 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d4jd9" event={"ID":"f0f702a5-e6ca-4251-9925-a6fc437042f8","Type":"ContainerStarted","Data":"e82f17092ecb094632d191d4e6b6c97566ed50a9eeddaa037b1d1585f1542af3"} Apr 22 17:35:11.034268 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:11.034224 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-165.ec2.internal" podStartSLOduration=19.034211072 podStartE2EDuration="19.034211072s" podCreationTimestamp="2026-04-22 17:34:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:35:11.033829743 +0000 UTC m=+19.789914315" watchObservedRunningTime="2026-04-22 17:35:11.034211072 +0000 UTC m=+19.790295646" Apr 22 17:35:11.842942 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:11.842650 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mnmvf" Apr 22 17:35:11.843431 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:11.842997 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-mnmvf" podUID="e5741809-3555-46d2-99ca-58fd55e0acdc" Apr 22 17:35:11.843431 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:11.842733 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rghzd" Apr 22 17:35:11.843431 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:11.843123 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rghzd" podUID="4c704c4d-f399-4811-9b83-dc2a18d55074" Apr 22 17:35:12.025329 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:12.025220 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-h2ghv" event={"ID":"aa88e93d-3982-4f87-8dc7-cef2900ab3a3","Type":"ContainerStarted","Data":"da5c1f351168e00e1b23d7509d1d08ed61bbfb6825e99fd7043b6e4c9a0d4289"} Apr 22 17:35:12.026478 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:12.026453 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-dm5wr" event={"ID":"011dd50c-8a4c-425d-bb7b-86836dfc52f7","Type":"ContainerStarted","Data":"3421d962eaa8d28eb578c4601ba531cd2899e13cb663555033fc62388ed266ec"} Apr 22 17:35:12.028947 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:12.028923 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" event={"ID":"a69dead2-c622-4a13-a5b4-5367b68c10a8","Type":"ContainerStarted","Data":"12dd5127453bf97fc1a7ad95a06b5e26e621e5724dd254227374f4e4032521af"} Apr 22 17:35:12.028947 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:12.028949 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" 
event={"ID":"a69dead2-c622-4a13-a5b4-5367b68c10a8","Type":"ContainerStarted","Data":"6376bdc70efcf68debdca91d82f4b67f17e992597d72426801416fcffa19e611"} Apr 22 17:35:12.029089 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:12.028958 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" event={"ID":"a69dead2-c622-4a13-a5b4-5367b68c10a8","Type":"ContainerStarted","Data":"f28084749af8a01654d590af5b6aa97b765743c2037e95c5afabadcdbea6fc91"} Apr 22 17:35:12.029089 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:12.028967 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" event={"ID":"a69dead2-c622-4a13-a5b4-5367b68c10a8","Type":"ContainerStarted","Data":"757d5b313fddbcfef70c8b9611ff84b5e9768bc8470416c365d4bbf36b7d4657"} Apr 22 17:35:12.029089 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:12.028977 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" event={"ID":"a69dead2-c622-4a13-a5b4-5367b68c10a8","Type":"ContainerStarted","Data":"5ced254c1338eaaf1826b3831e0881eea3fab82ec5f69a924718764cdcb2f358"} Apr 22 17:35:12.029089 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:12.028992 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" event={"ID":"a69dead2-c622-4a13-a5b4-5367b68c10a8","Type":"ContainerStarted","Data":"b7acf8d95a740a400ef86f224d314b87796abd51f332a2b69c0c0e9a8af27738"} Apr 22 17:35:12.030038 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:12.030020 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ssfr9" event={"ID":"6e284c23-9cc3-44a6-ad18-85df6586fcd5","Type":"ContainerStarted","Data":"91d19d4a86beb0a3f42ce63c06f8c949ac6790d14674ca2a898da43c9882c146"} Apr 22 17:35:12.031181 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:12.031164 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/node-ca-gxs6f" event={"ID":"ba45db53-69e8-40c3-8090-98739842b87e","Type":"ContainerStarted","Data":"50564ef1e1f44b19a4009b10ad684227caf150453e14558119c7b165cfe8fc3e"} Apr 22 17:35:12.032322 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:12.032305 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ljqnc" event={"ID":"bd0fea28-dd43-4ccb-b04e-eec2e0821997","Type":"ContainerStarted","Data":"298bcf77cb0665334da9a6b20bfffec0e0c21e00fadc8b29f6bf9832e7ccc256"} Apr 22 17:35:12.033528 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:12.033505 2539 generic.go:358] "Generic (PLEG): container finished" podID="40399284665a2832f575e85e3fe44faa" containerID="644e95038aeee808eae955063bf4ba6ebf706e820d02ac58f4e95b836e04150c" exitCode=0 Apr 22 17:35:12.033674 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:12.033563 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-165.ec2.internal" event={"ID":"40399284665a2832f575e85e3fe44faa","Type":"ContainerDied","Data":"644e95038aeee808eae955063bf4ba6ebf706e820d02ac58f4e95b836e04150c"} Apr 22 17:35:12.036414 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:12.036392 2539 generic.go:358] "Generic (PLEG): container finished" podID="24bafee9-4454-456f-b2aa-04131f945624" containerID="fdb6a75de51c69c118a607f056d5daebe5b52f9e6fea93ca70f8e3ad9c13f107" exitCode=0 Apr 22 17:35:12.036497 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:12.036436 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6gk5m" event={"ID":"24bafee9-4454-456f-b2aa-04131f945624","Type":"ContainerDied","Data":"fdb6a75de51c69c118a607f056d5daebe5b52f9e6fea93ca70f8e3ad9c13f107"} Apr 22 17:35:12.040544 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:12.040512 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/konnectivity-agent-h2ghv" podStartSLOduration=3.428693584 podStartE2EDuration="21.040501602s" podCreationTimestamp="2026-04-22 17:34:51 +0000 UTC" firstStartedPulling="2026-04-22 17:34:53.086659597 +0000 UTC m=+1.842744150" lastFinishedPulling="2026-04-22 17:35:10.698467612 +0000 UTC m=+19.454552168" observedRunningTime="2026-04-22 17:35:12.040303029 +0000 UTC m=+20.796387614" watchObservedRunningTime="2026-04-22 17:35:12.040501602 +0000 UTC m=+20.796586175" Apr 22 17:35:12.069533 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:12.069491 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-dm5wr" podStartSLOduration=3.426497947 podStartE2EDuration="21.069477946s" podCreationTimestamp="2026-04-22 17:34:51 +0000 UTC" firstStartedPulling="2026-04-22 17:34:53.072844958 +0000 UTC m=+1.828929509" lastFinishedPulling="2026-04-22 17:35:10.715824951 +0000 UTC m=+19.471909508" observedRunningTime="2026-04-22 17:35:12.069160379 +0000 UTC m=+20.825244953" watchObservedRunningTime="2026-04-22 17:35:12.069477946 +0000 UTC m=+20.825562519" Apr 22 17:35:12.102087 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:12.102035 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-gxs6f" podStartSLOduration=3.481644116 podStartE2EDuration="21.102019512s" podCreationTimestamp="2026-04-22 17:34:51 +0000 UTC" firstStartedPulling="2026-04-22 17:34:53.093356154 +0000 UTC m=+1.849440706" lastFinishedPulling="2026-04-22 17:35:10.713731539 +0000 UTC m=+19.469816102" observedRunningTime="2026-04-22 17:35:12.101949366 +0000 UTC m=+20.858033940" watchObservedRunningTime="2026-04-22 17:35:12.102019512 +0000 UTC m=+20.858104084" Apr 22 17:35:12.102387 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:12.102353 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-ljqnc" podStartSLOduration=3.431878448 
podStartE2EDuration="21.102345074s" podCreationTimestamp="2026-04-22 17:34:51 +0000 UTC" firstStartedPulling="2026-04-22 17:34:53.045885666 +0000 UTC m=+1.801970222" lastFinishedPulling="2026-04-22 17:35:10.716352297 +0000 UTC m=+19.472436848" observedRunningTime="2026-04-22 17:35:12.086345997 +0000 UTC m=+20.842430570" watchObservedRunningTime="2026-04-22 17:35:12.102345074 +0000 UTC m=+20.858429649" Apr 22 17:35:12.123000 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:12.122961 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-d4jd9" podStartSLOduration=2.475369393 podStartE2EDuration="20.122949628s" podCreationTimestamp="2026-04-22 17:34:52 +0000 UTC" firstStartedPulling="2026-04-22 17:34:53.099720487 +0000 UTC m=+1.855805041" lastFinishedPulling="2026-04-22 17:35:10.74730071 +0000 UTC m=+19.503385276" observedRunningTime="2026-04-22 17:35:12.122780824 +0000 UTC m=+20.878865396" watchObservedRunningTime="2026-04-22 17:35:12.122949628 +0000 UTC m=+20.879034201" Apr 22 17:35:12.810054 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:12.810025 2539 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 17:35:12.841801 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:12.841769 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bxsz" Apr 22 17:35:12.841970 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:12.841888 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8bxsz" podUID="a15342ff-f78f-4d33-aed1-0e9c86dbdb15" Apr 22 17:35:13.040733 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:13.040649 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-165.ec2.internal" event={"ID":"40399284665a2832f575e85e3fe44faa","Type":"ContainerStarted","Data":"3e4dcd9d6dee8ec446840ab4dd7373ed3ef8203750ceee51778dce4750ea5a56"} Apr 22 17:35:13.042418 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:13.042379 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ssfr9" event={"ID":"6e284c23-9cc3-44a6-ad18-85df6586fcd5","Type":"ContainerStarted","Data":"6468e1981581360c6833796715084db1f45865b82796c5d20a2a8458f3dfdd03"} Apr 22 17:35:13.043882 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:13.043845 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-b6m8h" event={"ID":"42d2a92c-09c8-4e3a-aa4a-19642356de88","Type":"ContainerStarted","Data":"412f800e27750508f4ebb53e8a7c1a7794ff6dc328d9b887e05f3c7f8bd97d18"} Apr 22 17:35:13.058140 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:13.058080 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-165.ec2.internal" podStartSLOduration=21.058062687 podStartE2EDuration="21.058062687s" podCreationTimestamp="2026-04-22 17:34:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:35:13.057629356 +0000 UTC m=+21.813713930" watchObservedRunningTime="2026-04-22 17:35:13.058062687 +0000 UTC m=+21.814147261" Apr 22 17:35:13.080293 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:13.080242 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-operator/iptables-alerter-b6m8h" podStartSLOduration=4.48349556 podStartE2EDuration="22.080225967s" podCreationTimestamp="2026-04-22 17:34:51 +0000 UTC" firstStartedPulling="2026-04-22 17:34:53.08181566 +0000 UTC m=+1.837900211" lastFinishedPulling="2026-04-22 17:35:10.678546051 +0000 UTC m=+19.434630618" observedRunningTime="2026-04-22 17:35:13.079841634 +0000 UTC m=+21.835926208" watchObservedRunningTime="2026-04-22 17:35:13.080225967 +0000 UTC m=+21.836310541" Apr 22 17:35:13.789600 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:13.789496 2539 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T17:35:12.81004953Z","UUID":"fe055c00-3703-4430-99d8-c2126f7f907a","Handler":null,"Name":"","Endpoint":""} Apr 22 17:35:13.791893 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:13.791870 2539 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 17:35:13.792024 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:13.791934 2539 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 17:35:13.841980 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:13.841957 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mnmvf" Apr 22 17:35:13.842108 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:13.841965 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rghzd" Apr 22 17:35:13.842108 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:13.842051 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mnmvf" podUID="e5741809-3555-46d2-99ca-58fd55e0acdc" Apr 22 17:35:13.842211 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:13.842134 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rghzd" podUID="4c704c4d-f399-4811-9b83-dc2a18d55074" Apr 22 17:35:14.049249 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:14.049167 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" event={"ID":"a69dead2-c622-4a13-a5b4-5367b68c10a8","Type":"ContainerStarted","Data":"5d5ef121a4eba65212d327fae9fddce43a1f9e38a7cac9d2cac01ed62c0ece7b"} Apr 22 17:35:14.050962 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:14.050917 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ssfr9" event={"ID":"6e284c23-9cc3-44a6-ad18-85df6586fcd5","Type":"ContainerStarted","Data":"47e0363f55fdca6ff07c88ce577fce16005429c2908f29aa98b193245d5aa262"} Apr 22 17:35:14.611715 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:14.611681 2539 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-h2ghv" Apr 22 17:35:14.612321 ip-10-0-132-165 
kubenswrapper[2539]: I0422 17:35:14.612296 2539 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-h2ghv" Apr 22 17:35:14.640597 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:14.640554 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ssfr9" podStartSLOduration=2.062793881 podStartE2EDuration="22.640539862s" podCreationTimestamp="2026-04-22 17:34:52 +0000 UTC" firstStartedPulling="2026-04-22 17:34:53.107273388 +0000 UTC m=+1.863357940" lastFinishedPulling="2026-04-22 17:35:13.68501937 +0000 UTC m=+22.441103921" observedRunningTime="2026-04-22 17:35:14.079170467 +0000 UTC m=+22.835255042" watchObservedRunningTime="2026-04-22 17:35:14.640539862 +0000 UTC m=+23.396624436" Apr 22 17:35:14.842063 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:14.842037 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bxsz" Apr 22 17:35:14.842216 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:14.842154 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8bxsz" podUID="a15342ff-f78f-4d33-aed1-0e9c86dbdb15" Apr 22 17:35:15.056172 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:15.054061 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-h2ghv" Apr 22 17:35:15.057017 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:15.056990 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-h2ghv" Apr 22 17:35:15.841785 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:15.841756 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rghzd" Apr 22 17:35:15.841967 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:15.841864 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rghzd" podUID="4c704c4d-f399-4811-9b83-dc2a18d55074" Apr 22 17:35:15.841967 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:15.841924 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mnmvf" Apr 22 17:35:15.842082 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:15.842028 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-mnmvf" podUID="e5741809-3555-46d2-99ca-58fd55e0acdc" Apr 22 17:35:16.057847 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:16.057527 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" event={"ID":"a69dead2-c622-4a13-a5b4-5367b68c10a8","Type":"ContainerStarted","Data":"d9d7ce5990b5508448c9f4c5ea90457e4f9cd571dbd6ce54b8b300aa40c6896b"} Apr 22 17:35:16.089347 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:16.089308 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" podStartSLOduration=7.070124696 podStartE2EDuration="25.089293111s" podCreationTimestamp="2026-04-22 17:34:51 +0000 UTC" firstStartedPulling="2026-04-22 17:34:53.032423647 +0000 UTC m=+1.788508201" lastFinishedPulling="2026-04-22 17:35:11.051592066 +0000 UTC m=+19.807676616" observedRunningTime="2026-04-22 17:35:16.089069324 +0000 UTC m=+24.845153898" watchObservedRunningTime="2026-04-22 17:35:16.089293111 +0000 UTC m=+24.845377683" Apr 22 17:35:16.841498 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:16.841433 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bxsz" Apr 22 17:35:16.841625 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:16.841556 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8bxsz" podUID="a15342ff-f78f-4d33-aed1-0e9c86dbdb15" Apr 22 17:35:17.060872 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:17.060842 2539 generic.go:358] "Generic (PLEG): container finished" podID="24bafee9-4454-456f-b2aa-04131f945624" containerID="76c325673d5d34d8fcf442df70b9c2a17aaf6d20ca9a642e531af114a1dace3f" exitCode=0 Apr 22 17:35:17.061596 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:17.060933 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6gk5m" event={"ID":"24bafee9-4454-456f-b2aa-04131f945624","Type":"ContainerDied","Data":"76c325673d5d34d8fcf442df70b9c2a17aaf6d20ca9a642e531af114a1dace3f"} Apr 22 17:35:17.061596 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:17.061330 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" Apr 22 17:35:17.061596 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:17.061356 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" Apr 22 17:35:17.061596 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:17.061369 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" Apr 22 17:35:17.076045 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:17.076024 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" Apr 22 17:35:17.076144 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:17.076103 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dtf75" Apr 22 17:35:17.726484 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:17.726274 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/e5741809-3555-46d2-99ca-58fd55e0acdc-original-pull-secret\") pod \"global-pull-secret-syncer-mnmvf\" (UID: \"e5741809-3555-46d2-99ca-58fd55e0acdc\") " pod="kube-system/global-pull-secret-syncer-mnmvf" Apr 22 17:35:17.726655 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:17.726637 2539 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 17:35:17.726724 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:17.726709 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5741809-3555-46d2-99ca-58fd55e0acdc-original-pull-secret podName:e5741809-3555-46d2-99ca-58fd55e0acdc nodeName:}" failed. No retries permitted until 2026-04-22 17:35:33.726686892 +0000 UTC m=+42.482771457 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e5741809-3555-46d2-99ca-58fd55e0acdc-original-pull-secret") pod "global-pull-secret-syncer-mnmvf" (UID: "e5741809-3555-46d2-99ca-58fd55e0acdc") : object "kube-system"/"original-pull-secret" not registered Apr 22 17:35:17.842125 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:17.842095 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mnmvf" Apr 22 17:35:17.842223 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:17.842106 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rghzd" Apr 22 17:35:17.842223 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:17.842202 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-mnmvf" podUID="e5741809-3555-46d2-99ca-58fd55e0acdc" Apr 22 17:35:17.842345 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:17.842319 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rghzd" podUID="4c704c4d-f399-4811-9b83-dc2a18d55074" Apr 22 17:35:17.931308 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:17.931282 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rghzd"] Apr 22 17:35:17.934473 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:17.934451 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-mnmvf"] Apr 22 17:35:17.938859 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:17.938838 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8bxsz"] Apr 22 17:35:17.938985 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:17.938946 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bxsz" Apr 22 17:35:17.939041 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:17.939024 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8bxsz" podUID="a15342ff-f78f-4d33-aed1-0e9c86dbdb15" Apr 22 17:35:18.064177 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:18.064084 2539 generic.go:358] "Generic (PLEG): container finished" podID="24bafee9-4454-456f-b2aa-04131f945624" containerID="f5831190561cebff63cffd5b61d4e3a3a6d96f6d0390531a168d5300e4de4651" exitCode=0 Apr 22 17:35:18.064177 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:18.064175 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mnmvf" Apr 22 17:35:18.064553 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:18.064201 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6gk5m" event={"ID":"24bafee9-4454-456f-b2aa-04131f945624","Type":"ContainerDied","Data":"f5831190561cebff63cffd5b61d4e3a3a6d96f6d0390531a168d5300e4de4651"} Apr 22 17:35:18.064553 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:18.064245 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rghzd" Apr 22 17:35:18.064553 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:18.064407 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rghzd" podUID="4c704c4d-f399-4811-9b83-dc2a18d55074" Apr 22 17:35:18.064747 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:18.064550 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mnmvf" podUID="e5741809-3555-46d2-99ca-58fd55e0acdc" Apr 22 17:35:19.842168 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:19.842140 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mnmvf" Apr 22 17:35:19.842559 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:19.842139 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bxsz" Apr 22 17:35:19.842559 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:19.842263 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mnmvf" podUID="e5741809-3555-46d2-99ca-58fd55e0acdc" Apr 22 17:35:19.842559 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:19.842140 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rghzd" Apr 22 17:35:19.842559 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:19.842317 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8bxsz" podUID="a15342ff-f78f-4d33-aed1-0e9c86dbdb15" Apr 22 17:35:19.842559 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:19.842403 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rghzd" podUID="4c704c4d-f399-4811-9b83-dc2a18d55074" Apr 22 17:35:20.070790 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:20.070763 2539 generic.go:358] "Generic (PLEG): container finished" podID="24bafee9-4454-456f-b2aa-04131f945624" containerID="1943ef9fa8c6788d02c9aa037c67d6b4568b6fecf02a7fa73da417dd24555126" exitCode=0 Apr 22 17:35:20.070887 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:20.070823 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6gk5m" event={"ID":"24bafee9-4454-456f-b2aa-04131f945624","Type":"ContainerDied","Data":"1943ef9fa8c6788d02c9aa037c67d6b4568b6fecf02a7fa73da417dd24555126"} Apr 22 17:35:21.842162 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:21.842084 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mnmvf" Apr 22 17:35:21.842570 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:21.842185 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-mnmvf" podUID="e5741809-3555-46d2-99ca-58fd55e0acdc" Apr 22 17:35:21.842570 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:21.842537 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bxsz" Apr 22 17:35:21.842688 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:21.842621 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8bxsz" podUID="a15342ff-f78f-4d33-aed1-0e9c86dbdb15" Apr 22 17:35:21.842688 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:21.842658 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rghzd" Apr 22 17:35:21.842784 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:21.842719 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rghzd" podUID="4c704c4d-f399-4811-9b83-dc2a18d55074" Apr 22 17:35:23.525653 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.525621 2539 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-165.ec2.internal" event="NodeReady" Apr 22 17:35:23.526016 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.525762 2539 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 17:35:23.573226 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.573196 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-9z52f"] Apr 22 17:35:23.603993 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.603957 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-s6r9h"] Apr 22 17:35:23.604137 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.604118 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9z52f" Apr 22 17:35:23.608078 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.608049 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 17:35:23.608191 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.608095 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 17:35:23.608191 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.608128 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-m6rcd\"" Apr 22 17:35:23.619377 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.619356 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9z52f"] Apr 22 17:35:23.619377 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.619379 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-canary/ingress-canary-s6r9h"] Apr 22 17:35:23.619528 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.619477 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-s6r9h" Apr 22 17:35:23.621803 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.621759 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 17:35:23.621803 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.621760 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 17:35:23.621803 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.621797 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dt854\"" Apr 22 17:35:23.622019 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.621775 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 17:35:23.773842 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.773811 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34451fd1-aa5f-4cd7-b1e2-0ef6344977dc-config-volume\") pod \"dns-default-9z52f\" (UID: \"34451fd1-aa5f-4cd7-b1e2-0ef6344977dc\") " pod="openshift-dns/dns-default-9z52f" Apr 22 17:35:23.773985 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.773873 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/34451fd1-aa5f-4cd7-b1e2-0ef6344977dc-tmp-dir\") pod \"dns-default-9z52f\" (UID: \"34451fd1-aa5f-4cd7-b1e2-0ef6344977dc\") " pod="openshift-dns/dns-default-9z52f" Apr 22 17:35:23.773985 ip-10-0-132-165 kubenswrapper[2539]: 
I0422 17:35:23.773958 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/04320b3d-bb85-4a09-ab7d-5fdc01962b73-cert\") pod \"ingress-canary-s6r9h\" (UID: \"04320b3d-bb85-4a09-ab7d-5fdc01962b73\") " pod="openshift-ingress-canary/ingress-canary-s6r9h" Apr 22 17:35:23.774094 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.773994 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34451fd1-aa5f-4cd7-b1e2-0ef6344977dc-metrics-tls\") pod \"dns-default-9z52f\" (UID: \"34451fd1-aa5f-4cd7-b1e2-0ef6344977dc\") " pod="openshift-dns/dns-default-9z52f" Apr 22 17:35:23.774094 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.774035 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9lvd\" (UniqueName: \"kubernetes.io/projected/34451fd1-aa5f-4cd7-b1e2-0ef6344977dc-kube-api-access-r9lvd\") pod \"dns-default-9z52f\" (UID: \"34451fd1-aa5f-4cd7-b1e2-0ef6344977dc\") " pod="openshift-dns/dns-default-9z52f" Apr 22 17:35:23.774094 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.774063 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95msz\" (UniqueName: \"kubernetes.io/projected/04320b3d-bb85-4a09-ab7d-5fdc01962b73-kube-api-access-95msz\") pod \"ingress-canary-s6r9h\" (UID: \"04320b3d-bb85-4a09-ab7d-5fdc01962b73\") " pod="openshift-ingress-canary/ingress-canary-s6r9h" Apr 22 17:35:23.841440 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.841415 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bxsz" Apr 22 17:35:23.841563 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.841415 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rghzd" Apr 22 17:35:23.841636 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.841415 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mnmvf" Apr 22 17:35:23.844208 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.844147 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 17:35:23.844331 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.844278 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qs9jr\"" Apr 22 17:35:23.844517 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.844502 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 17:35:23.844603 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.844588 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-z4sh7\"" Apr 22 17:35:23.844802 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.844784 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 17:35:23.844864 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.844808 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 17:35:23.875097 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.875077 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/34451fd1-aa5f-4cd7-b1e2-0ef6344977dc-tmp-dir\") pod \"dns-default-9z52f\" (UID: \"34451fd1-aa5f-4cd7-b1e2-0ef6344977dc\") " pod="openshift-dns/dns-default-9z52f" Apr 22 17:35:23.875192 
ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.875125 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/04320b3d-bb85-4a09-ab7d-5fdc01962b73-cert\") pod \"ingress-canary-s6r9h\" (UID: \"04320b3d-bb85-4a09-ab7d-5fdc01962b73\") " pod="openshift-ingress-canary/ingress-canary-s6r9h"
Apr 22 17:35:23.875192 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.875154 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34451fd1-aa5f-4cd7-b1e2-0ef6344977dc-metrics-tls\") pod \"dns-default-9z52f\" (UID: \"34451fd1-aa5f-4cd7-b1e2-0ef6344977dc\") " pod="openshift-dns/dns-default-9z52f"
Apr 22 17:35:23.875265 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.875196 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9lvd\" (UniqueName: \"kubernetes.io/projected/34451fd1-aa5f-4cd7-b1e2-0ef6344977dc-kube-api-access-r9lvd\") pod \"dns-default-9z52f\" (UID: \"34451fd1-aa5f-4cd7-b1e2-0ef6344977dc\") " pod="openshift-dns/dns-default-9z52f"
Apr 22 17:35:23.875265 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.875222 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-95msz\" (UniqueName: \"kubernetes.io/projected/04320b3d-bb85-4a09-ab7d-5fdc01962b73-kube-api-access-95msz\") pod \"ingress-canary-s6r9h\" (UID: \"04320b3d-bb85-4a09-ab7d-5fdc01962b73\") " pod="openshift-ingress-canary/ingress-canary-s6r9h"
Apr 22 17:35:23.875265 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.875260 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34451fd1-aa5f-4cd7-b1e2-0ef6344977dc-config-volume\") pod \"dns-default-9z52f\" (UID: \"34451fd1-aa5f-4cd7-b1e2-0ef6344977dc\") " pod="openshift-dns/dns-default-9z52f"
Apr 22 17:35:23.875401 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:23.875291 2539 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:35:23.875401 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:23.875306 2539 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:35:23.875401 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:23.875358 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04320b3d-bb85-4a09-ab7d-5fdc01962b73-cert podName:04320b3d-bb85-4a09-ab7d-5fdc01962b73 nodeName:}" failed. No retries permitted until 2026-04-22 17:35:24.375338275 +0000 UTC m=+33.131422860 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/04320b3d-bb85-4a09-ab7d-5fdc01962b73-cert") pod "ingress-canary-s6r9h" (UID: "04320b3d-bb85-4a09-ab7d-5fdc01962b73") : secret "canary-serving-cert" not found
Apr 22 17:35:23.875401 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:23.875379 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34451fd1-aa5f-4cd7-b1e2-0ef6344977dc-metrics-tls podName:34451fd1-aa5f-4cd7-b1e2-0ef6344977dc nodeName:}" failed. No retries permitted until 2026-04-22 17:35:24.375368971 +0000 UTC m=+33.131453522 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/34451fd1-aa5f-4cd7-b1e2-0ef6344977dc-metrics-tls") pod "dns-default-9z52f" (UID: "34451fd1-aa5f-4cd7-b1e2-0ef6344977dc") : secret "dns-default-metrics-tls" not found
Apr 22 17:35:23.875622 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.875400 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/34451fd1-aa5f-4cd7-b1e2-0ef6344977dc-tmp-dir\") pod \"dns-default-9z52f\" (UID: \"34451fd1-aa5f-4cd7-b1e2-0ef6344977dc\") " pod="openshift-dns/dns-default-9z52f"
Apr 22 17:35:23.875751 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.875733 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34451fd1-aa5f-4cd7-b1e2-0ef6344977dc-config-volume\") pod \"dns-default-9z52f\" (UID: \"34451fd1-aa5f-4cd7-b1e2-0ef6344977dc\") " pod="openshift-dns/dns-default-9z52f"
Apr 22 17:35:23.887747 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.887723 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9lvd\" (UniqueName: \"kubernetes.io/projected/34451fd1-aa5f-4cd7-b1e2-0ef6344977dc-kube-api-access-r9lvd\") pod \"dns-default-9z52f\" (UID: \"34451fd1-aa5f-4cd7-b1e2-0ef6344977dc\") " pod="openshift-dns/dns-default-9z52f"
Apr 22 17:35:23.887869 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:23.887781 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-95msz\" (UniqueName: \"kubernetes.io/projected/04320b3d-bb85-4a09-ab7d-5fdc01962b73-kube-api-access-95msz\") pod \"ingress-canary-s6r9h\" (UID: \"04320b3d-bb85-4a09-ab7d-5fdc01962b73\") " pod="openshift-ingress-canary/ingress-canary-s6r9h"
Apr 22 17:35:24.379453 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:24.379410 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/04320b3d-bb85-4a09-ab7d-5fdc01962b73-cert\") pod \"ingress-canary-s6r9h\" (UID: \"04320b3d-bb85-4a09-ab7d-5fdc01962b73\") " pod="openshift-ingress-canary/ingress-canary-s6r9h"
Apr 22 17:35:24.379453 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:24.379459 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34451fd1-aa5f-4cd7-b1e2-0ef6344977dc-metrics-tls\") pod \"dns-default-9z52f\" (UID: \"34451fd1-aa5f-4cd7-b1e2-0ef6344977dc\") " pod="openshift-dns/dns-default-9z52f"
Apr 22 17:35:24.379710 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:24.379561 2539 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:35:24.379710 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:24.379640 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34451fd1-aa5f-4cd7-b1e2-0ef6344977dc-metrics-tls podName:34451fd1-aa5f-4cd7-b1e2-0ef6344977dc nodeName:}" failed. No retries permitted until 2026-04-22 17:35:25.37962186 +0000 UTC m=+34.135706426 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/34451fd1-aa5f-4cd7-b1e2-0ef6344977dc-metrics-tls") pod "dns-default-9z52f" (UID: "34451fd1-aa5f-4cd7-b1e2-0ef6344977dc") : secret "dns-default-metrics-tls" not found
Apr 22 17:35:24.379710 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:24.379693 2539 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:35:24.379858 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:24.379755 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04320b3d-bb85-4a09-ab7d-5fdc01962b73-cert podName:04320b3d-bb85-4a09-ab7d-5fdc01962b73 nodeName:}" failed. No retries permitted until 2026-04-22 17:35:25.379739113 +0000 UTC m=+34.135823669 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/04320b3d-bb85-4a09-ab7d-5fdc01962b73-cert") pod "ingress-canary-s6r9h" (UID: "04320b3d-bb85-4a09-ab7d-5fdc01962b73") : secret "canary-serving-cert" not found
Apr 22 17:35:24.480400 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:24.480367 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a15342ff-f78f-4d33-aed1-0e9c86dbdb15-metrics-certs\") pod \"network-metrics-daemon-8bxsz\" (UID: \"a15342ff-f78f-4d33-aed1-0e9c86dbdb15\") " pod="openshift-multus/network-metrics-daemon-8bxsz"
Apr 22 17:35:24.480574 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:24.480515 2539 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 17:35:24.480631 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:24.480582 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a15342ff-f78f-4d33-aed1-0e9c86dbdb15-metrics-certs podName:a15342ff-f78f-4d33-aed1-0e9c86dbdb15 nodeName:}" failed. No retries permitted until 2026-04-22 17:35:56.480562549 +0000 UTC m=+65.236647101 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a15342ff-f78f-4d33-aed1-0e9c86dbdb15-metrics-certs") pod "network-metrics-daemon-8bxsz" (UID: "a15342ff-f78f-4d33-aed1-0e9c86dbdb15") : secret "metrics-daemon-secret" not found
Apr 22 17:35:24.581580 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:24.581548 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bnnd\" (UniqueName: \"kubernetes.io/projected/4c704c4d-f399-4811-9b83-dc2a18d55074-kube-api-access-2bnnd\") pod \"network-check-target-rghzd\" (UID: \"4c704c4d-f399-4811-9b83-dc2a18d55074\") " pod="openshift-network-diagnostics/network-check-target-rghzd"
Apr 22 17:35:24.584379 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:24.584354 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bnnd\" (UniqueName: \"kubernetes.io/projected/4c704c4d-f399-4811-9b83-dc2a18d55074-kube-api-access-2bnnd\") pod \"network-check-target-rghzd\" (UID: \"4c704c4d-f399-4811-9b83-dc2a18d55074\") " pod="openshift-network-diagnostics/network-check-target-rghzd"
Apr 22 17:35:24.758725 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:24.758690 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rghzd"
Apr 22 17:35:25.388047 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:25.388013 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/04320b3d-bb85-4a09-ab7d-5fdc01962b73-cert\") pod \"ingress-canary-s6r9h\" (UID: \"04320b3d-bb85-4a09-ab7d-5fdc01962b73\") " pod="openshift-ingress-canary/ingress-canary-s6r9h"
Apr 22 17:35:25.388047 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:25.388048 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34451fd1-aa5f-4cd7-b1e2-0ef6344977dc-metrics-tls\") pod \"dns-default-9z52f\" (UID: \"34451fd1-aa5f-4cd7-b1e2-0ef6344977dc\") " pod="openshift-dns/dns-default-9z52f"
Apr 22 17:35:25.388277 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:25.388146 2539 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:35:25.388277 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:25.388147 2539 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:35:25.388277 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:25.388200 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34451fd1-aa5f-4cd7-b1e2-0ef6344977dc-metrics-tls podName:34451fd1-aa5f-4cd7-b1e2-0ef6344977dc nodeName:}" failed. No retries permitted until 2026-04-22 17:35:27.388182821 +0000 UTC m=+36.144267374 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/34451fd1-aa5f-4cd7-b1e2-0ef6344977dc-metrics-tls") pod "dns-default-9z52f" (UID: "34451fd1-aa5f-4cd7-b1e2-0ef6344977dc") : secret "dns-default-metrics-tls" not found
Apr 22 17:35:25.388277 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:25.388213 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04320b3d-bb85-4a09-ab7d-5fdc01962b73-cert podName:04320b3d-bb85-4a09-ab7d-5fdc01962b73 nodeName:}" failed. No retries permitted until 2026-04-22 17:35:27.388207552 +0000 UTC m=+36.144292103 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/04320b3d-bb85-4a09-ab7d-5fdc01962b73-cert") pod "ingress-canary-s6r9h" (UID: "04320b3d-bb85-4a09-ab7d-5fdc01962b73") : secret "canary-serving-cert" not found
Apr 22 17:35:25.575794 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:25.573749 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rghzd"]
Apr 22 17:35:25.650325 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:35:25.650265 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c704c4d_f399_4811_9b83_dc2a18d55074.slice/crio-2c8651165f4ba16f3dcf1526355b4d0ec3e299217ca1d293fb2259a6af5325b4 WatchSource:0}: Error finding container 2c8651165f4ba16f3dcf1526355b4d0ec3e299217ca1d293fb2259a6af5325b4: Status 404 returned error can't find the container with id 2c8651165f4ba16f3dcf1526355b4d0ec3e299217ca1d293fb2259a6af5325b4
Apr 22 17:35:26.085371 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:26.085190 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rghzd" event={"ID":"4c704c4d-f399-4811-9b83-dc2a18d55074","Type":"ContainerStarted","Data":"2c8651165f4ba16f3dcf1526355b4d0ec3e299217ca1d293fb2259a6af5325b4"}
Apr 22 17:35:26.087468 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:26.087441 2539 generic.go:358] "Generic (PLEG): container finished" podID="24bafee9-4454-456f-b2aa-04131f945624" containerID="6bba963371031342874c5af2a6ba46360de34aef691411855174b32c3904f676" exitCode=0
Apr 22 17:35:26.087563 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:26.087495 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6gk5m" event={"ID":"24bafee9-4454-456f-b2aa-04131f945624","Type":"ContainerDied","Data":"6bba963371031342874c5af2a6ba46360de34aef691411855174b32c3904f676"}
Apr 22 17:35:27.092872 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:27.092837 2539 generic.go:358] "Generic (PLEG): container finished" podID="24bafee9-4454-456f-b2aa-04131f945624" containerID="a1e4aa8c63651a238e49572909ce9ba6e28db50704e59d4f3266a722c3f60a54" exitCode=0
Apr 22 17:35:27.093297 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:27.092929 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6gk5m" event={"ID":"24bafee9-4454-456f-b2aa-04131f945624","Type":"ContainerDied","Data":"a1e4aa8c63651a238e49572909ce9ba6e28db50704e59d4f3266a722c3f60a54"}
Apr 22 17:35:27.404948 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:27.404881 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/04320b3d-bb85-4a09-ab7d-5fdc01962b73-cert\") pod \"ingress-canary-s6r9h\" (UID: \"04320b3d-bb85-4a09-ab7d-5fdc01962b73\") " pod="openshift-ingress-canary/ingress-canary-s6r9h"
Apr 22 17:35:27.405074 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:27.405027 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34451fd1-aa5f-4cd7-b1e2-0ef6344977dc-metrics-tls\") pod \"dns-default-9z52f\" (UID: \"34451fd1-aa5f-4cd7-b1e2-0ef6344977dc\") " pod="openshift-dns/dns-default-9z52f"
Apr 22 17:35:27.405074 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:27.405027 2539 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:35:27.405190 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:27.405083 2539 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:35:27.405190 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:27.405124 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04320b3d-bb85-4a09-ab7d-5fdc01962b73-cert podName:04320b3d-bb85-4a09-ab7d-5fdc01962b73 nodeName:}" failed. No retries permitted until 2026-04-22 17:35:31.405106198 +0000 UTC m=+40.161190749 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/04320b3d-bb85-4a09-ab7d-5fdc01962b73-cert") pod "ingress-canary-s6r9h" (UID: "04320b3d-bb85-4a09-ab7d-5fdc01962b73") : secret "canary-serving-cert" not found
Apr 22 17:35:27.405190 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:27.405142 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34451fd1-aa5f-4cd7-b1e2-0ef6344977dc-metrics-tls podName:34451fd1-aa5f-4cd7-b1e2-0ef6344977dc nodeName:}" failed. No retries permitted until 2026-04-22 17:35:31.405132835 +0000 UTC m=+40.161217389 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/34451fd1-aa5f-4cd7-b1e2-0ef6344977dc-metrics-tls") pod "dns-default-9z52f" (UID: "34451fd1-aa5f-4cd7-b1e2-0ef6344977dc") : secret "dns-default-metrics-tls" not found
Apr 22 17:35:28.098361 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:28.098324 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6gk5m" event={"ID":"24bafee9-4454-456f-b2aa-04131f945624","Type":"ContainerStarted","Data":"0d137393e3940d478daf59c7a7d551d84e821c6c89740c3c542805bee1a2e694"}
Apr 22 17:35:28.140620 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:28.140555 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6gk5m" podStartSLOduration=3.565677159 podStartE2EDuration="36.140539282s" podCreationTimestamp="2026-04-22 17:34:52 +0000 UTC" firstStartedPulling="2026-04-22 17:34:53.110586339 +0000 UTC m=+1.866670890" lastFinishedPulling="2026-04-22 17:35:25.685448459 +0000 UTC m=+34.441533013" observedRunningTime="2026-04-22 17:35:28.140512534 +0000 UTC m=+36.896597108" watchObservedRunningTime="2026-04-22 17:35:28.140539282 +0000 UTC m=+36.896623857"
Apr 22 17:35:30.102859 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:30.102821 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rghzd" event={"ID":"4c704c4d-f399-4811-9b83-dc2a18d55074","Type":"ContainerStarted","Data":"7f207cd03ca7e5ab92ed4848a69fcfa5cfc936dcdb037e89c63dbe30cb92cd7f"}
Apr 22 17:35:30.103244 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:30.102952 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-rghzd"
Apr 22 17:35:30.119896 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:30.119854 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-rghzd" podStartSLOduration=35.766941236 podStartE2EDuration="39.119841689s" podCreationTimestamp="2026-04-22 17:34:51 +0000 UTC" firstStartedPulling="2026-04-22 17:35:25.663877045 +0000 UTC m=+34.419961596" lastFinishedPulling="2026-04-22 17:35:29.016777482 +0000 UTC m=+37.772862049" observedRunningTime="2026-04-22 17:35:30.119301443 +0000 UTC m=+38.875386017" watchObservedRunningTime="2026-04-22 17:35:30.119841689 +0000 UTC m=+38.875926262"
Apr 22 17:35:31.433530 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:31.433490 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/04320b3d-bb85-4a09-ab7d-5fdc01962b73-cert\") pod \"ingress-canary-s6r9h\" (UID: \"04320b3d-bb85-4a09-ab7d-5fdc01962b73\") " pod="openshift-ingress-canary/ingress-canary-s6r9h"
Apr 22 17:35:31.433530 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:31.433530 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34451fd1-aa5f-4cd7-b1e2-0ef6344977dc-metrics-tls\") pod \"dns-default-9z52f\" (UID: \"34451fd1-aa5f-4cd7-b1e2-0ef6344977dc\") " pod="openshift-dns/dns-default-9z52f"
Apr 22 17:35:31.433954 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:31.433640 2539 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:35:31.433954 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:31.433702 2539 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:35:31.433954 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:31.433704 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04320b3d-bb85-4a09-ab7d-5fdc01962b73-cert podName:04320b3d-bb85-4a09-ab7d-5fdc01962b73 nodeName:}" failed. No retries permitted until 2026-04-22 17:35:39.433688551 +0000 UTC m=+48.189773101 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/04320b3d-bb85-4a09-ab7d-5fdc01962b73-cert") pod "ingress-canary-s6r9h" (UID: "04320b3d-bb85-4a09-ab7d-5fdc01962b73") : secret "canary-serving-cert" not found
Apr 22 17:35:31.433954 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:31.433756 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34451fd1-aa5f-4cd7-b1e2-0ef6344977dc-metrics-tls podName:34451fd1-aa5f-4cd7-b1e2-0ef6344977dc nodeName:}" failed. No retries permitted until 2026-04-22 17:35:39.43374079 +0000 UTC m=+48.189825345 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/34451fd1-aa5f-4cd7-b1e2-0ef6344977dc-metrics-tls") pod "dns-default-9z52f" (UID: "34451fd1-aa5f-4cd7-b1e2-0ef6344977dc") : secret "dns-default-metrics-tls" not found
Apr 22 17:35:33.749404 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:33.749364 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e5741809-3555-46d2-99ca-58fd55e0acdc-original-pull-secret\") pod \"global-pull-secret-syncer-mnmvf\" (UID: \"e5741809-3555-46d2-99ca-58fd55e0acdc\") " pod="kube-system/global-pull-secret-syncer-mnmvf"
Apr 22 17:35:33.752981 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:33.752962 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e5741809-3555-46d2-99ca-58fd55e0acdc-original-pull-secret\") pod \"global-pull-secret-syncer-mnmvf\" (UID: \"e5741809-3555-46d2-99ca-58fd55e0acdc\") " pod="kube-system/global-pull-secret-syncer-mnmvf"
Apr 22 17:35:33.764977 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:33.764952 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mnmvf"
Apr 22 17:35:33.875248 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:33.875209 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-mnmvf"]
Apr 22 17:35:33.879429 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:35:33.879405 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5741809_3555_46d2_99ca_58fd55e0acdc.slice/crio-1ef7c49a0bfc899011c41d53c596c7984b3d217074532292a4e0ad53a4abecb1 WatchSource:0}: Error finding container 1ef7c49a0bfc899011c41d53c596c7984b3d217074532292a4e0ad53a4abecb1: Status 404 returned error can't find the container with id 1ef7c49a0bfc899011c41d53c596c7984b3d217074532292a4e0ad53a4abecb1
Apr 22 17:35:34.111458 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:34.111372 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-mnmvf" event={"ID":"e5741809-3555-46d2-99ca-58fd55e0acdc","Type":"ContainerStarted","Data":"1ef7c49a0bfc899011c41d53c596c7984b3d217074532292a4e0ad53a4abecb1"}
Apr 22 17:35:39.123796 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:39.123756 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-mnmvf" event={"ID":"e5741809-3555-46d2-99ca-58fd55e0acdc","Type":"ContainerStarted","Data":"498da2182b53ff322de3e762c09efb23c444d65e25c71d2d81b52f35130ad788"}
Apr 22 17:35:39.492120 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:39.492080 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/04320b3d-bb85-4a09-ab7d-5fdc01962b73-cert\") pod \"ingress-canary-s6r9h\" (UID: \"04320b3d-bb85-4a09-ab7d-5fdc01962b73\") " pod="openshift-ingress-canary/ingress-canary-s6r9h"
Apr 22 17:35:39.492120 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:39.492126 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34451fd1-aa5f-4cd7-b1e2-0ef6344977dc-metrics-tls\") pod \"dns-default-9z52f\" (UID: \"34451fd1-aa5f-4cd7-b1e2-0ef6344977dc\") " pod="openshift-dns/dns-default-9z52f"
Apr 22 17:35:39.492324 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:39.492218 2539 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:35:39.492324 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:39.492230 2539 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:35:39.492324 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:39.492282 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34451fd1-aa5f-4cd7-b1e2-0ef6344977dc-metrics-tls podName:34451fd1-aa5f-4cd7-b1e2-0ef6344977dc nodeName:}" failed. No retries permitted until 2026-04-22 17:35:55.492266867 +0000 UTC m=+64.248351418 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/34451fd1-aa5f-4cd7-b1e2-0ef6344977dc-metrics-tls") pod "dns-default-9z52f" (UID: "34451fd1-aa5f-4cd7-b1e2-0ef6344977dc") : secret "dns-default-metrics-tls" not found
Apr 22 17:35:39.492324 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:39.492297 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04320b3d-bb85-4a09-ab7d-5fdc01962b73-cert podName:04320b3d-bb85-4a09-ab7d-5fdc01962b73 nodeName:}" failed. No retries permitted until 2026-04-22 17:35:55.492289795 +0000 UTC m=+64.248374346 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/04320b3d-bb85-4a09-ab7d-5fdc01962b73-cert") pod "ingress-canary-s6r9h" (UID: "04320b3d-bb85-4a09-ab7d-5fdc01962b73") : secret "canary-serving-cert" not found
Apr 22 17:35:41.069091 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:41.069037 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-mnmvf" podStartSLOduration=35.850398148 podStartE2EDuration="40.06902061s" podCreationTimestamp="2026-04-22 17:35:01 +0000 UTC" firstStartedPulling="2026-04-22 17:35:33.89251489 +0000 UTC m=+42.648599454" lastFinishedPulling="2026-04-22 17:35:38.111137352 +0000 UTC m=+46.867221916" observedRunningTime="2026-04-22 17:35:39.140880323 +0000 UTC m=+47.896964900" watchObservedRunningTime="2026-04-22 17:35:41.06902061 +0000 UTC m=+49.825105182"
Apr 22 17:35:41.069609 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:41.069593 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65557956df-xtp2r"]
Apr 22 17:35:41.093145 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:41.093112 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65557956df-xtp2r"]
Apr 22 17:35:41.093293 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:41.093183 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65557956df-xtp2r"
Apr 22 17:35:41.095457 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:41.095430 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 22 17:35:41.095575 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:41.095491 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-pccdn\""
Apr 22 17:35:41.096365 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:41.096333 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 22 17:35:41.096456 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:41.096401 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 22 17:35:41.096456 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:41.096405 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 22 17:35:41.204283 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:41.204242 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m4j8\" (UniqueName: \"kubernetes.io/projected/ed3533db-96b9-4be4-aada-e5abba0de7b6-kube-api-access-4m4j8\") pod \"managed-serviceaccount-addon-agent-65557956df-xtp2r\" (UID: \"ed3533db-96b9-4be4-aada-e5abba0de7b6\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65557956df-xtp2r"
Apr 22 17:35:41.204466 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:41.204381 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ed3533db-96b9-4be4-aada-e5abba0de7b6-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-65557956df-xtp2r\" (UID: \"ed3533db-96b9-4be4-aada-e5abba0de7b6\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65557956df-xtp2r"
Apr 22 17:35:41.305727 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:41.305683 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ed3533db-96b9-4be4-aada-e5abba0de7b6-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-65557956df-xtp2r\" (UID: \"ed3533db-96b9-4be4-aada-e5abba0de7b6\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65557956df-xtp2r"
Apr 22 17:35:41.305883 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:41.305742 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4m4j8\" (UniqueName: \"kubernetes.io/projected/ed3533db-96b9-4be4-aada-e5abba0de7b6-kube-api-access-4m4j8\") pod \"managed-serviceaccount-addon-agent-65557956df-xtp2r\" (UID: \"ed3533db-96b9-4be4-aada-e5abba0de7b6\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65557956df-xtp2r"
Apr 22 17:35:41.308239 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:41.308219 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ed3533db-96b9-4be4-aada-e5abba0de7b6-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-65557956df-xtp2r\" (UID: \"ed3533db-96b9-4be4-aada-e5abba0de7b6\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65557956df-xtp2r"
Apr 22 17:35:41.313286 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:41.313264 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m4j8\" (UniqueName: \"kubernetes.io/projected/ed3533db-96b9-4be4-aada-e5abba0de7b6-kube-api-access-4m4j8\") pod \"managed-serviceaccount-addon-agent-65557956df-xtp2r\" (UID: \"ed3533db-96b9-4be4-aada-e5abba0de7b6\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65557956df-xtp2r"
Apr 22 17:35:41.414750 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:41.414661 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65557956df-xtp2r"
Apr 22 17:35:41.528561 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:41.528531 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65557956df-xtp2r"]
Apr 22 17:35:41.532020 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:35:41.531991 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded3533db_96b9_4be4_aada_e5abba0de7b6.slice/crio-a970f8b01a3b9cd6334a64191c7c06ed9c22de2d29e4a6108918788b9d33e411 WatchSource:0}: Error finding container a970f8b01a3b9cd6334a64191c7c06ed9c22de2d29e4a6108918788b9d33e411: Status 404 returned error can't find the container with id a970f8b01a3b9cd6334a64191c7c06ed9c22de2d29e4a6108918788b9d33e411
Apr 22 17:35:42.129972 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:42.129937 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65557956df-xtp2r" event={"ID":"ed3533db-96b9-4be4-aada-e5abba0de7b6","Type":"ContainerStarted","Data":"a970f8b01a3b9cd6334a64191c7c06ed9c22de2d29e4a6108918788b9d33e411"}
Apr 22 17:35:45.137288 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:45.137252 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65557956df-xtp2r" event={"ID":"ed3533db-96b9-4be4-aada-e5abba0de7b6","Type":"ContainerStarted","Data":"10561562a24b32cc0c1759c65351d93c84b93ef85a3e7ea9fd5b27ae4de5c889"}
Apr 22 17:35:45.153549 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:45.153495 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65557956df-xtp2r" podStartSLOduration=1.481377994 podStartE2EDuration="4.153480462s" podCreationTimestamp="2026-04-22 17:35:41 +0000 UTC" firstStartedPulling="2026-04-22 17:35:41.533857771 +0000 UTC m=+50.289942321" lastFinishedPulling="2026-04-22 17:35:44.205960234 +0000 UTC m=+52.962044789" observedRunningTime="2026-04-22 17:35:45.151989641 +0000 UTC m=+53.908074211" watchObservedRunningTime="2026-04-22 17:35:45.153480462 +0000 UTC m=+53.909565034"
Apr 22 17:35:49.080921 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:49.080881 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dtf75"
Apr 22 17:35:55.497151 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:55.497109 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/04320b3d-bb85-4a09-ab7d-5fdc01962b73-cert\") pod \"ingress-canary-s6r9h\" (UID: \"04320b3d-bb85-4a09-ab7d-5fdc01962b73\") " pod="openshift-ingress-canary/ingress-canary-s6r9h"
Apr 22 17:35:55.497151 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:55.497154 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34451fd1-aa5f-4cd7-b1e2-0ef6344977dc-metrics-tls\") pod \"dns-default-9z52f\" (UID: \"34451fd1-aa5f-4cd7-b1e2-0ef6344977dc\") " pod="openshift-dns/dns-default-9z52f"
Apr 22 17:35:55.497580 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:55.497253 2539 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:35:55.497580 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:55.497254 2539 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:35:55.497580 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:55.497307 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34451fd1-aa5f-4cd7-b1e2-0ef6344977dc-metrics-tls podName:34451fd1-aa5f-4cd7-b1e2-0ef6344977dc nodeName:}" failed. No retries permitted until 2026-04-22 17:36:27.497292325 +0000 UTC m=+96.253376876 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/34451fd1-aa5f-4cd7-b1e2-0ef6344977dc-metrics-tls") pod "dns-default-9z52f" (UID: "34451fd1-aa5f-4cd7-b1e2-0ef6344977dc") : secret "dns-default-metrics-tls" not found
Apr 22 17:35:55.497580 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:55.497321 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04320b3d-bb85-4a09-ab7d-5fdc01962b73-cert podName:04320b3d-bb85-4a09-ab7d-5fdc01962b73 nodeName:}" failed. No retries permitted until 2026-04-22 17:36:27.497315356 +0000 UTC m=+96.253399907 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/04320b3d-bb85-4a09-ab7d-5fdc01962b73-cert") pod "ingress-canary-s6r9h" (UID: "04320b3d-bb85-4a09-ab7d-5fdc01962b73") : secret "canary-serving-cert" not found
Apr 22 17:35:56.505188 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:35:56.505146 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a15342ff-f78f-4d33-aed1-0e9c86dbdb15-metrics-certs\") pod \"network-metrics-daemon-8bxsz\" (UID: \"a15342ff-f78f-4d33-aed1-0e9c86dbdb15\") " pod="openshift-multus/network-metrics-daemon-8bxsz"
Apr 22 17:35:56.505574 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:56.505270 2539 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 17:35:56.505574 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:35:56.505321 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a15342ff-f78f-4d33-aed1-0e9c86dbdb15-metrics-certs podName:a15342ff-f78f-4d33-aed1-0e9c86dbdb15 nodeName:}" failed. No retries permitted until 2026-04-22 17:37:00.505307664 +0000 UTC m=+129.261392215 (durationBeforeRetry 1m4s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a15342ff-f78f-4d33-aed1-0e9c86dbdb15-metrics-certs") pod "network-metrics-daemon-8bxsz" (UID: "a15342ff-f78f-4d33-aed1-0e9c86dbdb15") : secret "metrics-daemon-secret" not found Apr 22 17:36:01.107432 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:36:01.107401 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-rghzd" Apr 22 17:36:27.514996 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:36:27.514960 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/04320b3d-bb85-4a09-ab7d-5fdc01962b73-cert\") pod \"ingress-canary-s6r9h\" (UID: \"04320b3d-bb85-4a09-ab7d-5fdc01962b73\") " pod="openshift-ingress-canary/ingress-canary-s6r9h" Apr 22 17:36:27.515377 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:36:27.515009 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34451fd1-aa5f-4cd7-b1e2-0ef6344977dc-metrics-tls\") pod \"dns-default-9z52f\" (UID: \"34451fd1-aa5f-4cd7-b1e2-0ef6344977dc\") " pod="openshift-dns/dns-default-9z52f" Apr 22 17:36:27.515377 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:36:27.515114 2539 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:36:27.515377 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:36:27.515147 2539 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:36:27.515377 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:36:27.515187 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04320b3d-bb85-4a09-ab7d-5fdc01962b73-cert podName:04320b3d-bb85-4a09-ab7d-5fdc01962b73 nodeName:}" failed. 
No retries permitted until 2026-04-22 17:37:31.515170426 +0000 UTC m=+160.271254981 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/04320b3d-bb85-4a09-ab7d-5fdc01962b73-cert") pod "ingress-canary-s6r9h" (UID: "04320b3d-bb85-4a09-ab7d-5fdc01962b73") : secret "canary-serving-cert" not found Apr 22 17:36:27.515377 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:36:27.515225 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34451fd1-aa5f-4cd7-b1e2-0ef6344977dc-metrics-tls podName:34451fd1-aa5f-4cd7-b1e2-0ef6344977dc nodeName:}" failed. No retries permitted until 2026-04-22 17:37:31.515219122 +0000 UTC m=+160.271303676 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/34451fd1-aa5f-4cd7-b1e2-0ef6344977dc-metrics-tls") pod "dns-default-9z52f" (UID: "34451fd1-aa5f-4cd7-b1e2-0ef6344977dc") : secret "dns-default-metrics-tls" not found Apr 22 17:37:00.537778 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:00.537740 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a15342ff-f78f-4d33-aed1-0e9c86dbdb15-metrics-certs\") pod \"network-metrics-daemon-8bxsz\" (UID: \"a15342ff-f78f-4d33-aed1-0e9c86dbdb15\") " pod="openshift-multus/network-metrics-daemon-8bxsz" Apr 22 17:37:00.538382 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:00.537951 2539 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 17:37:00.538382 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:00.538041 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a15342ff-f78f-4d33-aed1-0e9c86dbdb15-metrics-certs podName:a15342ff-f78f-4d33-aed1-0e9c86dbdb15 nodeName:}" failed. 
No retries permitted until 2026-04-22 17:39:02.538018544 +0000 UTC m=+251.294103097 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a15342ff-f78f-4d33-aed1-0e9c86dbdb15-metrics-certs") pod "network-metrics-daemon-8bxsz" (UID: "a15342ff-f78f-4d33-aed1-0e9c86dbdb15") : secret "metrics-daemon-secret" not found Apr 22 17:37:11.176036 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.175998 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-t45p4"] Apr 22 17:37:11.178948 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.178923 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-t45p4" Apr 22 17:37:11.179107 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.179071 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6c98b6bc6c-j57rh"] Apr 22 17:37:11.181716 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.181697 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh" Apr 22 17:37:11.182651 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.182623 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 17:37:11.182771 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.182654 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 17:37:11.182837 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.182776 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 22 17:37:11.182837 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.182824 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-4jgk4\"" Apr 22 17:37:11.183443 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.183427 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 22 17:37:11.183936 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.183916 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 17:37:11.184198 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.184181 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-xzdz4\"" Apr 22 17:37:11.184412 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.184397 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 17:37:11.184502 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.184487 2539 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 17:37:11.191769 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.191746 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-t45p4"] Apr 22 17:37:11.198921 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.198883 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 22 17:37:11.200387 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.200359 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 17:37:11.201218 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.201200 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6c98b6bc6c-j57rh"] Apr 22 17:37:11.277566 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.277543 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bk65x"] Apr 22 17:37:11.280205 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.280188 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bk65x" Apr 22 17:37:11.280636 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.280619 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gcd2q"] Apr 22 17:37:11.282714 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.282691 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 22 17:37:11.282933 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.282915 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 22 17:37:11.283016 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.283001 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gcd2q" Apr 22 17:37:11.283379 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.283362 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-xg5kw\"" Apr 22 17:37:11.284534 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.284514 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsgtm"] Apr 22 17:37:11.285288 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.285242 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 22 17:37:11.285400 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.285243 2539 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-rcvwz\"" Apr 22 17:37:11.285618 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.285595 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 22 17:37:11.285710 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.285672 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 22 17:37:11.285710 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.285703 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 22 17:37:11.287293 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.287277 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-lmv9z"] Apr 22 17:37:11.287430 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.287412 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsgtm" Apr 22 17:37:11.289649 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.289633 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 22 17:37:11.289887 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.289871 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 22 17:37:11.289974 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.289890 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-lmv9z" Apr 22 17:37:11.290590 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.290571 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bk65x"] Apr 22 17:37:11.292963 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.292945 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-x2ht5\"" Apr 22 17:37:11.293382 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.293126 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 22 17:37:11.293382 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.293224 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 22 17:37:11.293382 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.293257 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 22 17:37:11.294883 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.294861 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 22 17:37:11.294999 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.294885 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gcd2q"] Apr 22 17:37:11.295061 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.295009 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 22 17:37:11.295156 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.295139 2539 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-dgthp\"" Apr 22 17:37:11.299857 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.299838 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 22 17:37:11.303268 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.303247 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-lmv9z"] Apr 22 17:37:11.305789 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.305768 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsgtm"] Apr 22 17:37:11.308388 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.308365 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-trusted-ca\") pod \"image-registry-6c98b6bc6c-j57rh\" (UID: \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\") " pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh" Apr 22 17:37:11.308485 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.308399 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/65f4e6af-46be-4594-b3fb-54dd4fac761b-tmp\") pod \"insights-operator-585dfdc468-t45p4\" (UID: \"65f4e6af-46be-4594-b3fb-54dd4fac761b\") " pod="openshift-insights/insights-operator-585dfdc468-t45p4" Apr 22 17:37:11.308546 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.308502 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65f4e6af-46be-4594-b3fb-54dd4fac761b-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-t45p4\" (UID: 
\"65f4e6af-46be-4594-b3fb-54dd4fac761b\") " pod="openshift-insights/insights-operator-585dfdc468-t45p4" Apr 22 17:37:11.308546 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.308527 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-installation-pull-secrets\") pod \"image-registry-6c98b6bc6c-j57rh\" (UID: \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\") " pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh" Apr 22 17:37:11.308646 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.308545 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65f4e6af-46be-4594-b3fb-54dd4fac761b-serving-cert\") pod \"insights-operator-585dfdc468-t45p4\" (UID: \"65f4e6af-46be-4594-b3fb-54dd4fac761b\") " pod="openshift-insights/insights-operator-585dfdc468-t45p4" Apr 22 17:37:11.308646 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.308561 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4jkl\" (UniqueName: \"kubernetes.io/projected/65f4e6af-46be-4594-b3fb-54dd4fac761b-kube-api-access-q4jkl\") pod \"insights-operator-585dfdc468-t45p4\" (UID: \"65f4e6af-46be-4594-b3fb-54dd4fac761b\") " pod="openshift-insights/insights-operator-585dfdc468-t45p4" Apr 22 17:37:11.308801 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.308645 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-image-registry-private-configuration\") pod \"image-registry-6c98b6bc6c-j57rh\" (UID: \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\") " pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh" Apr 22 17:37:11.308801 
ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.308680 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-registry-tls\") pod \"image-registry-6c98b6bc6c-j57rh\" (UID: \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\") " pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh" Apr 22 17:37:11.308801 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.308707 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sx5k\" (UniqueName: \"kubernetes.io/projected/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-kube-api-access-9sx5k\") pod \"image-registry-6c98b6bc6c-j57rh\" (UID: \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\") " pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh" Apr 22 17:37:11.308801 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.308740 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/65f4e6af-46be-4594-b3fb-54dd4fac761b-snapshots\") pod \"insights-operator-585dfdc468-t45p4\" (UID: \"65f4e6af-46be-4594-b3fb-54dd4fac761b\") " pod="openshift-insights/insights-operator-585dfdc468-t45p4" Apr 22 17:37:11.309014 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.308798 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-registry-certificates\") pod \"image-registry-6c98b6bc6c-j57rh\" (UID: \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\") " pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh" Apr 22 17:37:11.309014 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.308827 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/65f4e6af-46be-4594-b3fb-54dd4fac761b-service-ca-bundle\") pod \"insights-operator-585dfdc468-t45p4\" (UID: \"65f4e6af-46be-4594-b3fb-54dd4fac761b\") " pod="openshift-insights/insights-operator-585dfdc468-t45p4" Apr 22 17:37:11.309014 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.308857 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-ca-trust-extracted\") pod \"image-registry-6c98b6bc6c-j57rh\" (UID: \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\") " pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh" Apr 22 17:37:11.309014 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.308883 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-bound-sa-token\") pod \"image-registry-6c98b6bc6c-j57rh\" (UID: \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\") " pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh" Apr 22 17:37:11.379097 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.379075 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7b7cbdcdbb-cllk8"] Apr 22 17:37:11.385503 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.385483 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-7b7cbdcdbb-cllk8" Apr 22 17:37:11.387770 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.387754 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 22 17:37:11.388164 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.388133 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 22 17:37:11.388164 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.388161 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 22 17:37:11.388310 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.388161 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 22 17:37:11.388310 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.388257 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 22 17:37:11.388494 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.388448 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-rfd76\"" Apr 22 17:37:11.388616 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.388601 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 22 17:37:11.398248 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.398229 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7b7cbdcdbb-cllk8"] Apr 22 17:37:11.409548 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.409528 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/65f4e6af-46be-4594-b3fb-54dd4fac761b-tmp\") pod \"insights-operator-585dfdc468-t45p4\" (UID: \"65f4e6af-46be-4594-b3fb-54dd4fac761b\") " pod="openshift-insights/insights-operator-585dfdc468-t45p4" Apr 22 17:37:11.409640 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.409556 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c64def61-a93f-45f9-a89f-c469f37561b6-trusted-ca\") pod \"console-operator-9d4b6777b-lmv9z\" (UID: \"c64def61-a93f-45f9-a89f-c469f37561b6\") " pod="openshift-console-operator/console-operator-9d4b6777b-lmv9z" Apr 22 17:37:11.409640 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.409573 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glmm5\" (UniqueName: \"kubernetes.io/projected/c64def61-a93f-45f9-a89f-c469f37561b6-kube-api-access-glmm5\") pod \"console-operator-9d4b6777b-lmv9z\" (UID: \"c64def61-a93f-45f9-a89f-c469f37561b6\") " pod="openshift-console-operator/console-operator-9d4b6777b-lmv9z" Apr 22 17:37:11.409640 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.409608 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65f4e6af-46be-4594-b3fb-54dd4fac761b-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-t45p4\" (UID: \"65f4e6af-46be-4594-b3fb-54dd4fac761b\") " pod="openshift-insights/insights-operator-585dfdc468-t45p4" Apr 22 17:37:11.409640 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.409638 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-installation-pull-secrets\") pod \"image-registry-6c98b6bc6c-j57rh\" (UID: \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\") " 
pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh" Apr 22 17:37:11.409846 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.409655 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65f4e6af-46be-4594-b3fb-54dd4fac761b-serving-cert\") pod \"insights-operator-585dfdc468-t45p4\" (UID: \"65f4e6af-46be-4594-b3fb-54dd4fac761b\") " pod="openshift-insights/insights-operator-585dfdc468-t45p4" Apr 22 17:37:11.409846 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.409670 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4jkl\" (UniqueName: \"kubernetes.io/projected/65f4e6af-46be-4594-b3fb-54dd4fac761b-kube-api-access-q4jkl\") pod \"insights-operator-585dfdc468-t45p4\" (UID: \"65f4e6af-46be-4594-b3fb-54dd4fac761b\") " pod="openshift-insights/insights-operator-585dfdc468-t45p4" Apr 22 17:37:11.409846 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.409687 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c64def61-a93f-45f9-a89f-c469f37561b6-serving-cert\") pod \"console-operator-9d4b6777b-lmv9z\" (UID: \"c64def61-a93f-45f9-a89f-c469f37561b6\") " pod="openshift-console-operator/console-operator-9d4b6777b-lmv9z" Apr 22 17:37:11.409846 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.409703 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grh6s\" (UniqueName: \"kubernetes.io/projected/2f845b61-dcdc-4d78-b24e-77b96b4792a1-kube-api-access-grh6s\") pod \"kube-storage-version-migrator-operator-6769c5d45-gcd2q\" (UID: \"2f845b61-dcdc-4d78-b24e-77b96b4792a1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gcd2q" Apr 22 17:37:11.409846 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.409745 2539 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f845b61-dcdc-4d78-b24e-77b96b4792a1-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-gcd2q\" (UID: \"2f845b61-dcdc-4d78-b24e-77b96b4792a1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gcd2q" Apr 22 17:37:11.410050 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.409860 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-registry-tls\") pod \"image-registry-6c98b6bc6c-j57rh\" (UID: \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\") " pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh" Apr 22 17:37:11.410050 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.409889 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bd2c2ae-f38a-4267-8f67-bcab90c4db21-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fsgtm\" (UID: \"3bd2c2ae-f38a-4267-8f67-bcab90c4db21\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsgtm" Apr 22 17:37:11.410050 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.409928 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/65f4e6af-46be-4594-b3fb-54dd4fac761b-tmp\") pod \"insights-operator-585dfdc468-t45p4\" (UID: \"65f4e6af-46be-4594-b3fb-54dd4fac761b\") " pod="openshift-insights/insights-operator-585dfdc468-t45p4" Apr 22 17:37:11.410050 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.409931 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm5c9\" (UniqueName: 
\"kubernetes.io/projected/3bd2c2ae-f38a-4267-8f67-bcab90c4db21-kube-api-access-qm5c9\") pod \"cluster-samples-operator-6dc5bdb6b4-fsgtm\" (UID: \"3bd2c2ae-f38a-4267-8f67-bcab90c4db21\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsgtm" Apr 22 17:37:11.410050 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.409976 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9sx5k\" (UniqueName: \"kubernetes.io/projected/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-kube-api-access-9sx5k\") pod \"image-registry-6c98b6bc6c-j57rh\" (UID: \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\") " pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh" Apr 22 17:37:11.410050 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.410003 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/65f4e6af-46be-4594-b3fb-54dd4fac761b-snapshots\") pod \"insights-operator-585dfdc468-t45p4\" (UID: \"65f4e6af-46be-4594-b3fb-54dd4fac761b\") " pod="openshift-insights/insights-operator-585dfdc468-t45p4" Apr 22 17:37:11.410050 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:11.409978 2539 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:37:11.410050 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:11.410025 2539 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c98b6bc6c-j57rh: secret "image-registry-tls" not found Apr 22 17:37:11.410050 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.410030 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65f4e6af-46be-4594-b3fb-54dd4fac761b-service-ca-bundle\") pod \"insights-operator-585dfdc468-t45p4\" (UID: \"65f4e6af-46be-4594-b3fb-54dd4fac761b\") " 
pod="openshift-insights/insights-operator-585dfdc468-t45p4" Apr 22 17:37:11.410386 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.410057 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv72d\" (UniqueName: \"kubernetes.io/projected/26cdbfea-d567-4835-9ba0-b68b5e295c7a-kube-api-access-pv72d\") pod \"volume-data-source-validator-7c6cbb6c87-bk65x\" (UID: \"26cdbfea-d567-4835-9ba0-b68b5e295c7a\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bk65x" Apr 22 17:37:11.410386 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:11.410072 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-registry-tls podName:02ac9a15-99e8-4bc0-bb94-79c3df1bdde4 nodeName:}" failed. No retries permitted until 2026-04-22 17:37:11.910058447 +0000 UTC m=+140.666142998 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-registry-tls") pod "image-registry-6c98b6bc6c-j57rh" (UID: "02ac9a15-99e8-4bc0-bb94-79c3df1bdde4") : secret "image-registry-tls" not found Apr 22 17:37:11.410386 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.410191 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-ca-trust-extracted\") pod \"image-registry-6c98b6bc6c-j57rh\" (UID: \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\") " pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh" Apr 22 17:37:11.410386 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.410217 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-bound-sa-token\") pod \"image-registry-6c98b6bc6c-j57rh\" (UID: 
\"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\") " pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh" Apr 22 17:37:11.410386 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.410238 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-trusted-ca\") pod \"image-registry-6c98b6bc6c-j57rh\" (UID: \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\") " pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh" Apr 22 17:37:11.410386 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.410366 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f845b61-dcdc-4d78-b24e-77b96b4792a1-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-gcd2q\" (UID: \"2f845b61-dcdc-4d78-b24e-77b96b4792a1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gcd2q" Apr 22 17:37:11.410701 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.410412 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-image-registry-private-configuration\") pod \"image-registry-6c98b6bc6c-j57rh\" (UID: \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\") " pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh" Apr 22 17:37:11.410701 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.410449 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-registry-certificates\") pod \"image-registry-6c98b6bc6c-j57rh\" (UID: \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\") " pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh" Apr 22 17:37:11.410701 ip-10-0-132-165 
kubenswrapper[2539]: I0422 17:37:11.410491 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c64def61-a93f-45f9-a89f-c469f37561b6-config\") pod \"console-operator-9d4b6777b-lmv9z\" (UID: \"c64def61-a93f-45f9-a89f-c469f37561b6\") " pod="openshift-console-operator/console-operator-9d4b6777b-lmv9z" Apr 22 17:37:11.410701 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.410592 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-ca-trust-extracted\") pod \"image-registry-6c98b6bc6c-j57rh\" (UID: \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\") " pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh" Apr 22 17:37:11.410701 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.410632 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/65f4e6af-46be-4594-b3fb-54dd4fac761b-snapshots\") pod \"insights-operator-585dfdc468-t45p4\" (UID: \"65f4e6af-46be-4594-b3fb-54dd4fac761b\") " pod="openshift-insights/insights-operator-585dfdc468-t45p4" Apr 22 17:37:11.410961 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.410946 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65f4e6af-46be-4594-b3fb-54dd4fac761b-service-ca-bundle\") pod \"insights-operator-585dfdc468-t45p4\" (UID: \"65f4e6af-46be-4594-b3fb-54dd4fac761b\") " pod="openshift-insights/insights-operator-585dfdc468-t45p4" Apr 22 17:37:11.411138 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.411119 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65f4e6af-46be-4594-b3fb-54dd4fac761b-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-t45p4\" (UID: 
\"65f4e6af-46be-4594-b3fb-54dd4fac761b\") " pod="openshift-insights/insights-operator-585dfdc468-t45p4" Apr 22 17:37:11.411595 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.411570 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-registry-certificates\") pod \"image-registry-6c98b6bc6c-j57rh\" (UID: \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\") " pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh" Apr 22 17:37:11.411825 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.411805 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-trusted-ca\") pod \"image-registry-6c98b6bc6c-j57rh\" (UID: \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\") " pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh" Apr 22 17:37:11.412328 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.412310 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-installation-pull-secrets\") pod \"image-registry-6c98b6bc6c-j57rh\" (UID: \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\") " pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh" Apr 22 17:37:11.412650 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.412630 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65f4e6af-46be-4594-b3fb-54dd4fac761b-serving-cert\") pod \"insights-operator-585dfdc468-t45p4\" (UID: \"65f4e6af-46be-4594-b3fb-54dd4fac761b\") " pod="openshift-insights/insights-operator-585dfdc468-t45p4" Apr 22 17:37:11.413259 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.413242 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-image-registry-private-configuration\") pod \"image-registry-6c98b6bc6c-j57rh\" (UID: \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\") " pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh" Apr 22 17:37:11.422333 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.422312 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-bound-sa-token\") pod \"image-registry-6c98b6bc6c-j57rh\" (UID: \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\") " pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh" Apr 22 17:37:11.422410 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.422312 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4jkl\" (UniqueName: \"kubernetes.io/projected/65f4e6af-46be-4594-b3fb-54dd4fac761b-kube-api-access-q4jkl\") pod \"insights-operator-585dfdc468-t45p4\" (UID: \"65f4e6af-46be-4594-b3fb-54dd4fac761b\") " pod="openshift-insights/insights-operator-585dfdc468-t45p4" Apr 22 17:37:11.422837 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.422815 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sx5k\" (UniqueName: \"kubernetes.io/projected/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-kube-api-access-9sx5k\") pod \"image-registry-6c98b6bc6c-j57rh\" (UID: \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\") " pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh" Apr 22 17:37:11.489792 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.489763 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-t45p4" Apr 22 17:37:11.511724 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.511696 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/961e6502-070a-4f99-9ed6-a2a445aefbb3-metrics-certs\") pod \"router-default-7b7cbdcdbb-cllk8\" (UID: \"961e6502-070a-4f99-9ed6-a2a445aefbb3\") " pod="openshift-ingress/router-default-7b7cbdcdbb-cllk8" Apr 22 17:37:11.511812 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.511745 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zqdh\" (UniqueName: \"kubernetes.io/projected/961e6502-070a-4f99-9ed6-a2a445aefbb3-kube-api-access-8zqdh\") pod \"router-default-7b7cbdcdbb-cllk8\" (UID: \"961e6502-070a-4f99-9ed6-a2a445aefbb3\") " pod="openshift-ingress/router-default-7b7cbdcdbb-cllk8" Apr 22 17:37:11.511812 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.511797 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/961e6502-070a-4f99-9ed6-a2a445aefbb3-service-ca-bundle\") pod \"router-default-7b7cbdcdbb-cllk8\" (UID: \"961e6502-070a-4f99-9ed6-a2a445aefbb3\") " pod="openshift-ingress/router-default-7b7cbdcdbb-cllk8" Apr 22 17:37:11.511895 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.511825 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/961e6502-070a-4f99-9ed6-a2a445aefbb3-stats-auth\") pod \"router-default-7b7cbdcdbb-cllk8\" (UID: \"961e6502-070a-4f99-9ed6-a2a445aefbb3\") " pod="openshift-ingress/router-default-7b7cbdcdbb-cllk8" Apr 22 17:37:11.511895 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.511854 2539 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f845b61-dcdc-4d78-b24e-77b96b4792a1-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-gcd2q\" (UID: \"2f845b61-dcdc-4d78-b24e-77b96b4792a1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gcd2q" Apr 22 17:37:11.511895 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.511892 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c64def61-a93f-45f9-a89f-c469f37561b6-config\") pod \"console-operator-9d4b6777b-lmv9z\" (UID: \"c64def61-a93f-45f9-a89f-c469f37561b6\") " pod="openshift-console-operator/console-operator-9d4b6777b-lmv9z" Apr 22 17:37:11.512076 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.511937 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/961e6502-070a-4f99-9ed6-a2a445aefbb3-default-certificate\") pod \"router-default-7b7cbdcdbb-cllk8\" (UID: \"961e6502-070a-4f99-9ed6-a2a445aefbb3\") " pod="openshift-ingress/router-default-7b7cbdcdbb-cllk8" Apr 22 17:37:11.512076 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.511973 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c64def61-a93f-45f9-a89f-c469f37561b6-trusted-ca\") pod \"console-operator-9d4b6777b-lmv9z\" (UID: \"c64def61-a93f-45f9-a89f-c469f37561b6\") " pod="openshift-console-operator/console-operator-9d4b6777b-lmv9z" Apr 22 17:37:11.512076 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.511998 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-glmm5\" (UniqueName: \"kubernetes.io/projected/c64def61-a93f-45f9-a89f-c469f37561b6-kube-api-access-glmm5\") pod \"console-operator-9d4b6777b-lmv9z\" (UID: 
\"c64def61-a93f-45f9-a89f-c469f37561b6\") " pod="openshift-console-operator/console-operator-9d4b6777b-lmv9z" Apr 22 17:37:11.512227 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.512181 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c64def61-a93f-45f9-a89f-c469f37561b6-serving-cert\") pod \"console-operator-9d4b6777b-lmv9z\" (UID: \"c64def61-a93f-45f9-a89f-c469f37561b6\") " pod="openshift-console-operator/console-operator-9d4b6777b-lmv9z" Apr 22 17:37:11.512287 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.512212 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grh6s\" (UniqueName: \"kubernetes.io/projected/2f845b61-dcdc-4d78-b24e-77b96b4792a1-kube-api-access-grh6s\") pod \"kube-storage-version-migrator-operator-6769c5d45-gcd2q\" (UID: \"2f845b61-dcdc-4d78-b24e-77b96b4792a1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gcd2q" Apr 22 17:37:11.512287 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.512259 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f845b61-dcdc-4d78-b24e-77b96b4792a1-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-gcd2q\" (UID: \"2f845b61-dcdc-4d78-b24e-77b96b4792a1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gcd2q" Apr 22 17:37:11.512591 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.512556 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bd2c2ae-f38a-4267-8f67-bcab90c4db21-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fsgtm\" (UID: \"3bd2c2ae-f38a-4267-8f67-bcab90c4db21\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsgtm" 
Apr 22 17:37:11.512704 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.512593 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c64def61-a93f-45f9-a89f-c469f37561b6-config\") pod \"console-operator-9d4b6777b-lmv9z\" (UID: \"c64def61-a93f-45f9-a89f-c469f37561b6\") " pod="openshift-console-operator/console-operator-9d4b6777b-lmv9z" Apr 22 17:37:11.512704 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.512629 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qm5c9\" (UniqueName: \"kubernetes.io/projected/3bd2c2ae-f38a-4267-8f67-bcab90c4db21-kube-api-access-qm5c9\") pod \"cluster-samples-operator-6dc5bdb6b4-fsgtm\" (UID: \"3bd2c2ae-f38a-4267-8f67-bcab90c4db21\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsgtm" Apr 22 17:37:11.512704 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:11.512680 2539 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 17:37:11.512983 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:11.512736 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bd2c2ae-f38a-4267-8f67-bcab90c4db21-samples-operator-tls podName:3bd2c2ae-f38a-4267-8f67-bcab90c4db21 nodeName:}" failed. No retries permitted until 2026-04-22 17:37:12.012717856 +0000 UTC m=+140.768802421 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3bd2c2ae-f38a-4267-8f67-bcab90c4db21-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fsgtm" (UID: "3bd2c2ae-f38a-4267-8f67-bcab90c4db21") : secret "samples-operator-tls" not found Apr 22 17:37:11.512983 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.512870 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f845b61-dcdc-4d78-b24e-77b96b4792a1-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-gcd2q\" (UID: \"2f845b61-dcdc-4d78-b24e-77b96b4792a1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gcd2q" Apr 22 17:37:11.512983 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.512667 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pv72d\" (UniqueName: \"kubernetes.io/projected/26cdbfea-d567-4835-9ba0-b68b5e295c7a-kube-api-access-pv72d\") pod \"volume-data-source-validator-7c6cbb6c87-bk65x\" (UID: \"26cdbfea-d567-4835-9ba0-b68b5e295c7a\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bk65x" Apr 22 17:37:11.513174 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.513042 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c64def61-a93f-45f9-a89f-c469f37561b6-trusted-ca\") pod \"console-operator-9d4b6777b-lmv9z\" (UID: \"c64def61-a93f-45f9-a89f-c469f37561b6\") " pod="openshift-console-operator/console-operator-9d4b6777b-lmv9z" Apr 22 17:37:11.514117 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.514099 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f845b61-dcdc-4d78-b24e-77b96b4792a1-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-gcd2q\" (UID: 
\"2f845b61-dcdc-4d78-b24e-77b96b4792a1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gcd2q" Apr 22 17:37:11.514460 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.514441 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c64def61-a93f-45f9-a89f-c469f37561b6-serving-cert\") pod \"console-operator-9d4b6777b-lmv9z\" (UID: \"c64def61-a93f-45f9-a89f-c469f37561b6\") " pod="openshift-console-operator/console-operator-9d4b6777b-lmv9z" Apr 22 17:37:11.522860 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.522833 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-grh6s\" (UniqueName: \"kubernetes.io/projected/2f845b61-dcdc-4d78-b24e-77b96b4792a1-kube-api-access-grh6s\") pod \"kube-storage-version-migrator-operator-6769c5d45-gcd2q\" (UID: \"2f845b61-dcdc-4d78-b24e-77b96b4792a1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gcd2q" Apr 22 17:37:11.523413 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.523385 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv72d\" (UniqueName: \"kubernetes.io/projected/26cdbfea-d567-4835-9ba0-b68b5e295c7a-kube-api-access-pv72d\") pod \"volume-data-source-validator-7c6cbb6c87-bk65x\" (UID: \"26cdbfea-d567-4835-9ba0-b68b5e295c7a\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bk65x" Apr 22 17:37:11.523512 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.523489 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm5c9\" (UniqueName: \"kubernetes.io/projected/3bd2c2ae-f38a-4267-8f67-bcab90c4db21-kube-api-access-qm5c9\") pod \"cluster-samples-operator-6dc5bdb6b4-fsgtm\" (UID: \"3bd2c2ae-f38a-4267-8f67-bcab90c4db21\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsgtm" Apr 22 17:37:11.523705 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.523684 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-glmm5\" (UniqueName: \"kubernetes.io/projected/c64def61-a93f-45f9-a89f-c469f37561b6-kube-api-access-glmm5\") pod \"console-operator-9d4b6777b-lmv9z\" (UID: \"c64def61-a93f-45f9-a89f-c469f37561b6\") " pod="openshift-console-operator/console-operator-9d4b6777b-lmv9z" Apr 22 17:37:11.590728 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.590703 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bk65x" Apr 22 17:37:11.597398 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.597378 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gcd2q" Apr 22 17:37:11.603396 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.603373 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-t45p4"] Apr 22 17:37:11.605678 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:37:11.605611 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65f4e6af_46be_4594_b3fb_54dd4fac761b.slice/crio-7de65154e3daed3a337cb16d577deac147e0904daf1cea2f7461dd35e2e63ef6 WatchSource:0}: Error finding container 7de65154e3daed3a337cb16d577deac147e0904daf1cea2f7461dd35e2e63ef6: Status 404 returned error can't find the container with id 7de65154e3daed3a337cb16d577deac147e0904daf1cea2f7461dd35e2e63ef6 Apr 22 17:37:11.609874 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.609733 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-lmv9z" Apr 22 17:37:11.613909 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.613874 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/961e6502-070a-4f99-9ed6-a2a445aefbb3-metrics-certs\") pod \"router-default-7b7cbdcdbb-cllk8\" (UID: \"961e6502-070a-4f99-9ed6-a2a445aefbb3\") " pod="openshift-ingress/router-default-7b7cbdcdbb-cllk8" Apr 22 17:37:11.613977 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.613938 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8zqdh\" (UniqueName: \"kubernetes.io/projected/961e6502-070a-4f99-9ed6-a2a445aefbb3-kube-api-access-8zqdh\") pod \"router-default-7b7cbdcdbb-cllk8\" (UID: \"961e6502-070a-4f99-9ed6-a2a445aefbb3\") " pod="openshift-ingress/router-default-7b7cbdcdbb-cllk8" Apr 22 17:37:11.614020 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.613979 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/961e6502-070a-4f99-9ed6-a2a445aefbb3-service-ca-bundle\") pod \"router-default-7b7cbdcdbb-cllk8\" (UID: \"961e6502-070a-4f99-9ed6-a2a445aefbb3\") " pod="openshift-ingress/router-default-7b7cbdcdbb-cllk8" Apr 22 17:37:11.614020 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:11.613986 2539 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 17:37:11.614020 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.614008 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/961e6502-070a-4f99-9ed6-a2a445aefbb3-stats-auth\") pod \"router-default-7b7cbdcdbb-cllk8\" (UID: \"961e6502-070a-4f99-9ed6-a2a445aefbb3\") " 
pod="openshift-ingress/router-default-7b7cbdcdbb-cllk8" Apr 22 17:37:11.614114 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:11.614058 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/961e6502-070a-4f99-9ed6-a2a445aefbb3-metrics-certs podName:961e6502-070a-4f99-9ed6-a2a445aefbb3 nodeName:}" failed. No retries permitted until 2026-04-22 17:37:12.114042373 +0000 UTC m=+140.870126929 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/961e6502-070a-4f99-9ed6-a2a445aefbb3-metrics-certs") pod "router-default-7b7cbdcdbb-cllk8" (UID: "961e6502-070a-4f99-9ed6-a2a445aefbb3") : secret "router-metrics-certs-default" not found Apr 22 17:37:11.614114 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.614077 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/961e6502-070a-4f99-9ed6-a2a445aefbb3-default-certificate\") pod \"router-default-7b7cbdcdbb-cllk8\" (UID: \"961e6502-070a-4f99-9ed6-a2a445aefbb3\") " pod="openshift-ingress/router-default-7b7cbdcdbb-cllk8" Apr 22 17:37:11.614694 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:11.614667 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/961e6502-070a-4f99-9ed6-a2a445aefbb3-service-ca-bundle podName:961e6502-070a-4f99-9ed6-a2a445aefbb3 nodeName:}" failed. No retries permitted until 2026-04-22 17:37:12.114627611 +0000 UTC m=+140.870712177 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/961e6502-070a-4f99-9ed6-a2a445aefbb3-service-ca-bundle") pod "router-default-7b7cbdcdbb-cllk8" (UID: "961e6502-070a-4f99-9ed6-a2a445aefbb3") : configmap references non-existent config key: service-ca.crt Apr 22 17:37:11.617348 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.617326 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/961e6502-070a-4f99-9ed6-a2a445aefbb3-default-certificate\") pod \"router-default-7b7cbdcdbb-cllk8\" (UID: \"961e6502-070a-4f99-9ed6-a2a445aefbb3\") " pod="openshift-ingress/router-default-7b7cbdcdbb-cllk8" Apr 22 17:37:11.617519 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.617368 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/961e6502-070a-4f99-9ed6-a2a445aefbb3-stats-auth\") pod \"router-default-7b7cbdcdbb-cllk8\" (UID: \"961e6502-070a-4f99-9ed6-a2a445aefbb3\") " pod="openshift-ingress/router-default-7b7cbdcdbb-cllk8" Apr 22 17:37:11.625004 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.624975 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zqdh\" (UniqueName: \"kubernetes.io/projected/961e6502-070a-4f99-9ed6-a2a445aefbb3-kube-api-access-8zqdh\") pod \"router-default-7b7cbdcdbb-cllk8\" (UID: \"961e6502-070a-4f99-9ed6-a2a445aefbb3\") " pod="openshift-ingress/router-default-7b7cbdcdbb-cllk8" Apr 22 17:37:11.722639 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.722605 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bk65x"] Apr 22 17:37:11.725808 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:37:11.725778 2539 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26cdbfea_d567_4835_9ba0_b68b5e295c7a.slice/crio-bc91cab8b44e196734a0880bf163332f430a283156882379145b54e1593c9a2d WatchSource:0}: Error finding container bc91cab8b44e196734a0880bf163332f430a283156882379145b54e1593c9a2d: Status 404 returned error can't find the container with id bc91cab8b44e196734a0880bf163332f430a283156882379145b54e1593c9a2d Apr 22 17:37:11.737738 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.737717 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gcd2q"] Apr 22 17:37:11.740671 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:37:11.740606 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f845b61_dcdc_4d78_b24e_77b96b4792a1.slice/crio-096b4aea5d49fd232bcbe24154e2b714e61aa76c7196388b218988f7d504effa WatchSource:0}: Error finding container 096b4aea5d49fd232bcbe24154e2b714e61aa76c7196388b218988f7d504effa: Status 404 returned error can't find the container with id 096b4aea5d49fd232bcbe24154e2b714e61aa76c7196388b218988f7d504effa Apr 22 17:37:11.752552 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.752531 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-lmv9z"] Apr 22 17:37:11.754887 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:37:11.754862 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc64def61_a93f_45f9_a89f_c469f37561b6.slice/crio-a9d6d6211568beeadfd2ab5010ffd7232122451b632a3bb6916574d00e6f2b33 WatchSource:0}: Error finding container a9d6d6211568beeadfd2ab5010ffd7232122451b632a3bb6916574d00e6f2b33: Status 404 returned error can't find the container with id a9d6d6211568beeadfd2ab5010ffd7232122451b632a3bb6916574d00e6f2b33 Apr 22 
17:37:11.916841 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:11.916812 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-registry-tls\") pod \"image-registry-6c98b6bc6c-j57rh\" (UID: \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\") " pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh" Apr 22 17:37:11.917016 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:11.916969 2539 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:37:11.917016 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:11.916986 2539 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c98b6bc6c-j57rh: secret "image-registry-tls" not found Apr 22 17:37:11.917094 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:11.917037 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-registry-tls podName:02ac9a15-99e8-4bc0-bb94-79c3df1bdde4 nodeName:}" failed. No retries permitted until 2026-04-22 17:37:12.917018949 +0000 UTC m=+141.673103520 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-registry-tls") pod "image-registry-6c98b6bc6c-j57rh" (UID: "02ac9a15-99e8-4bc0-bb94-79c3df1bdde4") : secret "image-registry-tls" not found Apr 22 17:37:12.017859 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:12.017793 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bd2c2ae-f38a-4267-8f67-bcab90c4db21-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fsgtm\" (UID: \"3bd2c2ae-f38a-4267-8f67-bcab90c4db21\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsgtm" Apr 22 17:37:12.018021 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:12.017966 2539 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 17:37:12.018084 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:12.018027 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bd2c2ae-f38a-4267-8f67-bcab90c4db21-samples-operator-tls podName:3bd2c2ae-f38a-4267-8f67-bcab90c4db21 nodeName:}" failed. No retries permitted until 2026-04-22 17:37:13.018012269 +0000 UTC m=+141.774096820 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3bd2c2ae-f38a-4267-8f67-bcab90c4db21-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fsgtm" (UID: "3bd2c2ae-f38a-4267-8f67-bcab90c4db21") : secret "samples-operator-tls" not found Apr 22 17:37:12.119189 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:12.119162 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/961e6502-070a-4f99-9ed6-a2a445aefbb3-metrics-certs\") pod \"router-default-7b7cbdcdbb-cllk8\" (UID: \"961e6502-070a-4f99-9ed6-a2a445aefbb3\") " pod="openshift-ingress/router-default-7b7cbdcdbb-cllk8" Apr 22 17:37:12.119352 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:12.119209 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/961e6502-070a-4f99-9ed6-a2a445aefbb3-service-ca-bundle\") pod \"router-default-7b7cbdcdbb-cllk8\" (UID: \"961e6502-070a-4f99-9ed6-a2a445aefbb3\") " pod="openshift-ingress/router-default-7b7cbdcdbb-cllk8" Apr 22 17:37:12.119352 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:12.119330 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/961e6502-070a-4f99-9ed6-a2a445aefbb3-service-ca-bundle podName:961e6502-070a-4f99-9ed6-a2a445aefbb3 nodeName:}" failed. No retries permitted until 2026-04-22 17:37:13.119314984 +0000 UTC m=+141.875399555 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/961e6502-070a-4f99-9ed6-a2a445aefbb3-service-ca-bundle") pod "router-default-7b7cbdcdbb-cllk8" (UID: "961e6502-070a-4f99-9ed6-a2a445aefbb3") : configmap references non-existent config key: service-ca.crt Apr 22 17:37:12.119691 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:12.119308 2539 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 17:37:12.119990 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:12.119963 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/961e6502-070a-4f99-9ed6-a2a445aefbb3-metrics-certs podName:961e6502-070a-4f99-9ed6-a2a445aefbb3 nodeName:}" failed. No retries permitted until 2026-04-22 17:37:13.119934526 +0000 UTC m=+141.876019090 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/961e6502-070a-4f99-9ed6-a2a445aefbb3-metrics-certs") pod "router-default-7b7cbdcdbb-cllk8" (UID: "961e6502-070a-4f99-9ed6-a2a445aefbb3") : secret "router-metrics-certs-default" not found Apr 22 17:37:12.305627 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:12.305498 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-lmv9z" event={"ID":"c64def61-a93f-45f9-a89f-c469f37561b6","Type":"ContainerStarted","Data":"a9d6d6211568beeadfd2ab5010ffd7232122451b632a3bb6916574d00e6f2b33"} Apr 22 17:37:12.307786 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:12.307726 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gcd2q" event={"ID":"2f845b61-dcdc-4d78-b24e-77b96b4792a1","Type":"ContainerStarted","Data":"096b4aea5d49fd232bcbe24154e2b714e61aa76c7196388b218988f7d504effa"} Apr 22 17:37:12.309027 ip-10-0-132-165 
kubenswrapper[2539]: I0422 17:37:12.308975 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bk65x" event={"ID":"26cdbfea-d567-4835-9ba0-b68b5e295c7a","Type":"ContainerStarted","Data":"bc91cab8b44e196734a0880bf163332f430a283156882379145b54e1593c9a2d"} Apr 22 17:37:12.310137 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:12.310096 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-t45p4" event={"ID":"65f4e6af-46be-4594-b3fb-54dd4fac761b","Type":"ContainerStarted","Data":"7de65154e3daed3a337cb16d577deac147e0904daf1cea2f7461dd35e2e63ef6"} Apr 22 17:37:12.926757 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:12.926717 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-registry-tls\") pod \"image-registry-6c98b6bc6c-j57rh\" (UID: \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\") " pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh" Apr 22 17:37:12.926949 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:12.926890 2539 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:37:12.926949 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:12.926927 2539 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c98b6bc6c-j57rh: secret "image-registry-tls" not found Apr 22 17:37:12.927048 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:12.927001 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-registry-tls podName:02ac9a15-99e8-4bc0-bb94-79c3df1bdde4 nodeName:}" failed. No retries permitted until 2026-04-22 17:37:14.92697964 +0000 UTC m=+143.683064206 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-registry-tls") pod "image-registry-6c98b6bc6c-j57rh" (UID: "02ac9a15-99e8-4bc0-bb94-79c3df1bdde4") : secret "image-registry-tls" not found Apr 22 17:37:13.027302 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:13.027263 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bd2c2ae-f38a-4267-8f67-bcab90c4db21-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fsgtm\" (UID: \"3bd2c2ae-f38a-4267-8f67-bcab90c4db21\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsgtm" Apr 22 17:37:13.027554 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:13.027513 2539 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 17:37:13.027676 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:13.027608 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bd2c2ae-f38a-4267-8f67-bcab90c4db21-samples-operator-tls podName:3bd2c2ae-f38a-4267-8f67-bcab90c4db21 nodeName:}" failed. No retries permitted until 2026-04-22 17:37:15.027585454 +0000 UTC m=+143.783670008 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3bd2c2ae-f38a-4267-8f67-bcab90c4db21-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fsgtm" (UID: "3bd2c2ae-f38a-4267-8f67-bcab90c4db21") : secret "samples-operator-tls" not found Apr 22 17:37:13.128399 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:13.128206 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/961e6502-070a-4f99-9ed6-a2a445aefbb3-metrics-certs\") pod \"router-default-7b7cbdcdbb-cllk8\" (UID: \"961e6502-070a-4f99-9ed6-a2a445aefbb3\") " pod="openshift-ingress/router-default-7b7cbdcdbb-cllk8" Apr 22 17:37:13.128399 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:13.128273 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/961e6502-070a-4f99-9ed6-a2a445aefbb3-service-ca-bundle\") pod \"router-default-7b7cbdcdbb-cllk8\" (UID: \"961e6502-070a-4f99-9ed6-a2a445aefbb3\") " pod="openshift-ingress/router-default-7b7cbdcdbb-cllk8" Apr 22 17:37:13.128399 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:13.128369 2539 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 17:37:13.128676 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:13.128443 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/961e6502-070a-4f99-9ed6-a2a445aefbb3-metrics-certs podName:961e6502-070a-4f99-9ed6-a2a445aefbb3 nodeName:}" failed. No retries permitted until 2026-04-22 17:37:15.128422335 +0000 UTC m=+143.884506901 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/961e6502-070a-4f99-9ed6-a2a445aefbb3-metrics-certs") pod "router-default-7b7cbdcdbb-cllk8" (UID: "961e6502-070a-4f99-9ed6-a2a445aefbb3") : secret "router-metrics-certs-default" not found Apr 22 17:37:13.128676 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:13.128462 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/961e6502-070a-4f99-9ed6-a2a445aefbb3-service-ca-bundle podName:961e6502-070a-4f99-9ed6-a2a445aefbb3 nodeName:}" failed. No retries permitted until 2026-04-22 17:37:15.128452915 +0000 UTC m=+143.884537476 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/961e6502-070a-4f99-9ed6-a2a445aefbb3-service-ca-bundle") pod "router-default-7b7cbdcdbb-cllk8" (UID: "961e6502-070a-4f99-9ed6-a2a445aefbb3") : configmap references non-existent config key: service-ca.crt Apr 22 17:37:14.942637 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:14.942578 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-registry-tls\") pod \"image-registry-6c98b6bc6c-j57rh\" (UID: \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\") " pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh" Apr 22 17:37:14.942933 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:14.942743 2539 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:37:14.942933 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:14.942765 2539 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c98b6bc6c-j57rh: secret "image-registry-tls" not found Apr 22 17:37:14.942933 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:14.942831 2539 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-registry-tls podName:02ac9a15-99e8-4bc0-bb94-79c3df1bdde4 nodeName:}" failed. No retries permitted until 2026-04-22 17:37:18.942811288 +0000 UTC m=+147.698895842 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-registry-tls") pod "image-registry-6c98b6bc6c-j57rh" (UID: "02ac9a15-99e8-4bc0-bb94-79c3df1bdde4") : secret "image-registry-tls" not found Apr 22 17:37:15.043315 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:15.043286 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bd2c2ae-f38a-4267-8f67-bcab90c4db21-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fsgtm\" (UID: \"3bd2c2ae-f38a-4267-8f67-bcab90c4db21\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsgtm" Apr 22 17:37:15.043457 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:15.043438 2539 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 17:37:15.043517 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:15.043508 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bd2c2ae-f38a-4267-8f67-bcab90c4db21-samples-operator-tls podName:3bd2c2ae-f38a-4267-8f67-bcab90c4db21 nodeName:}" failed. No retries permitted until 2026-04-22 17:37:19.043489151 +0000 UTC m=+147.799573708 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3bd2c2ae-f38a-4267-8f67-bcab90c4db21-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fsgtm" (UID: "3bd2c2ae-f38a-4267-8f67-bcab90c4db21") : secret "samples-operator-tls" not found Apr 22 17:37:15.144090 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:15.144065 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/961e6502-070a-4f99-9ed6-a2a445aefbb3-metrics-certs\") pod \"router-default-7b7cbdcdbb-cllk8\" (UID: \"961e6502-070a-4f99-9ed6-a2a445aefbb3\") " pod="openshift-ingress/router-default-7b7cbdcdbb-cllk8" Apr 22 17:37:15.144210 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:15.144119 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/961e6502-070a-4f99-9ed6-a2a445aefbb3-service-ca-bundle\") pod \"router-default-7b7cbdcdbb-cllk8\" (UID: \"961e6502-070a-4f99-9ed6-a2a445aefbb3\") " pod="openshift-ingress/router-default-7b7cbdcdbb-cllk8" Apr 22 17:37:15.144210 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:15.144196 2539 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 17:37:15.144307 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:15.144245 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/961e6502-070a-4f99-9ed6-a2a445aefbb3-metrics-certs podName:961e6502-070a-4f99-9ed6-a2a445aefbb3 nodeName:}" failed. No retries permitted until 2026-04-22 17:37:19.144232591 +0000 UTC m=+147.900317142 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/961e6502-070a-4f99-9ed6-a2a445aefbb3-metrics-certs") pod "router-default-7b7cbdcdbb-cllk8" (UID: "961e6502-070a-4f99-9ed6-a2a445aefbb3") : secret "router-metrics-certs-default" not found Apr 22 17:37:15.144307 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:15.144273 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/961e6502-070a-4f99-9ed6-a2a445aefbb3-service-ca-bundle podName:961e6502-070a-4f99-9ed6-a2a445aefbb3 nodeName:}" failed. No retries permitted until 2026-04-22 17:37:19.144252704 +0000 UTC m=+147.900337255 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/961e6502-070a-4f99-9ed6-a2a445aefbb3-service-ca-bundle") pod "router-default-7b7cbdcdbb-cllk8" (UID: "961e6502-070a-4f99-9ed6-a2a445aefbb3") : configmap references non-existent config key: service-ca.crt Apr 22 17:37:15.317360 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:15.317329 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bk65x" event={"ID":"26cdbfea-d567-4835-9ba0-b68b5e295c7a","Type":"ContainerStarted","Data":"1d857e5353b394e396e426a0f6febc7514b2508d25bf2b150d4348b0fe9995fb"} Apr 22 17:37:15.318731 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:15.318706 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-t45p4" event={"ID":"65f4e6af-46be-4594-b3fb-54dd4fac761b","Type":"ContainerStarted","Data":"b3a346e683256e4f9df9c048d56b4cdcca3d93ec4b78de61f1471b3c560f1522"} Apr 22 17:37:15.320112 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:15.320094 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lmv9z_c64def61-a93f-45f9-a89f-c469f37561b6/console-operator/0.log" Apr 22 
17:37:15.320214 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:15.320132 2539 generic.go:358] "Generic (PLEG): container finished" podID="c64def61-a93f-45f9-a89f-c469f37561b6" containerID="03556a006c6ea53956660955245f2b7ac27e4d0073daea779a82ad035ef8fb46" exitCode=255 Apr 22 17:37:15.320214 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:15.320201 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-lmv9z" event={"ID":"c64def61-a93f-45f9-a89f-c469f37561b6","Type":"ContainerDied","Data":"03556a006c6ea53956660955245f2b7ac27e4d0073daea779a82ad035ef8fb46"} Apr 22 17:37:15.320383 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:15.320367 2539 scope.go:117] "RemoveContainer" containerID="03556a006c6ea53956660955245f2b7ac27e4d0073daea779a82ad035ef8fb46" Apr 22 17:37:15.321544 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:15.321517 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gcd2q" event={"ID":"2f845b61-dcdc-4d78-b24e-77b96b4792a1","Type":"ContainerStarted","Data":"5a8d6856e762d67f240bd1c3f69d4c526b200aa4059b157b02121e4751791f96"} Apr 22 17:37:15.333061 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:15.333017 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bk65x" podStartSLOduration=1.382147966 podStartE2EDuration="4.333002962s" podCreationTimestamp="2026-04-22 17:37:11 +0000 UTC" firstStartedPulling="2026-04-22 17:37:11.728162707 +0000 UTC m=+140.484247263" lastFinishedPulling="2026-04-22 17:37:14.679017697 +0000 UTC m=+143.435102259" observedRunningTime="2026-04-22 17:37:15.332478099 +0000 UTC m=+144.088562673" watchObservedRunningTime="2026-04-22 17:37:15.333002962 +0000 UTC m=+144.089087539" Apr 22 17:37:15.350943 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:15.350885 2539 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-t45p4" podStartSLOduration=1.2812561279999999 podStartE2EDuration="4.350876991s" podCreationTimestamp="2026-04-22 17:37:11 +0000 UTC" firstStartedPulling="2026-04-22 17:37:11.609597644 +0000 UTC m=+140.365682198" lastFinishedPulling="2026-04-22 17:37:14.679218498 +0000 UTC m=+143.435303061" observedRunningTime="2026-04-22 17:37:15.350304589 +0000 UTC m=+144.106389170" watchObservedRunningTime="2026-04-22 17:37:15.350876991 +0000 UTC m=+144.106961564" Apr 22 17:37:15.432785 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:15.432747 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gcd2q" podStartSLOduration=1.488461636 podStartE2EDuration="4.432733173s" podCreationTimestamp="2026-04-22 17:37:11 +0000 UTC" firstStartedPulling="2026-04-22 17:37:11.741974506 +0000 UTC m=+140.498059061" lastFinishedPulling="2026-04-22 17:37:14.686246045 +0000 UTC m=+143.442330598" observedRunningTime="2026-04-22 17:37:15.432467872 +0000 UTC m=+144.188552448" watchObservedRunningTime="2026-04-22 17:37:15.432733173 +0000 UTC m=+144.188817740" Apr 22 17:37:16.325314 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:16.325286 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lmv9z_c64def61-a93f-45f9-a89f-c469f37561b6/console-operator/1.log" Apr 22 17:37:16.325750 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:16.325681 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lmv9z_c64def61-a93f-45f9-a89f-c469f37561b6/console-operator/0.log" Apr 22 17:37:16.325750 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:16.325716 2539 generic.go:358] "Generic (PLEG): container finished" 
podID="c64def61-a93f-45f9-a89f-c469f37561b6" containerID="bcbca4d38f1b8f4a17350189ec4a90dadfaef10d279df0dfb76d704133d536a7" exitCode=255 Apr 22 17:37:16.325861 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:16.325817 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-lmv9z" event={"ID":"c64def61-a93f-45f9-a89f-c469f37561b6","Type":"ContainerDied","Data":"bcbca4d38f1b8f4a17350189ec4a90dadfaef10d279df0dfb76d704133d536a7"} Apr 22 17:37:16.325930 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:16.325867 2539 scope.go:117] "RemoveContainer" containerID="03556a006c6ea53956660955245f2b7ac27e4d0073daea779a82ad035ef8fb46" Apr 22 17:37:16.326090 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:16.326075 2539 scope.go:117] "RemoveContainer" containerID="bcbca4d38f1b8f4a17350189ec4a90dadfaef10d279df0dfb76d704133d536a7" Apr 22 17:37:16.326298 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:16.326278 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-lmv9z_openshift-console-operator(c64def61-a93f-45f9-a89f-c469f37561b6)\"" pod="openshift-console-operator/console-operator-9d4b6777b-lmv9z" podUID="c64def61-a93f-45f9-a89f-c469f37561b6" Apr 22 17:37:17.330104 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:17.330078 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lmv9z_c64def61-a93f-45f9-a89f-c469f37561b6/console-operator/1.log" Apr 22 17:37:17.330561 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:17.330397 2539 scope.go:117] "RemoveContainer" containerID="bcbca4d38f1b8f4a17350189ec4a90dadfaef10d279df0dfb76d704133d536a7" Apr 22 17:37:17.330604 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:17.330569 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-lmv9z_openshift-console-operator(c64def61-a93f-45f9-a89f-c469f37561b6)\"" pod="openshift-console-operator/console-operator-9d4b6777b-lmv9z" podUID="c64def61-a93f-45f9-a89f-c469f37561b6" Apr 22 17:37:18.827244 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:18.827217 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-dm5wr_011dd50c-8a4c-425d-bb7b-86836dfc52f7/dns-node-resolver/0.log" Apr 22 17:37:18.974343 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:18.974287 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-registry-tls\") pod \"image-registry-6c98b6bc6c-j57rh\" (UID: \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\") " pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh" Apr 22 17:37:18.974542 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:18.974433 2539 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:37:18.974542 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:18.974452 2539 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c98b6bc6c-j57rh: secret "image-registry-tls" not found Apr 22 17:37:18.974542 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:18.974528 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-registry-tls podName:02ac9a15-99e8-4bc0-bb94-79c3df1bdde4 nodeName:}" failed. No retries permitted until 2026-04-22 17:37:26.974511297 +0000 UTC m=+155.730595847 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-registry-tls") pod "image-registry-6c98b6bc6c-j57rh" (UID: "02ac9a15-99e8-4bc0-bb94-79c3df1bdde4") : secret "image-registry-tls" not found Apr 22 17:37:19.075485 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:19.075455 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bd2c2ae-f38a-4267-8f67-bcab90c4db21-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fsgtm\" (UID: \"3bd2c2ae-f38a-4267-8f67-bcab90c4db21\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsgtm" Apr 22 17:37:19.075646 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:19.075627 2539 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 17:37:19.075700 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:19.075691 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bd2c2ae-f38a-4267-8f67-bcab90c4db21-samples-operator-tls podName:3bd2c2ae-f38a-4267-8f67-bcab90c4db21 nodeName:}" failed. No retries permitted until 2026-04-22 17:37:27.075676329 +0000 UTC m=+155.831760879 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3bd2c2ae-f38a-4267-8f67-bcab90c4db21-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fsgtm" (UID: "3bd2c2ae-f38a-4267-8f67-bcab90c4db21") : secret "samples-operator-tls" not found Apr 22 17:37:19.176491 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:19.176410 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/961e6502-070a-4f99-9ed6-a2a445aefbb3-metrics-certs\") pod \"router-default-7b7cbdcdbb-cllk8\" (UID: \"961e6502-070a-4f99-9ed6-a2a445aefbb3\") " pod="openshift-ingress/router-default-7b7cbdcdbb-cllk8" Apr 22 17:37:19.176491 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:19.176461 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/961e6502-070a-4f99-9ed6-a2a445aefbb3-service-ca-bundle\") pod \"router-default-7b7cbdcdbb-cllk8\" (UID: \"961e6502-070a-4f99-9ed6-a2a445aefbb3\") " pod="openshift-ingress/router-default-7b7cbdcdbb-cllk8" Apr 22 17:37:19.176673 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:19.176552 2539 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 17:37:19.176673 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:19.176615 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/961e6502-070a-4f99-9ed6-a2a445aefbb3-metrics-certs podName:961e6502-070a-4f99-9ed6-a2a445aefbb3 nodeName:}" failed. No retries permitted until 2026-04-22 17:37:27.176600342 +0000 UTC m=+155.932684894 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/961e6502-070a-4f99-9ed6-a2a445aefbb3-metrics-certs") pod "router-default-7b7cbdcdbb-cllk8" (UID: "961e6502-070a-4f99-9ed6-a2a445aefbb3") : secret "router-metrics-certs-default" not found Apr 22 17:37:19.176673 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:19.176629 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/961e6502-070a-4f99-9ed6-a2a445aefbb3-service-ca-bundle podName:961e6502-070a-4f99-9ed6-a2a445aefbb3 nodeName:}" failed. No retries permitted until 2026-04-22 17:37:27.176622967 +0000 UTC m=+155.932707518 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/961e6502-070a-4f99-9ed6-a2a445aefbb3-service-ca-bundle") pod "router-default-7b7cbdcdbb-cllk8" (UID: "961e6502-070a-4f99-9ed6-a2a445aefbb3") : configmap references non-existent config key: service-ca.crt Apr 22 17:37:19.626249 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:19.626218 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-gxs6f_ba45db53-69e8-40c3-8090-98739842b87e/node-ca/0.log" Apr 22 17:37:21.610348 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:21.610315 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-lmv9z" Apr 22 17:37:21.610348 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:21.610355 2539 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-lmv9z" Apr 22 17:37:21.610766 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:21.610691 2539 scope.go:117] "RemoveContainer" containerID="bcbca4d38f1b8f4a17350189ec4a90dadfaef10d279df0dfb76d704133d536a7" Apr 22 17:37:21.610865 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:21.610845 2539 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-lmv9z_openshift-console-operator(c64def61-a93f-45f9-a89f-c469f37561b6)\"" pod="openshift-console-operator/console-operator-9d4b6777b-lmv9z" podUID="c64def61-a93f-45f9-a89f-c469f37561b6"
Apr 22 17:37:26.615511 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:26.615471 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-9z52f" podUID="34451fd1-aa5f-4cd7-b1e2-0ef6344977dc"
Apr 22 17:37:26.629631 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:26.629599 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-s6r9h" podUID="04320b3d-bb85-4a09-ab7d-5fdc01962b73"
Apr 22 17:37:26.852681 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:26.852640 2539 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-8bxsz" podUID="a15342ff-f78f-4d33-aed1-0e9c86dbdb15"
Apr 22 17:37:27.038614 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:27.038583 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-registry-tls\") pod \"image-registry-6c98b6bc6c-j57rh\" (UID: \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\") " pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh"
Apr 22 17:37:27.040846 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:27.040824 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-registry-tls\") pod \"image-registry-6c98b6bc6c-j57rh\" (UID: \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\") " pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh"
Apr 22 17:37:27.098038 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:27.097996 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh"
Apr 22 17:37:27.139878 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:27.139833 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bd2c2ae-f38a-4267-8f67-bcab90c4db21-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fsgtm\" (UID: \"3bd2c2ae-f38a-4267-8f67-bcab90c4db21\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsgtm"
Apr 22 17:37:27.142642 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:27.142613 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bd2c2ae-f38a-4267-8f67-bcab90c4db21-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fsgtm\" (UID: \"3bd2c2ae-f38a-4267-8f67-bcab90c4db21\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsgtm"
Apr 22 17:37:27.204472 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:27.204437 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsgtm"
Apr 22 17:37:27.213418 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:27.213390 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6c98b6bc6c-j57rh"]
Apr 22 17:37:27.217097 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:37:27.217075 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02ac9a15_99e8_4bc0_bb94_79c3df1bdde4.slice/crio-f5f5d843190c30c7afe880422d38d322343608f0331a164ba68d950f6638588b WatchSource:0}: Error finding container f5f5d843190c30c7afe880422d38d322343608f0331a164ba68d950f6638588b: Status 404 returned error can't find the container with id f5f5d843190c30c7afe880422d38d322343608f0331a164ba68d950f6638588b
Apr 22 17:37:27.240917 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:27.240465 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/961e6502-070a-4f99-9ed6-a2a445aefbb3-metrics-certs\") pod \"router-default-7b7cbdcdbb-cllk8\" (UID: \"961e6502-070a-4f99-9ed6-a2a445aefbb3\") " pod="openshift-ingress/router-default-7b7cbdcdbb-cllk8"
Apr 22 17:37:27.240917 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:27.240526 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/961e6502-070a-4f99-9ed6-a2a445aefbb3-service-ca-bundle\") pod \"router-default-7b7cbdcdbb-cllk8\" (UID: \"961e6502-070a-4f99-9ed6-a2a445aefbb3\") " pod="openshift-ingress/router-default-7b7cbdcdbb-cllk8"
Apr 22 17:37:27.240917 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:27.240667 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/961e6502-070a-4f99-9ed6-a2a445aefbb3-service-ca-bundle podName:961e6502-070a-4f99-9ed6-a2a445aefbb3 nodeName:}" failed. No retries permitted until 2026-04-22 17:37:43.240645386 +0000 UTC m=+171.996729939 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/961e6502-070a-4f99-9ed6-a2a445aefbb3-service-ca-bundle") pod "router-default-7b7cbdcdbb-cllk8" (UID: "961e6502-070a-4f99-9ed6-a2a445aefbb3") : configmap references non-existent config key: service-ca.crt
Apr 22 17:37:27.240917 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:27.240806 2539 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 17:37:27.240917 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:27.240855 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/961e6502-070a-4f99-9ed6-a2a445aefbb3-metrics-certs podName:961e6502-070a-4f99-9ed6-a2a445aefbb3 nodeName:}" failed. No retries permitted until 2026-04-22 17:37:43.240842011 +0000 UTC m=+171.996926566 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/961e6502-070a-4f99-9ed6-a2a445aefbb3-metrics-certs") pod "router-default-7b7cbdcdbb-cllk8" (UID: "961e6502-070a-4f99-9ed6-a2a445aefbb3") : secret "router-metrics-certs-default" not found
Apr 22 17:37:27.320548 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:27.320472 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsgtm"]
Apr 22 17:37:27.354692 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:27.354661 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh" event={"ID":"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4","Type":"ContainerStarted","Data":"f02f551fc8b2648dd243e1e47390e03e39380ac696159e78e83d3343145d3625"}
Apr 22 17:37:27.354840 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:27.354697 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh" event={"ID":"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4","Type":"ContainerStarted","Data":"f5f5d843190c30c7afe880422d38d322343608f0331a164ba68d950f6638588b"}
Apr 22 17:37:27.354840 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:27.354745 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9z52f"
Apr 22 17:37:27.354986 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:27.354836 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh"
Apr 22 17:37:27.377244 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:27.377199 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh" podStartSLOduration=16.377184087 podStartE2EDuration="16.377184087s" podCreationTimestamp="2026-04-22 17:37:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:37:27.376505528 +0000 UTC m=+156.132590134" watchObservedRunningTime="2026-04-22 17:37:27.377184087 +0000 UTC m=+156.133268659"
Apr 22 17:37:28.358921 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:28.358863 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsgtm" event={"ID":"3bd2c2ae-f38a-4267-8f67-bcab90c4db21","Type":"ContainerStarted","Data":"2caab2d419432190a1e65e74002343bc10cc65c2da7a3bde2184ca34f29eedee"}
Apr 22 17:37:29.362918 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:29.362865 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsgtm" event={"ID":"3bd2c2ae-f38a-4267-8f67-bcab90c4db21","Type":"ContainerStarted","Data":"a1566158a0df93e76ddfe66cf8bd5e432dd3a228cca5f0285c8768aa0e40a44e"}
Apr 22 17:37:29.362918 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:29.362924 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsgtm" event={"ID":"3bd2c2ae-f38a-4267-8f67-bcab90c4db21","Type":"ContainerStarted","Data":"7f1fbccb096a4be6a7240d935b671e2e690626cefdcb7ecddef0b9514a458b56"}
Apr 22 17:37:29.379721 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:29.379674 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsgtm" podStartSLOduration=16.913180109 podStartE2EDuration="18.379659296s" podCreationTimestamp="2026-04-22 17:37:11 +0000 UTC" firstStartedPulling="2026-04-22 17:37:27.356926613 +0000 UTC m=+156.113011180" lastFinishedPulling="2026-04-22 17:37:28.823405807 +0000 UTC m=+157.579490367" observedRunningTime="2026-04-22 17:37:29.379350382 +0000 UTC m=+158.135434955" watchObservedRunningTime="2026-04-22 17:37:29.379659296 +0000 UTC m=+158.135743870"
Apr 22 17:37:31.577141 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:31.577104 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/04320b3d-bb85-4a09-ab7d-5fdc01962b73-cert\") pod \"ingress-canary-s6r9h\" (UID: \"04320b3d-bb85-4a09-ab7d-5fdc01962b73\") " pod="openshift-ingress-canary/ingress-canary-s6r9h"
Apr 22 17:37:31.577532 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:31.577199 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34451fd1-aa5f-4cd7-b1e2-0ef6344977dc-metrics-tls\") pod \"dns-default-9z52f\" (UID: \"34451fd1-aa5f-4cd7-b1e2-0ef6344977dc\") " pod="openshift-dns/dns-default-9z52f"
Apr 22 17:37:31.579528 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:31.579508 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34451fd1-aa5f-4cd7-b1e2-0ef6344977dc-metrics-tls\") pod \"dns-default-9z52f\" (UID: \"34451fd1-aa5f-4cd7-b1e2-0ef6344977dc\") " pod="openshift-dns/dns-default-9z52f"
Apr 22 17:37:31.579690 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:31.579671 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/04320b3d-bb85-4a09-ab7d-5fdc01962b73-cert\") pod \"ingress-canary-s6r9h\" (UID: \"04320b3d-bb85-4a09-ab7d-5fdc01962b73\") " pod="openshift-ingress-canary/ingress-canary-s6r9h"
Apr 22 17:37:31.845929 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:31.845844 2539 scope.go:117] "RemoveContainer" containerID="bcbca4d38f1b8f4a17350189ec4a90dadfaef10d279df0dfb76d704133d536a7"
Apr 22 17:37:31.858011 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:31.857986 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-m6rcd\""
Apr 22 17:37:31.866540 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:31.866521 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9z52f"
Apr 22 17:37:31.988769 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:31.988740 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9z52f"]
Apr 22 17:37:31.991767 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:37:31.991740 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34451fd1_aa5f_4cd7_b1e2_0ef6344977dc.slice/crio-c82d2404177c96f1f218039c75d423e7a36817ac6a844e0de186704c21157859 WatchSource:0}: Error finding container c82d2404177c96f1f218039c75d423e7a36817ac6a844e0de186704c21157859: Status 404 returned error can't find the container with id c82d2404177c96f1f218039c75d423e7a36817ac6a844e0de186704c21157859
Apr 22 17:37:32.375689 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:32.375659 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lmv9z_c64def61-a93f-45f9-a89f-c469f37561b6/console-operator/1.log"
Apr 22 17:37:32.375874 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:32.375758 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-lmv9z" event={"ID":"c64def61-a93f-45f9-a89f-c469f37561b6","Type":"ContainerStarted","Data":"5cf26c1a09bc933bfc6ce585099e3011434f2669a9f916f7789f41f7fa2cff04"}
Apr 22 17:37:32.376729 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:32.376216 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-lmv9z"
Apr 22 17:37:32.377085 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:32.377053 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9z52f" event={"ID":"34451fd1-aa5f-4cd7-b1e2-0ef6344977dc","Type":"ContainerStarted","Data":"c82d2404177c96f1f218039c75d423e7a36817ac6a844e0de186704c21157859"}
Apr 22 17:37:32.394632 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:32.394609 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-lmv9z"
Apr 22 17:37:32.396872 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:32.396825 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-lmv9z" podStartSLOduration=18.469618925 podStartE2EDuration="21.396813349s" podCreationTimestamp="2026-04-22 17:37:11 +0000 UTC" firstStartedPulling="2026-04-22 17:37:11.756421628 +0000 UTC m=+140.512506191" lastFinishedPulling="2026-04-22 17:37:14.68361605 +0000 UTC m=+143.439700615" observedRunningTime="2026-04-22 17:37:32.395111757 +0000 UTC m=+161.151196331" watchObservedRunningTime="2026-04-22 17:37:32.396813349 +0000 UTC m=+161.152897922"
Apr 22 17:37:34.384142 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:34.384103 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9z52f" event={"ID":"34451fd1-aa5f-4cd7-b1e2-0ef6344977dc","Type":"ContainerStarted","Data":"033f6d47f39466577c741397121215387aaf3be8a0272e5b2b1d08699e43a508"}
Apr 22 17:37:34.384519 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:34.384148 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9z52f" event={"ID":"34451fd1-aa5f-4cd7-b1e2-0ef6344977dc","Type":"ContainerStarted","Data":"57fa9ee43f650bfeb2906b7ecadf464426e1b17885cac0a9f3641810ca58db14"}
Apr 22 17:37:34.384519 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:34.384329 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-9z52f"
Apr 22 17:37:34.404303 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:34.404260 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-9z52f" podStartSLOduration=129.851696875 podStartE2EDuration="2m11.404247951s" podCreationTimestamp="2026-04-22 17:35:23 +0000 UTC" firstStartedPulling="2026-04-22 17:37:31.993558018 +0000 UTC m=+160.749642570" lastFinishedPulling="2026-04-22 17:37:33.546109081 +0000 UTC m=+162.302193646" observedRunningTime="2026-04-22 17:37:34.403579408 +0000 UTC m=+163.159663980" watchObservedRunningTime="2026-04-22 17:37:34.404247951 +0000 UTC m=+163.160332517"
Apr 22 17:37:39.844795 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:39.844757 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-s6r9h"
Apr 22 17:37:39.847728 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:39.847710 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dt854\""
Apr 22 17:37:39.855455 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:39.855438 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-s6r9h"
Apr 22 17:37:39.973491 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:39.973459 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-s6r9h"]
Apr 22 17:37:39.976701 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:37:39.976677 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04320b3d_bb85_4a09_ab7d_5fdc01962b73.slice/crio-aec212c12f46d47752eb573e3fc54765847b8990bf589087e31c95545c824ace WatchSource:0}: Error finding container aec212c12f46d47752eb573e3fc54765847b8990bf589087e31c95545c824ace: Status 404 returned error can't find the container with id aec212c12f46d47752eb573e3fc54765847b8990bf589087e31c95545c824ace
Apr 22 17:37:40.404622 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:40.404543 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-s6r9h" event={"ID":"04320b3d-bb85-4a09-ab7d-5fdc01962b73","Type":"ContainerStarted","Data":"aec212c12f46d47752eb573e3fc54765847b8990bf589087e31c95545c824ace"}
Apr 22 17:37:40.842194 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:40.842159 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bxsz"
Apr 22 17:37:41.911734 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:41.911662 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-whsmr"]
Apr 22 17:37:41.913659 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:41.913644 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-whsmr"
Apr 22 17:37:41.916072 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:41.916043 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 22 17:37:41.916183 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:41.916107 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 22 17:37:41.916183 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:41.916053 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-q9862\""
Apr 22 17:37:41.924580 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:41.924561 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-whsmr"]
Apr 22 17:37:41.934065 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:41.934045 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6c98b6bc6c-j57rh"]
Apr 22 17:37:41.949120 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:41.949088 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l66pn\" (UniqueName: \"kubernetes.io/projected/2e5aae8b-ae1d-46c1-a0ae-77cb6bbe014f-kube-api-access-l66pn\") pod \"downloads-6bcc868b7-whsmr\" (UID: \"2e5aae8b-ae1d-46c1-a0ae-77cb6bbe014f\") " pod="openshift-console/downloads-6bcc868b7-whsmr"
Apr 22 17:37:42.011704 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:42.011679 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-xqflj"]
Apr 22 17:37:42.013684 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:42.013669 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xqflj"
Apr 22 17:37:42.017434 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:42.017412 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-tt9tl\""
Apr 22 17:37:42.017558 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:42.017438 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 22 17:37:42.017558 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:42.017482 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 22 17:37:42.037015 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:42.036994 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xqflj"]
Apr 22 17:37:42.050282 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:42.050264 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ca32af5f-5277-45b3-86a5-1640613de6a4-data-volume\") pod \"insights-runtime-extractor-xqflj\" (UID: \"ca32af5f-5277-45b3-86a5-1640613de6a4\") " pod="openshift-insights/insights-runtime-extractor-xqflj"
Apr 22 17:37:42.050380 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:42.050298 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ca32af5f-5277-45b3-86a5-1640613de6a4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xqflj\" (UID: \"ca32af5f-5277-45b3-86a5-1640613de6a4\") " pod="openshift-insights/insights-runtime-extractor-xqflj"
Apr 22 17:37:42.050380 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:42.050316 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ca32af5f-5277-45b3-86a5-1640613de6a4-crio-socket\") pod \"insights-runtime-extractor-xqflj\" (UID: \"ca32af5f-5277-45b3-86a5-1640613de6a4\") " pod="openshift-insights/insights-runtime-extractor-xqflj"
Apr 22 17:37:42.050380 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:42.050336 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l66pn\" (UniqueName: \"kubernetes.io/projected/2e5aae8b-ae1d-46c1-a0ae-77cb6bbe014f-kube-api-access-l66pn\") pod \"downloads-6bcc868b7-whsmr\" (UID: \"2e5aae8b-ae1d-46c1-a0ae-77cb6bbe014f\") " pod="openshift-console/downloads-6bcc868b7-whsmr"
Apr 22 17:37:42.050549 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:42.050410 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ca32af5f-5277-45b3-86a5-1640613de6a4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xqflj\" (UID: \"ca32af5f-5277-45b3-86a5-1640613de6a4\") " pod="openshift-insights/insights-runtime-extractor-xqflj"
Apr 22 17:37:42.050549 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:42.050480 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbg84\" (UniqueName: \"kubernetes.io/projected/ca32af5f-5277-45b3-86a5-1640613de6a4-kube-api-access-sbg84\") pod \"insights-runtime-extractor-xqflj\" (UID: \"ca32af5f-5277-45b3-86a5-1640613de6a4\") " pod="openshift-insights/insights-runtime-extractor-xqflj"
Apr 22 17:37:42.091839 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:42.091817 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l66pn\" (UniqueName: \"kubernetes.io/projected/2e5aae8b-ae1d-46c1-a0ae-77cb6bbe014f-kube-api-access-l66pn\") pod \"downloads-6bcc868b7-whsmr\" (UID: \"2e5aae8b-ae1d-46c1-a0ae-77cb6bbe014f\") " pod="openshift-console/downloads-6bcc868b7-whsmr"
Apr 22 17:37:42.151795 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:42.151777 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ca32af5f-5277-45b3-86a5-1640613de6a4-data-volume\") pod \"insights-runtime-extractor-xqflj\" (UID: \"ca32af5f-5277-45b3-86a5-1640613de6a4\") " pod="openshift-insights/insights-runtime-extractor-xqflj"
Apr 22 17:37:42.151929 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:42.151818 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ca32af5f-5277-45b3-86a5-1640613de6a4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xqflj\" (UID: \"ca32af5f-5277-45b3-86a5-1640613de6a4\") " pod="openshift-insights/insights-runtime-extractor-xqflj"
Apr 22 17:37:42.151929 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:42.151846 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ca32af5f-5277-45b3-86a5-1640613de6a4-crio-socket\") pod \"insights-runtime-extractor-xqflj\" (UID: \"ca32af5f-5277-45b3-86a5-1640613de6a4\") " pod="openshift-insights/insights-runtime-extractor-xqflj"
Apr 22 17:37:42.151929 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:42.151888 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ca32af5f-5277-45b3-86a5-1640613de6a4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xqflj\" (UID: \"ca32af5f-5277-45b3-86a5-1640613de6a4\") " pod="openshift-insights/insights-runtime-extractor-xqflj"
Apr 22 17:37:42.152095 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:42.151993 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ca32af5f-5277-45b3-86a5-1640613de6a4-crio-socket\") pod \"insights-runtime-extractor-xqflj\" (UID: \"ca32af5f-5277-45b3-86a5-1640613de6a4\") " pod="openshift-insights/insights-runtime-extractor-xqflj"
Apr 22 17:37:42.152095 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:42.152047 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sbg84\" (UniqueName: \"kubernetes.io/projected/ca32af5f-5277-45b3-86a5-1640613de6a4-kube-api-access-sbg84\") pod \"insights-runtime-extractor-xqflj\" (UID: \"ca32af5f-5277-45b3-86a5-1640613de6a4\") " pod="openshift-insights/insights-runtime-extractor-xqflj"
Apr 22 17:37:42.152186 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:42.152123 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ca32af5f-5277-45b3-86a5-1640613de6a4-data-volume\") pod \"insights-runtime-extractor-xqflj\" (UID: \"ca32af5f-5277-45b3-86a5-1640613de6a4\") " pod="openshift-insights/insights-runtime-extractor-xqflj"
Apr 22 17:37:42.152385 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:42.152368 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ca32af5f-5277-45b3-86a5-1640613de6a4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xqflj\" (UID: \"ca32af5f-5277-45b3-86a5-1640613de6a4\") " pod="openshift-insights/insights-runtime-extractor-xqflj"
Apr 22 17:37:42.153985 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:42.153967 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ca32af5f-5277-45b3-86a5-1640613de6a4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xqflj\" (UID: \"ca32af5f-5277-45b3-86a5-1640613de6a4\") " pod="openshift-insights/insights-runtime-extractor-xqflj"
Apr 22 17:37:42.160492 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:42.160475 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbg84\" (UniqueName: \"kubernetes.io/projected/ca32af5f-5277-45b3-86a5-1640613de6a4-kube-api-access-sbg84\") pod \"insights-runtime-extractor-xqflj\" (UID: \"ca32af5f-5277-45b3-86a5-1640613de6a4\") " pod="openshift-insights/insights-runtime-extractor-xqflj"
Apr 22 17:37:42.223021 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:42.223005 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-whsmr"
Apr 22 17:37:42.322278 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:42.322255 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xqflj"
Apr 22 17:37:42.335933 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:42.335891 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-whsmr"]
Apr 22 17:37:42.341595 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:37:42.341564 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e5aae8b_ae1d_46c1_a0ae_77cb6bbe014f.slice/crio-222f1982373f212b16cf4f76a7cd49af308ceaf766169f7b90d7ab01a2c98c82 WatchSource:0}: Error finding container 222f1982373f212b16cf4f76a7cd49af308ceaf766169f7b90d7ab01a2c98c82: Status 404 returned error can't find the container with id 222f1982373f212b16cf4f76a7cd49af308ceaf766169f7b90d7ab01a2c98c82
Apr 22 17:37:42.411063 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:42.411031 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-s6r9h" event={"ID":"04320b3d-bb85-4a09-ab7d-5fdc01962b73","Type":"ContainerStarted","Data":"7c309df8a0c1ef9aa8502b6b38c5df43361dd875f4114f197931222fec8605d3"}
Apr 22 17:37:42.412175 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:42.412152 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-whsmr" event={"ID":"2e5aae8b-ae1d-46c1-a0ae-77cb6bbe014f","Type":"ContainerStarted","Data":"222f1982373f212b16cf4f76a7cd49af308ceaf766169f7b90d7ab01a2c98c82"}
Apr 22 17:37:42.427338 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:42.427299 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-s6r9h" podStartSLOduration=137.922130978 podStartE2EDuration="2m19.427285759s" podCreationTimestamp="2026-04-22 17:35:23 +0000 UTC" firstStartedPulling="2026-04-22 17:37:39.978481754 +0000 UTC m=+168.734566305" lastFinishedPulling="2026-04-22 17:37:41.483636535 +0000 UTC m=+170.239721086" observedRunningTime="2026-04-22 17:37:42.427147162 +0000 UTC m=+171.183231735" watchObservedRunningTime="2026-04-22 17:37:42.427285759 +0000 UTC m=+171.183370335"
Apr 22 17:37:42.435181 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:42.435157 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xqflj"]
Apr 22 17:37:42.438820 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:37:42.438796 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca32af5f_5277_45b3_86a5_1640613de6a4.slice/crio-b95903ece02c4e995e6f48ff011f1ae966569f9bda3662de2a7ad6d9ef1ab879 WatchSource:0}: Error finding container b95903ece02c4e995e6f48ff011f1ae966569f9bda3662de2a7ad6d9ef1ab879: Status 404 returned error can't find the container with id b95903ece02c4e995e6f48ff011f1ae966569f9bda3662de2a7ad6d9ef1ab879
Apr 22 17:37:43.261465 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:43.261420 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/961e6502-070a-4f99-9ed6-a2a445aefbb3-metrics-certs\") pod \"router-default-7b7cbdcdbb-cllk8\" (UID: \"961e6502-070a-4f99-9ed6-a2a445aefbb3\") " pod="openshift-ingress/router-default-7b7cbdcdbb-cllk8"
Apr 22 17:37:43.261929 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:43.261489 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/961e6502-070a-4f99-9ed6-a2a445aefbb3-service-ca-bundle\") pod \"router-default-7b7cbdcdbb-cllk8\" (UID: \"961e6502-070a-4f99-9ed6-a2a445aefbb3\") " pod="openshift-ingress/router-default-7b7cbdcdbb-cllk8"
Apr 22 17:37:43.262328 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:43.262302 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/961e6502-070a-4f99-9ed6-a2a445aefbb3-service-ca-bundle\") pod \"router-default-7b7cbdcdbb-cllk8\" (UID: \"961e6502-070a-4f99-9ed6-a2a445aefbb3\") " pod="openshift-ingress/router-default-7b7cbdcdbb-cllk8"
Apr 22 17:37:43.264433 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:43.264393 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/961e6502-070a-4f99-9ed6-a2a445aefbb3-metrics-certs\") pod \"router-default-7b7cbdcdbb-cllk8\" (UID: \"961e6502-070a-4f99-9ed6-a2a445aefbb3\") " pod="openshift-ingress/router-default-7b7cbdcdbb-cllk8"
Apr 22 17:37:43.416788 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:43.416688 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xqflj" event={"ID":"ca32af5f-5277-45b3-86a5-1640613de6a4","Type":"ContainerStarted","Data":"ddac91161ea11f28931b658896c488b51d9f9d002553bf6eaba0f3a4756c5bae"}
Apr 22 17:37:43.416788 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:43.416742 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xqflj" event={"ID":"ca32af5f-5277-45b3-86a5-1640613de6a4","Type":"ContainerStarted","Data":"cf971ced6417e12cd077b3346fe4d582ad03a66b8978981e63e4b09719baf01c"}
Apr 22 17:37:43.416788 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:43.416755 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xqflj" event={"ID":"ca32af5f-5277-45b3-86a5-1640613de6a4","Type":"ContainerStarted","Data":"b95903ece02c4e995e6f48ff011f1ae966569f9bda3662de2a7ad6d9ef1ab879"}
Apr 22 17:37:43.493182 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:43.493144 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7b7cbdcdbb-cllk8"
Apr 22 17:37:43.641423 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:43.641362 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7b7cbdcdbb-cllk8"]
Apr 22 17:37:43.645599 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:37:43.644952 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod961e6502_070a_4f99_9ed6_a2a445aefbb3.slice/crio-6eeb651b00facbf5443b87046b139b7f83b4d32fbd73eea77d9e4e7d27a0edb3 WatchSource:0}: Error finding container 6eeb651b00facbf5443b87046b139b7f83b4d32fbd73eea77d9e4e7d27a0edb3: Status 404 returned error can't find the container with id 6eeb651b00facbf5443b87046b139b7f83b4d32fbd73eea77d9e4e7d27a0edb3
Apr 22 17:37:44.391244 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:44.391156 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-9z52f"
Apr 22 17:37:44.422227 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:44.422194 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7b7cbdcdbb-cllk8" event={"ID":"961e6502-070a-4f99-9ed6-a2a445aefbb3","Type":"ContainerStarted","Data":"458ecdbc1f63dbc8d6f297218b9c67355ffe89e82df7f24ce38f34ae84f728f8"}
Apr 22 17:37:44.422374 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:44.422237 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7b7cbdcdbb-cllk8" event={"ID":"961e6502-070a-4f99-9ed6-a2a445aefbb3","Type":"ContainerStarted","Data":"6eeb651b00facbf5443b87046b139b7f83b4d32fbd73eea77d9e4e7d27a0edb3"}
Apr 22 17:37:44.493614 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:44.493582 2539 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7b7cbdcdbb-cllk8"
Apr 22 17:37:44.496622 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:44.496598 2539 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-7b7cbdcdbb-cllk8"
Apr 22 17:37:44.516566 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:44.516511 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-7b7cbdcdbb-cllk8" podStartSLOduration=33.516497195 podStartE2EDuration="33.516497195s" podCreationTimestamp="2026-04-22 17:37:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:37:44.443895673 +0000 UTC m=+173.199980248" watchObservedRunningTime="2026-04-22 17:37:44.516497195 +0000 UTC m=+173.272581768"
Apr 22 17:37:45.426273 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:45.426239 2539 generic.go:358] "Generic (PLEG): container finished" podID="ed3533db-96b9-4be4-aada-e5abba0de7b6" containerID="10561562a24b32cc0c1759c65351d93c84b93ef85a3e7ea9fd5b27ae4de5c889" exitCode=255
Apr 22 17:37:45.426752 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:45.426310 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65557956df-xtp2r" event={"ID":"ed3533db-96b9-4be4-aada-e5abba0de7b6","Type":"ContainerDied","Data":"10561562a24b32cc0c1759c65351d93c84b93ef85a3e7ea9fd5b27ae4de5c889"}
Apr 22 17:37:45.429859 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:45.429712 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xqflj" event={"ID":"ca32af5f-5277-45b3-86a5-1640613de6a4","Type":"ContainerStarted","Data":"7d3941639a9627d91564537759a3796dee05f3f1ecad5970fb3962336ef124e5"}
Apr 22 17:37:45.430093 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:45.430073 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-7b7cbdcdbb-cllk8"
Apr 22 17:37:45.431476 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:45.431454 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-7b7cbdcdbb-cllk8"
Apr 22 17:37:45.434824 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:45.434798 2539 scope.go:117] "RemoveContainer" containerID="10561562a24b32cc0c1759c65351d93c84b93ef85a3e7ea9fd5b27ae4de5c889"
Apr 22 17:37:45.493277 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:45.493229 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-xqflj" podStartSLOduration=2.448696479 podStartE2EDuration="4.493213447s" podCreationTimestamp="2026-04-22 17:37:41 +0000 UTC" firstStartedPulling="2026-04-22 17:37:42.4862443 +0000 UTC m=+171.242328851" lastFinishedPulling="2026-04-22 17:37:44.530761264 +0000 UTC m=+173.286845819" observedRunningTime="2026-04-22 17:37:45.492773359 +0000 UTC m=+174.248857934" watchObservedRunningTime="2026-04-22 17:37:45.493213447 +0000 UTC m=+174.249298020"
Apr 22 17:37:46.433710 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:46.433668 2539 kubelet.go:2569] "SyncLoop (PLEG): event for
pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65557956df-xtp2r" event={"ID":"ed3533db-96b9-4be4-aada-e5abba0de7b6","Type":"ContainerStarted","Data":"90b04956d3847024c4d889827c65d0173de090a6b6e24d7810fb6794763cd130"} Apr 22 17:37:51.939915 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:51.939869 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh" Apr 22 17:37:52.899761 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:52.899719 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-6mhcz"] Apr 22 17:37:52.902892 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:52.902865 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-7xrr6"] Apr 22 17:37:52.903056 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:52.903034 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-6mhcz" Apr 22 17:37:52.908681 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:52.908659 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 22 17:37:52.909186 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:52.908820 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 22 17:37:52.909758 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:52.908860 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-2j5ws\"" Apr 22 17:37:52.909758 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:52.908887 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 17:37:52.909758 
ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:52.908983 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 17:37:52.909758 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:52.909729 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-7xrr6" Apr 22 17:37:52.909758 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:52.909081 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 17:37:52.910091 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:52.909111 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 22 17:37:52.913172 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:52.913041 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 17:37:52.913172 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:52.913054 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 17:37:52.913172 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:52.913085 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-8rhq9\"" Apr 22 17:37:52.913554 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:52.913530 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-6mhcz"] Apr 22 17:37:52.913639 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:52.913561 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 17:37:52.945418 ip-10-0-132-165 kubenswrapper[2539]: 
I0422 17:37:52.945381 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/715f3b4b-86d6-4cef-adfe-80103d666f2e-node-exporter-textfile\") pod \"node-exporter-7xrr6\" (UID: \"715f3b4b-86d6-4cef-adfe-80103d666f2e\") " pod="openshift-monitoring/node-exporter-7xrr6" Apr 22 17:37:52.945766 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:52.945420 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/715f3b4b-86d6-4cef-adfe-80103d666f2e-metrics-client-ca\") pod \"node-exporter-7xrr6\" (UID: \"715f3b4b-86d6-4cef-adfe-80103d666f2e\") " pod="openshift-monitoring/node-exporter-7xrr6" Apr 22 17:37:52.945766 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:52.945448 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/715f3b4b-86d6-4cef-adfe-80103d666f2e-sys\") pod \"node-exporter-7xrr6\" (UID: \"715f3b4b-86d6-4cef-adfe-80103d666f2e\") " pod="openshift-monitoring/node-exporter-7xrr6" Apr 22 17:37:52.945766 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:52.945517 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/95eeecd5-1c23-4cde-b52e-0531263ac4bd-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-6mhcz\" (UID: \"95eeecd5-1c23-4cde-b52e-0531263ac4bd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6mhcz" Apr 22 17:37:52.945766 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:52.945557 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/715f3b4b-86d6-4cef-adfe-80103d666f2e-node-exporter-tls\") pod \"node-exporter-7xrr6\" (UID: \"715f3b4b-86d6-4cef-adfe-80103d666f2e\") " pod="openshift-monitoring/node-exporter-7xrr6" Apr 22 17:37:52.945766 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:52.945597 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb7sb\" (UniqueName: \"kubernetes.io/projected/95eeecd5-1c23-4cde-b52e-0531263ac4bd-kube-api-access-mb7sb\") pod \"kube-state-metrics-69db897b98-6mhcz\" (UID: \"95eeecd5-1c23-4cde-b52e-0531263ac4bd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6mhcz" Apr 22 17:37:52.945766 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:52.945626 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/715f3b4b-86d6-4cef-adfe-80103d666f2e-node-exporter-accelerators-collector-config\") pod \"node-exporter-7xrr6\" (UID: \"715f3b4b-86d6-4cef-adfe-80103d666f2e\") " pod="openshift-monitoring/node-exporter-7xrr6" Apr 22 17:37:52.945766 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:52.945694 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/95eeecd5-1c23-4cde-b52e-0531263ac4bd-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-6mhcz\" (UID: \"95eeecd5-1c23-4cde-b52e-0531263ac4bd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6mhcz" Apr 22 17:37:52.945766 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:52.945742 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/95eeecd5-1c23-4cde-b52e-0531263ac4bd-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-6mhcz\" (UID: 
\"95eeecd5-1c23-4cde-b52e-0531263ac4bd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6mhcz" Apr 22 17:37:52.946181 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:52.945783 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/715f3b4b-86d6-4cef-adfe-80103d666f2e-root\") pod \"node-exporter-7xrr6\" (UID: \"715f3b4b-86d6-4cef-adfe-80103d666f2e\") " pod="openshift-monitoring/node-exporter-7xrr6" Apr 22 17:37:52.946181 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:52.945817 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/715f3b4b-86d6-4cef-adfe-80103d666f2e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7xrr6\" (UID: \"715f3b4b-86d6-4cef-adfe-80103d666f2e\") " pod="openshift-monitoring/node-exporter-7xrr6" Apr 22 17:37:52.946181 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:52.945848 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm2g7\" (UniqueName: \"kubernetes.io/projected/715f3b4b-86d6-4cef-adfe-80103d666f2e-kube-api-access-fm2g7\") pod \"node-exporter-7xrr6\" (UID: \"715f3b4b-86d6-4cef-adfe-80103d666f2e\") " pod="openshift-monitoring/node-exporter-7xrr6" Apr 22 17:37:52.946181 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:52.945877 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/95eeecd5-1c23-4cde-b52e-0531263ac4bd-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-6mhcz\" (UID: \"95eeecd5-1c23-4cde-b52e-0531263ac4bd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6mhcz" Apr 22 17:37:52.946181 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:52.945940 2539 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/715f3b4b-86d6-4cef-adfe-80103d666f2e-node-exporter-wtmp\") pod \"node-exporter-7xrr6\" (UID: \"715f3b4b-86d6-4cef-adfe-80103d666f2e\") " pod="openshift-monitoring/node-exporter-7xrr6" Apr 22 17:37:52.946181 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:52.945974 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/95eeecd5-1c23-4cde-b52e-0531263ac4bd-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-6mhcz\" (UID: \"95eeecd5-1c23-4cde-b52e-0531263ac4bd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6mhcz" Apr 22 17:37:53.046933 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.046889 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/95eeecd5-1c23-4cde-b52e-0531263ac4bd-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-6mhcz\" (UID: \"95eeecd5-1c23-4cde-b52e-0531263ac4bd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6mhcz" Apr 22 17:37:53.047106 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.046944 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/715f3b4b-86d6-4cef-adfe-80103d666f2e-node-exporter-tls\") pod \"node-exporter-7xrr6\" (UID: \"715f3b4b-86d6-4cef-adfe-80103d666f2e\") " pod="openshift-monitoring/node-exporter-7xrr6" Apr 22 17:37:53.047106 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.046977 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mb7sb\" (UniqueName: 
\"kubernetes.io/projected/95eeecd5-1c23-4cde-b52e-0531263ac4bd-kube-api-access-mb7sb\") pod \"kube-state-metrics-69db897b98-6mhcz\" (UID: \"95eeecd5-1c23-4cde-b52e-0531263ac4bd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6mhcz" Apr 22 17:37:53.047106 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.046999 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/715f3b4b-86d6-4cef-adfe-80103d666f2e-node-exporter-accelerators-collector-config\") pod \"node-exporter-7xrr6\" (UID: \"715f3b4b-86d6-4cef-adfe-80103d666f2e\") " pod="openshift-monitoring/node-exporter-7xrr6" Apr 22 17:37:53.047106 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.047025 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/95eeecd5-1c23-4cde-b52e-0531263ac4bd-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-6mhcz\" (UID: \"95eeecd5-1c23-4cde-b52e-0531263ac4bd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6mhcz" Apr 22 17:37:53.047106 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.047054 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/95eeecd5-1c23-4cde-b52e-0531263ac4bd-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-6mhcz\" (UID: \"95eeecd5-1c23-4cde-b52e-0531263ac4bd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6mhcz" Apr 22 17:37:53.047106 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.047086 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/715f3b4b-86d6-4cef-adfe-80103d666f2e-root\") pod \"node-exporter-7xrr6\" (UID: \"715f3b4b-86d6-4cef-adfe-80103d666f2e\") " pod="openshift-monitoring/node-exporter-7xrr6" Apr 22 17:37:53.047447 
ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.047117 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/715f3b4b-86d6-4cef-adfe-80103d666f2e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7xrr6\" (UID: \"715f3b4b-86d6-4cef-adfe-80103d666f2e\") " pod="openshift-monitoring/node-exporter-7xrr6" Apr 22 17:37:53.047447 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.047148 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fm2g7\" (UniqueName: \"kubernetes.io/projected/715f3b4b-86d6-4cef-adfe-80103d666f2e-kube-api-access-fm2g7\") pod \"node-exporter-7xrr6\" (UID: \"715f3b4b-86d6-4cef-adfe-80103d666f2e\") " pod="openshift-monitoring/node-exporter-7xrr6" Apr 22 17:37:53.047447 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.047178 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/95eeecd5-1c23-4cde-b52e-0531263ac4bd-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-6mhcz\" (UID: \"95eeecd5-1c23-4cde-b52e-0531263ac4bd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6mhcz" Apr 22 17:37:53.047447 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.047209 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/715f3b4b-86d6-4cef-adfe-80103d666f2e-node-exporter-wtmp\") pod \"node-exporter-7xrr6\" (UID: \"715f3b4b-86d6-4cef-adfe-80103d666f2e\") " pod="openshift-monitoring/node-exporter-7xrr6" Apr 22 17:37:53.047447 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.047239 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: 
\"kubernetes.io/configmap/95eeecd5-1c23-4cde-b52e-0531263ac4bd-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-6mhcz\" (UID: \"95eeecd5-1c23-4cde-b52e-0531263ac4bd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6mhcz" Apr 22 17:37:53.047447 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.047289 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/715f3b4b-86d6-4cef-adfe-80103d666f2e-node-exporter-textfile\") pod \"node-exporter-7xrr6\" (UID: \"715f3b4b-86d6-4cef-adfe-80103d666f2e\") " pod="openshift-monitoring/node-exporter-7xrr6" Apr 22 17:37:53.047447 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.047313 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/715f3b4b-86d6-4cef-adfe-80103d666f2e-metrics-client-ca\") pod \"node-exporter-7xrr6\" (UID: \"715f3b4b-86d6-4cef-adfe-80103d666f2e\") " pod="openshift-monitoring/node-exporter-7xrr6" Apr 22 17:37:53.047767 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.047588 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/715f3b4b-86d6-4cef-adfe-80103d666f2e-sys\") pod \"node-exporter-7xrr6\" (UID: \"715f3b4b-86d6-4cef-adfe-80103d666f2e\") " pod="openshift-monitoring/node-exporter-7xrr6" Apr 22 17:37:53.047767 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.047698 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/715f3b4b-86d6-4cef-adfe-80103d666f2e-sys\") pod \"node-exporter-7xrr6\" (UID: \"715f3b4b-86d6-4cef-adfe-80103d666f2e\") " pod="openshift-monitoring/node-exporter-7xrr6" Apr 22 17:37:53.047892 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:53.047782 2539 secret.go:189] Couldn't get secret 
openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 22 17:37:53.047892 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:53.047847 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95eeecd5-1c23-4cde-b52e-0531263ac4bd-kube-state-metrics-tls podName:95eeecd5-1c23-4cde-b52e-0531263ac4bd nodeName:}" failed. No retries permitted until 2026-04-22 17:37:53.547827325 +0000 UTC m=+182.303911949 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/95eeecd5-1c23-4cde-b52e-0531263ac4bd-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-6mhcz" (UID: "95eeecd5-1c23-4cde-b52e-0531263ac4bd") : secret "kube-state-metrics-tls" not found Apr 22 17:37:53.048055 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.048032 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/715f3b4b-86d6-4cef-adfe-80103d666f2e-root\") pod \"node-exporter-7xrr6\" (UID: \"715f3b4b-86d6-4cef-adfe-80103d666f2e\") " pod="openshift-monitoring/node-exporter-7xrr6" Apr 22 17:37:53.048548 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.048132 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/95eeecd5-1c23-4cde-b52e-0531263ac4bd-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-6mhcz\" (UID: \"95eeecd5-1c23-4cde-b52e-0531263ac4bd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6mhcz" Apr 22 17:37:53.048548 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.048138 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/95eeecd5-1c23-4cde-b52e-0531263ac4bd-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-6mhcz\" (UID: \"95eeecd5-1c23-4cde-b52e-0531263ac4bd\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-6mhcz" Apr 22 17:37:53.048548 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.048326 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/715f3b4b-86d6-4cef-adfe-80103d666f2e-node-exporter-wtmp\") pod \"node-exporter-7xrr6\" (UID: \"715f3b4b-86d6-4cef-adfe-80103d666f2e\") " pod="openshift-monitoring/node-exporter-7xrr6" Apr 22 17:37:53.048814 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.048662 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/95eeecd5-1c23-4cde-b52e-0531263ac4bd-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-6mhcz\" (UID: \"95eeecd5-1c23-4cde-b52e-0531263ac4bd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6mhcz" Apr 22 17:37:53.048814 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.048716 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/715f3b4b-86d6-4cef-adfe-80103d666f2e-metrics-client-ca\") pod \"node-exporter-7xrr6\" (UID: \"715f3b4b-86d6-4cef-adfe-80103d666f2e\") " pod="openshift-monitoring/node-exporter-7xrr6" Apr 22 17:37:53.048814 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.048731 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/715f3b4b-86d6-4cef-adfe-80103d666f2e-node-exporter-accelerators-collector-config\") pod \"node-exporter-7xrr6\" (UID: \"715f3b4b-86d6-4cef-adfe-80103d666f2e\") " pod="openshift-monitoring/node-exporter-7xrr6" Apr 22 17:37:53.049006 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.048849 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" 
(UniqueName: \"kubernetes.io/empty-dir/715f3b4b-86d6-4cef-adfe-80103d666f2e-node-exporter-textfile\") pod \"node-exporter-7xrr6\" (UID: \"715f3b4b-86d6-4cef-adfe-80103d666f2e\") " pod="openshift-monitoring/node-exporter-7xrr6" Apr 22 17:37:53.050143 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.050120 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/95eeecd5-1c23-4cde-b52e-0531263ac4bd-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-6mhcz\" (UID: \"95eeecd5-1c23-4cde-b52e-0531263ac4bd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6mhcz" Apr 22 17:37:53.050369 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.050347 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/715f3b4b-86d6-4cef-adfe-80103d666f2e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7xrr6\" (UID: \"715f3b4b-86d6-4cef-adfe-80103d666f2e\") " pod="openshift-monitoring/node-exporter-7xrr6" Apr 22 17:37:53.050872 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.050828 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/715f3b4b-86d6-4cef-adfe-80103d666f2e-node-exporter-tls\") pod \"node-exporter-7xrr6\" (UID: \"715f3b4b-86d6-4cef-adfe-80103d666f2e\") " pod="openshift-monitoring/node-exporter-7xrr6" Apr 22 17:37:53.058866 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.058836 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb7sb\" (UniqueName: \"kubernetes.io/projected/95eeecd5-1c23-4cde-b52e-0531263ac4bd-kube-api-access-mb7sb\") pod \"kube-state-metrics-69db897b98-6mhcz\" (UID: \"95eeecd5-1c23-4cde-b52e-0531263ac4bd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6mhcz" Apr 22 17:37:53.059023 
ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.058986 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm2g7\" (UniqueName: \"kubernetes.io/projected/715f3b4b-86d6-4cef-adfe-80103d666f2e-kube-api-access-fm2g7\") pod \"node-exporter-7xrr6\" (UID: \"715f3b4b-86d6-4cef-adfe-80103d666f2e\") " pod="openshift-monitoring/node-exporter-7xrr6" Apr 22 17:37:53.227507 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.227470 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-7xrr6" Apr 22 17:37:53.552567 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.552482 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/95eeecd5-1c23-4cde-b52e-0531263ac4bd-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-6mhcz\" (UID: \"95eeecd5-1c23-4cde-b52e-0531263ac4bd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6mhcz" Apr 22 17:37:53.555281 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.555252 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/95eeecd5-1c23-4cde-b52e-0531263ac4bd-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-6mhcz\" (UID: \"95eeecd5-1c23-4cde-b52e-0531263ac4bd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6mhcz" Apr 22 17:37:53.822546 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.822455 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-6mhcz"
Apr 22 17:37:53.976803 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.976765 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 17:37:53.979773 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.979753 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:53.982226 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.982197 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 22 17:37:53.982226 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.982222 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 22 17:37:53.982386 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.982197 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 22 17:37:53.982386 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.982206 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 22 17:37:53.982516 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.982494 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 22 17:37:53.982638 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.982565 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 22 17:37:53.982638 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.982577 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 22 17:37:53.982763 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.982729 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 22 17:37:53.982819 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.982791 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 22 17:37:53.982872 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.982863 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-vln98\""
Apr 22 17:37:53.993604 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:53.993580 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 17:37:54.057411 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.057368 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9605e64-41c6-4052-ad0f-04c386fbe607-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.057589 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.057413 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.057589 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.057455 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.057589 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.057481 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7klj\" (UniqueName: \"kubernetes.io/projected/d9605e64-41c6-4052-ad0f-04c386fbe607-kube-api-access-f7klj\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.057589 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.057509 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.057589 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.057562 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d9605e64-41c6-4052-ad0f-04c386fbe607-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.057838 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.057607 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-web-config\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.057838 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.057652 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.057838 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.057686 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-config-volume\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.057838 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.057736 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d9605e64-41c6-4052-ad0f-04c386fbe607-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.057838 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.057771 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d9605e64-41c6-4052-ad0f-04c386fbe607-config-out\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.057838 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.057791 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.057838 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.057833 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d9605e64-41c6-4052-ad0f-04c386fbe607-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.158612 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.158521 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d9605e64-41c6-4052-ad0f-04c386fbe607-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.158612 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.158591 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-web-config\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.158843 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.158646 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.158843 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.158683 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-config-volume\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.158843 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.158713 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d9605e64-41c6-4052-ad0f-04c386fbe607-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.158843 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.158737 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d9605e64-41c6-4052-ad0f-04c386fbe607-config-out\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.159058 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.158879 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.159058 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.158949 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d9605e64-41c6-4052-ad0f-04c386fbe607-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.159058 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.158992 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9605e64-41c6-4052-ad0f-04c386fbe607-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.159058 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.159011 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d9605e64-41c6-4052-ad0f-04c386fbe607-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.159058 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.159026 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.159313 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.159098 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.159313 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.159145 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7klj\" (UniqueName: \"kubernetes.io/projected/d9605e64-41c6-4052-ad0f-04c386fbe607-kube-api-access-f7klj\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.159313 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.159178 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.159313 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:54.159302 2539 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found
Apr 22 17:37:54.159500 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:54.159371 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-secret-alertmanager-main-tls podName:d9605e64-41c6-4052-ad0f-04c386fbe607 nodeName:}" failed. No retries permitted until 2026-04-22 17:37:54.659351977 +0000 UTC m=+183.415436529 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "d9605e64-41c6-4052-ad0f-04c386fbe607") : secret "alertmanager-main-tls" not found
Apr 22 17:37:54.159880 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:54.159697 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d9605e64-41c6-4052-ad0f-04c386fbe607-alertmanager-trusted-ca-bundle podName:d9605e64-41c6-4052-ad0f-04c386fbe607 nodeName:}" failed. No retries permitted until 2026-04-22 17:37:54.659677915 +0000 UTC m=+183.415762481 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/d9605e64-41c6-4052-ad0f-04c386fbe607-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "d9605e64-41c6-4052-ad0f-04c386fbe607") : configmap references non-existent config key: ca-bundle.crt
Apr 22 17:37:54.160258 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.160032 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d9605e64-41c6-4052-ad0f-04c386fbe607-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.162569 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.162533 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d9605e64-41c6-4052-ad0f-04c386fbe607-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.162663 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.162639 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.162812 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.162774 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-config-volume\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.163149 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.163087 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.163249 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.163195 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.163249 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.163232 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-web-config\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.163647 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.163622 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d9605e64-41c6-4052-ad0f-04c386fbe607-config-out\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.164360 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.164321 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.168664 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.168641 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7klj\" (UniqueName: \"kubernetes.io/projected/d9605e64-41c6-4052-ad0f-04c386fbe607-kube-api-access-f7klj\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.664067 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.664024 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.664239 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.664151 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9605e64-41c6-4052-ad0f-04c386fbe607-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.664950 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.664924 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9605e64-41c6-4052-ad0f-04c386fbe607-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.666918 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.666868 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:54.892058 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:54.892021 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:37:57.701920 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.701863 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-68nx5"]
Apr 22 17:37:57.705714 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.705686 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-68nx5"
Apr 22 17:37:57.708263 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.708231 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 22 17:37:57.708263 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.708244 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-lj9tw\""
Apr 22 17:37:57.715890 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.715860 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-68nx5"]
Apr 22 17:37:57.739580 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.739552 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-b454bf58f-nq8pc"]
Apr 22 17:37:57.742045 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.742022 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b454bf58f-nq8pc"
Apr 22 17:37:57.744931 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.744886 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 22 17:37:57.745481 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.745455 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 22 17:37:57.745776 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.745756 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 22 17:37:57.746457 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.746432 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 22 17:37:57.746637 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.746619 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 22 17:37:57.747268 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.746790 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-m79jb\""
Apr 22 17:37:57.752164 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.752141 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 22 17:37:57.752934 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.752910 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b454bf58f-nq8pc"]
Apr 22 17:37:57.797325 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.797293 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0091f08a-b4fc-4921-b944-8ec2bf7919f7-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-68nx5\" (UID: \"0091f08a-b4fc-4921-b944-8ec2bf7919f7\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-68nx5"
Apr 22 17:37:57.797325 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.797336 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf333590-705e-42e7-b12f-0ae205ffccfa-trusted-ca-bundle\") pod \"console-b454bf58f-nq8pc\" (UID: \"bf333590-705e-42e7-b12f-0ae205ffccfa\") " pod="openshift-console/console-b454bf58f-nq8pc"
Apr 22 17:37:57.797530 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.797362 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bf333590-705e-42e7-b12f-0ae205ffccfa-console-config\") pod \"console-b454bf58f-nq8pc\" (UID: \"bf333590-705e-42e7-b12f-0ae205ffccfa\") " pod="openshift-console/console-b454bf58f-nq8pc"
Apr 22 17:37:57.797530 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.797439 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf333590-705e-42e7-b12f-0ae205ffccfa-console-serving-cert\") pod \"console-b454bf58f-nq8pc\" (UID: \"bf333590-705e-42e7-b12f-0ae205ffccfa\") " pod="openshift-console/console-b454bf58f-nq8pc"
Apr 22 17:37:57.797530 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.797457 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-865vl\" (UniqueName: \"kubernetes.io/projected/bf333590-705e-42e7-b12f-0ae205ffccfa-kube-api-access-865vl\") pod \"console-b454bf58f-nq8pc\" (UID: \"bf333590-705e-42e7-b12f-0ae205ffccfa\") " pod="openshift-console/console-b454bf58f-nq8pc"
Apr 22 17:37:57.797530 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.797485 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bf333590-705e-42e7-b12f-0ae205ffccfa-console-oauth-config\") pod \"console-b454bf58f-nq8pc\" (UID: \"bf333590-705e-42e7-b12f-0ae205ffccfa\") " pod="openshift-console/console-b454bf58f-nq8pc"
Apr 22 17:37:57.797648 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.797540 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf333590-705e-42e7-b12f-0ae205ffccfa-service-ca\") pod \"console-b454bf58f-nq8pc\" (UID: \"bf333590-705e-42e7-b12f-0ae205ffccfa\") " pod="openshift-console/console-b454bf58f-nq8pc"
Apr 22 17:37:57.797648 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.797571 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bf333590-705e-42e7-b12f-0ae205ffccfa-oauth-serving-cert\") pod \"console-b454bf58f-nq8pc\" (UID: \"bf333590-705e-42e7-b12f-0ae205ffccfa\") " pod="openshift-console/console-b454bf58f-nq8pc"
Apr 22 17:37:57.835089 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:37:57.835029 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod715f3b4b_86d6_4cef_adfe_80103d666f2e.slice/crio-e465bc19da547415feaf969a464bd20f592db34af640116f6e7d431ec00910ee WatchSource:0}: Error finding container e465bc19da547415feaf969a464bd20f592db34af640116f6e7d431ec00910ee: Status 404 returned error can't find the container with id e465bc19da547415feaf969a464bd20f592db34af640116f6e7d431ec00910ee
Apr 22 17:37:57.898472 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.898440 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf333590-705e-42e7-b12f-0ae205ffccfa-console-serving-cert\") pod \"console-b454bf58f-nq8pc\" (UID: \"bf333590-705e-42e7-b12f-0ae205ffccfa\") " pod="openshift-console/console-b454bf58f-nq8pc"
Apr 22 17:37:57.898600 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.898502 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-865vl\" (UniqueName: \"kubernetes.io/projected/bf333590-705e-42e7-b12f-0ae205ffccfa-kube-api-access-865vl\") pod \"console-b454bf58f-nq8pc\" (UID: \"bf333590-705e-42e7-b12f-0ae205ffccfa\") " pod="openshift-console/console-b454bf58f-nq8pc"
Apr 22 17:37:57.898600 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.898543 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bf333590-705e-42e7-b12f-0ae205ffccfa-console-oauth-config\") pod \"console-b454bf58f-nq8pc\" (UID: \"bf333590-705e-42e7-b12f-0ae205ffccfa\") " pod="openshift-console/console-b454bf58f-nq8pc"
Apr 22 17:37:57.898600 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.898590 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf333590-705e-42e7-b12f-0ae205ffccfa-service-ca\") pod \"console-b454bf58f-nq8pc\" (UID: \"bf333590-705e-42e7-b12f-0ae205ffccfa\") " pod="openshift-console/console-b454bf58f-nq8pc"
Apr 22 17:37:57.898749 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.898619 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bf333590-705e-42e7-b12f-0ae205ffccfa-oauth-serving-cert\") pod \"console-b454bf58f-nq8pc\" (UID: \"bf333590-705e-42e7-b12f-0ae205ffccfa\") " pod="openshift-console/console-b454bf58f-nq8pc"
Apr 22 17:37:57.898749 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.898674 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0091f08a-b4fc-4921-b944-8ec2bf7919f7-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-68nx5\" (UID: \"0091f08a-b4fc-4921-b944-8ec2bf7919f7\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-68nx5"
Apr 22 17:37:57.898749 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.898705 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf333590-705e-42e7-b12f-0ae205ffccfa-trusted-ca-bundle\") pod \"console-b454bf58f-nq8pc\" (UID: \"bf333590-705e-42e7-b12f-0ae205ffccfa\") " pod="openshift-console/console-b454bf58f-nq8pc"
Apr 22 17:37:57.898749 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.898730 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bf333590-705e-42e7-b12f-0ae205ffccfa-console-config\") pod \"console-b454bf58f-nq8pc\" (UID: \"bf333590-705e-42e7-b12f-0ae205ffccfa\") " pod="openshift-console/console-b454bf58f-nq8pc"
Apr 22 17:37:57.900752 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:57.899215 2539 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found
Apr 22 17:37:57.900752 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:37:57.899285 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0091f08a-b4fc-4921-b944-8ec2bf7919f7-monitoring-plugin-cert podName:0091f08a-b4fc-4921-b944-8ec2bf7919f7 nodeName:}" failed. No retries permitted until 2026-04-22 17:37:58.39926427 +0000 UTC m=+187.155348834 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/0091f08a-b4fc-4921-b944-8ec2bf7919f7-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-68nx5" (UID: "0091f08a-b4fc-4921-b944-8ec2bf7919f7") : secret "monitoring-plugin-cert" not found
Apr 22 17:37:57.900752 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.899509 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bf333590-705e-42e7-b12f-0ae205ffccfa-console-config\") pod \"console-b454bf58f-nq8pc\" (UID: \"bf333590-705e-42e7-b12f-0ae205ffccfa\") " pod="openshift-console/console-b454bf58f-nq8pc"
Apr 22 17:37:57.900752 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.900706 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf333590-705e-42e7-b12f-0ae205ffccfa-trusted-ca-bundle\") pod \"console-b454bf58f-nq8pc\" (UID: \"bf333590-705e-42e7-b12f-0ae205ffccfa\") " pod="openshift-console/console-b454bf58f-nq8pc"
Apr 22 17:37:57.901889 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.901833 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bf333590-705e-42e7-b12f-0ae205ffccfa-oauth-serving-cert\") pod \"console-b454bf58f-nq8pc\" (UID: \"bf333590-705e-42e7-b12f-0ae205ffccfa\") " pod="openshift-console/console-b454bf58f-nq8pc"
Apr 22 17:37:57.902426 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.902328 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf333590-705e-42e7-b12f-0ae205ffccfa-service-ca\") pod \"console-b454bf58f-nq8pc\" (UID: \"bf333590-705e-42e7-b12f-0ae205ffccfa\") " pod="openshift-console/console-b454bf58f-nq8pc"
Apr 22 17:37:57.903014 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.902973 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bf333590-705e-42e7-b12f-0ae205ffccfa-console-oauth-config\") pod \"console-b454bf58f-nq8pc\" (UID: \"bf333590-705e-42e7-b12f-0ae205ffccfa\") " pod="openshift-console/console-b454bf58f-nq8pc"
Apr 22 17:37:57.904825 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.904784 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf333590-705e-42e7-b12f-0ae205ffccfa-console-serving-cert\") pod \"console-b454bf58f-nq8pc\" (UID: \"bf333590-705e-42e7-b12f-0ae205ffccfa\") " pod="openshift-console/console-b454bf58f-nq8pc"
Apr 22 17:37:57.916758 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.916728 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-865vl\" (UniqueName: \"kubernetes.io/projected/bf333590-705e-42e7-b12f-0ae205ffccfa-kube-api-access-865vl\") pod \"console-b454bf58f-nq8pc\" (UID: \"bf333590-705e-42e7-b12f-0ae205ffccfa\") " pod="openshift-console/console-b454bf58f-nq8pc"
Apr 22 17:37:57.960445 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.960418 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 17:37:57.963668 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:37:57.963641 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9605e64_41c6_4052_ad0f_04c386fbe607.slice/crio-d4936663e35508deed327e06bddece70bcc87e00d0594566afce7bd0f410432c WatchSource:0}: Error finding container d4936663e35508deed327e06bddece70bcc87e00d0594566afce7bd0f410432c: Status 404 returned error can't find the container with id d4936663e35508deed327e06bddece70bcc87e00d0594566afce7bd0f410432c
Apr 22 17:37:57.973226 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:57.973204 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-6mhcz"]
Apr 22 17:37:57.977402 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:37:57.977377 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95eeecd5_1c23_4cde_b52e_0531263ac4bd.slice/crio-9f0c8ab9ccb6f1b7d9ffb97f76aa47d61d23ba95cf34755b63d3e251d1ce3f17 WatchSource:0}: Error finding container 9f0c8ab9ccb6f1b7d9ffb97f76aa47d61d23ba95cf34755b63d3e251d1ce3f17: Status 404 returned error can't find the container with id 9f0c8ab9ccb6f1b7d9ffb97f76aa47d61d23ba95cf34755b63d3e251d1ce3f17
Apr 22 17:37:58.053951 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:58.053917 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b454bf58f-nq8pc"
Apr 22 17:37:58.192009 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:58.191952 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b454bf58f-nq8pc"]
Apr 22 17:37:58.195712 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:37:58.195664 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf333590_705e_42e7_b12f_0ae205ffccfa.slice/crio-f8059d30dd1b67bde1c8eb718d4d10dc5d81bde7888438ffc77acfe5e0fe741c WatchSource:0}: Error finding container f8059d30dd1b67bde1c8eb718d4d10dc5d81bde7888438ffc77acfe5e0fe741c: Status 404 returned error can't find the container with id f8059d30dd1b67bde1c8eb718d4d10dc5d81bde7888438ffc77acfe5e0fe741c
Apr 22 17:37:58.405431 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:58.405306 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0091f08a-b4fc-4921-b944-8ec2bf7919f7-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-68nx5\" (UID: \"0091f08a-b4fc-4921-b944-8ec2bf7919f7\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-68nx5"
Apr 22 17:37:58.408959 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:58.408867 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0091f08a-b4fc-4921-b944-8ec2bf7919f7-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-68nx5\" (UID: \"0091f08a-b4fc-4921-b944-8ec2bf7919f7\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-68nx5"
Apr 22 17:37:58.471846 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:58.471789 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b454bf58f-nq8pc" event={"ID":"bf333590-705e-42e7-b12f-0ae205ffccfa","Type":"ContainerStarted","Data":"f8059d30dd1b67bde1c8eb718d4d10dc5d81bde7888438ffc77acfe5e0fe741c"}
Apr 22 17:37:58.473351 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:58.473292 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7xrr6" event={"ID":"715f3b4b-86d6-4cef-adfe-80103d666f2e","Type":"ContainerStarted","Data":"e465bc19da547415feaf969a464bd20f592db34af640116f6e7d431ec00910ee"}
Apr 22 17:37:58.475841 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:58.475813 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d9605e64-41c6-4052-ad0f-04c386fbe607","Type":"ContainerStarted","Data":"d4936663e35508deed327e06bddece70bcc87e00d0594566afce7bd0f410432c"}
Apr 22 17:37:58.477827 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:58.477804 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-whsmr" event={"ID":"2e5aae8b-ae1d-46c1-a0ae-77cb6bbe014f","Type":"ContainerStarted","Data":"ff0ab784f929203362399c4fd8a93dd21df5ad0dada5a18315e356d35dced02e"}
Apr 22 17:37:58.479048 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:58.479027 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not
ready" pod="openshift-console/downloads-6bcc868b7-whsmr" Apr 22 17:37:58.480917 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:58.480878 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-6mhcz" event={"ID":"95eeecd5-1c23-4cde-b52e-0531263ac4bd","Type":"ContainerStarted","Data":"9f0c8ab9ccb6f1b7d9ffb97f76aa47d61d23ba95cf34755b63d3e251d1ce3f17"} Apr 22 17:37:58.489873 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:58.489823 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-whsmr" Apr 22 17:37:58.526876 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:58.526809 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-whsmr" podStartSLOduration=1.977665249 podStartE2EDuration="17.526791021s" podCreationTimestamp="2026-04-22 17:37:41 +0000 UTC" firstStartedPulling="2026-04-22 17:37:42.34374621 +0000 UTC m=+171.099830764" lastFinishedPulling="2026-04-22 17:37:57.892871984 +0000 UTC m=+186.648956536" observedRunningTime="2026-04-22 17:37:58.496486294 +0000 UTC m=+187.252570868" watchObservedRunningTime="2026-04-22 17:37:58.526791021 +0000 UTC m=+187.282875598" Apr 22 17:37:58.617299 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:58.617047 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-68nx5" Apr 22 17:37:58.792114 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:58.792078 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-68nx5"] Apr 22 17:37:58.798765 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:37:58.798718 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0091f08a_b4fc_4921_b944_8ec2bf7919f7.slice/crio-6f1d46bd362cd013940db17c928ed1cf93764cc37181ddef2bcf63879f587ac2 WatchSource:0}: Error finding container 6f1d46bd362cd013940db17c928ed1cf93764cc37181ddef2bcf63879f587ac2: Status 404 returned error can't find the container with id 6f1d46bd362cd013940db17c928ed1cf93764cc37181ddef2bcf63879f587ac2 Apr 22 17:37:59.489289 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:59.488951 2539 generic.go:358] "Generic (PLEG): container finished" podID="715f3b4b-86d6-4cef-adfe-80103d666f2e" containerID="16bf5b22e2eba5b470c1a188d8b67e6838b3a56ff6917f39964c8910baff6a1a" exitCode=0 Apr 22 17:37:59.489289 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:59.489047 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7xrr6" event={"ID":"715f3b4b-86d6-4cef-adfe-80103d666f2e","Type":"ContainerDied","Data":"16bf5b22e2eba5b470c1a188d8b67e6838b3a56ff6917f39964c8910baff6a1a"} Apr 22 17:37:59.492188 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:37:59.491857 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-68nx5" event={"ID":"0091f08a-b4fc-4921-b944-8ec2bf7919f7","Type":"ContainerStarted","Data":"6f1d46bd362cd013940db17c928ed1cf93764cc37181ddef2bcf63879f587ac2"} Apr 22 17:38:00.498755 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:00.498714 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/kube-state-metrics-69db897b98-6mhcz" event={"ID":"95eeecd5-1c23-4cde-b52e-0531263ac4bd","Type":"ContainerStarted","Data":"4b9e76c5628d257b73fc2c6b9d28e76c4cc93536fc6b92f1731fcdaad0d2f17f"} Apr 22 17:38:00.499221 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:00.498762 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-6mhcz" event={"ID":"95eeecd5-1c23-4cde-b52e-0531263ac4bd","Type":"ContainerStarted","Data":"9e4676a0ef89d2b920ad5ad5d26bdc4bc122cc6c5091706ded5bde75376f9057"} Apr 22 17:38:00.499221 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:00.498776 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-6mhcz" event={"ID":"95eeecd5-1c23-4cde-b52e-0531263ac4bd","Type":"ContainerStarted","Data":"e61906e9d57a9b3f747c8be41ecd805bdb1d05b567befc3d8c946aeec89d7ee0"} Apr 22 17:38:00.502156 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:00.502127 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7xrr6" event={"ID":"715f3b4b-86d6-4cef-adfe-80103d666f2e","Type":"ContainerStarted","Data":"6485eacf8fbad02456b85606bf45889defaf3ce63fff82429d359f8158e9c1c6"} Apr 22 17:38:00.502287 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:00.502163 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7xrr6" event={"ID":"715f3b4b-86d6-4cef-adfe-80103d666f2e","Type":"ContainerStarted","Data":"da3a186c9e006eedfa891b554b7e04a80111d36ed01df1849ebc59a4dc5b1427"} Apr 22 17:38:00.504226 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:00.504123 2539 generic.go:358] "Generic (PLEG): container finished" podID="d9605e64-41c6-4052-ad0f-04c386fbe607" containerID="c318ae6a026cdd1faf7cec43ab5491bf174c99f93b0c7e37f3b8d682b987c5ee" exitCode=0 Apr 22 17:38:00.504226 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:00.504156 2539 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d9605e64-41c6-4052-ad0f-04c386fbe607","Type":"ContainerDied","Data":"c318ae6a026cdd1faf7cec43ab5491bf174c99f93b0c7e37f3b8d682b987c5ee"} Apr 22 17:38:00.517666 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:00.517260 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-6mhcz" podStartSLOduration=6.598719264 podStartE2EDuration="8.517242414s" podCreationTimestamp="2026-04-22 17:37:52 +0000 UTC" firstStartedPulling="2026-04-22 17:37:57.9795044 +0000 UTC m=+186.735588951" lastFinishedPulling="2026-04-22 17:37:59.898027539 +0000 UTC m=+188.654112101" observedRunningTime="2026-04-22 17:38:00.515679398 +0000 UTC m=+189.271763970" watchObservedRunningTime="2026-04-22 17:38:00.517242414 +0000 UTC m=+189.273326988" Apr 22 17:38:00.564624 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:00.563828 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-7xrr6" podStartSLOduration=7.784665662 podStartE2EDuration="8.563811765s" podCreationTimestamp="2026-04-22 17:37:52 +0000 UTC" firstStartedPulling="2026-04-22 17:37:57.836944959 +0000 UTC m=+186.593029512" lastFinishedPulling="2026-04-22 17:37:58.616091064 +0000 UTC m=+187.372175615" observedRunningTime="2026-04-22 17:38:00.537504745 +0000 UTC m=+189.293589319" watchObservedRunningTime="2026-04-22 17:38:00.563811765 +0000 UTC m=+189.319896340" Apr 22 17:38:02.741020 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:02.740993 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-55f9bf9b9c-zrz2b"] Apr 22 17:38:02.757372 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:02.757340 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55f9bf9b9c-zrz2b"] Apr 22 17:38:02.757712 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:02.757691 2539 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-console/console-55f9bf9b9c-zrz2b" Apr 22 17:38:02.861974 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:02.861465 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvx7f\" (UniqueName: \"kubernetes.io/projected/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-kube-api-access-zvx7f\") pod \"console-55f9bf9b9c-zrz2b\" (UID: \"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01\") " pod="openshift-console/console-55f9bf9b9c-zrz2b" Apr 22 17:38:02.861974 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:02.861539 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-console-oauth-config\") pod \"console-55f9bf9b9c-zrz2b\" (UID: \"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01\") " pod="openshift-console/console-55f9bf9b9c-zrz2b" Apr 22 17:38:02.861974 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:02.861563 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-console-config\") pod \"console-55f9bf9b9c-zrz2b\" (UID: \"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01\") " pod="openshift-console/console-55f9bf9b9c-zrz2b" Apr 22 17:38:02.861974 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:02.861704 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-oauth-serving-cert\") pod \"console-55f9bf9b9c-zrz2b\" (UID: \"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01\") " pod="openshift-console/console-55f9bf9b9c-zrz2b" Apr 22 17:38:02.861974 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:02.861779 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-service-ca\") pod \"console-55f9bf9b9c-zrz2b\" (UID: \"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01\") " pod="openshift-console/console-55f9bf9b9c-zrz2b" Apr 22 17:38:02.861974 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:02.861813 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-trusted-ca-bundle\") pod \"console-55f9bf9b9c-zrz2b\" (UID: \"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01\") " pod="openshift-console/console-55f9bf9b9c-zrz2b" Apr 22 17:38:02.861974 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:02.861846 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-console-serving-cert\") pod \"console-55f9bf9b9c-zrz2b\" (UID: \"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01\") " pod="openshift-console/console-55f9bf9b9c-zrz2b" Apr 22 17:38:02.963178 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:02.963134 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-oauth-serving-cert\") pod \"console-55f9bf9b9c-zrz2b\" (UID: \"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01\") " pod="openshift-console/console-55f9bf9b9c-zrz2b" Apr 22 17:38:02.963370 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:02.963298 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-service-ca\") pod \"console-55f9bf9b9c-zrz2b\" (UID: \"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01\") " pod="openshift-console/console-55f9bf9b9c-zrz2b" Apr 22 17:38:02.963370 ip-10-0-132-165 kubenswrapper[2539]: I0422 
17:38:02.963331 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-trusted-ca-bundle\") pod \"console-55f9bf9b9c-zrz2b\" (UID: \"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01\") " pod="openshift-console/console-55f9bf9b9c-zrz2b" Apr 22 17:38:02.963370 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:02.963365 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-console-serving-cert\") pod \"console-55f9bf9b9c-zrz2b\" (UID: \"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01\") " pod="openshift-console/console-55f9bf9b9c-zrz2b" Apr 22 17:38:02.963532 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:02.963414 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvx7f\" (UniqueName: \"kubernetes.io/projected/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-kube-api-access-zvx7f\") pod \"console-55f9bf9b9c-zrz2b\" (UID: \"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01\") " pod="openshift-console/console-55f9bf9b9c-zrz2b" Apr 22 17:38:02.963532 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:02.963457 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-console-oauth-config\") pod \"console-55f9bf9b9c-zrz2b\" (UID: \"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01\") " pod="openshift-console/console-55f9bf9b9c-zrz2b" Apr 22 17:38:02.963532 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:02.963482 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-console-config\") pod \"console-55f9bf9b9c-zrz2b\" (UID: \"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01\") " 
pod="openshift-console/console-55f9bf9b9c-zrz2b" Apr 22 17:38:02.964066 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:02.964022 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-oauth-serving-cert\") pod \"console-55f9bf9b9c-zrz2b\" (UID: \"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01\") " pod="openshift-console/console-55f9bf9b9c-zrz2b" Apr 22 17:38:02.964611 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:02.964587 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-console-config\") pod \"console-55f9bf9b9c-zrz2b\" (UID: \"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01\") " pod="openshift-console/console-55f9bf9b9c-zrz2b" Apr 22 17:38:02.964721 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:02.964643 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-trusted-ca-bundle\") pod \"console-55f9bf9b9c-zrz2b\" (UID: \"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01\") " pod="openshift-console/console-55f9bf9b9c-zrz2b" Apr 22 17:38:02.964721 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:02.964704 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-service-ca\") pod \"console-55f9bf9b9c-zrz2b\" (UID: \"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01\") " pod="openshift-console/console-55f9bf9b9c-zrz2b" Apr 22 17:38:02.967153 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:02.967003 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-console-oauth-config\") pod \"console-55f9bf9b9c-zrz2b\" (UID: 
\"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01\") " pod="openshift-console/console-55f9bf9b9c-zrz2b" Apr 22 17:38:02.967966 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:02.967941 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-console-serving-cert\") pod \"console-55f9bf9b9c-zrz2b\" (UID: \"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01\") " pod="openshift-console/console-55f9bf9b9c-zrz2b" Apr 22 17:38:02.975016 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:02.974949 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvx7f\" (UniqueName: \"kubernetes.io/projected/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-kube-api-access-zvx7f\") pod \"console-55f9bf9b9c-zrz2b\" (UID: \"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01\") " pod="openshift-console/console-55f9bf9b9c-zrz2b" Apr 22 17:38:03.081362 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:03.081266 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-55f9bf9b9c-zrz2b" Apr 22 17:38:03.262127 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:03.262093 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55f9bf9b9c-zrz2b"] Apr 22 17:38:03.266204 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:38:03.266163 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2beb1ad6_4fd9_482f_830b_c3d7dadc7c01.slice/crio-b768818b935934d8f2d0eadd2efd074877a5439303e05a9d7a451fd5501cef2f WatchSource:0}: Error finding container b768818b935934d8f2d0eadd2efd074877a5439303e05a9d7a451fd5501cef2f: Status 404 returned error can't find the container with id b768818b935934d8f2d0eadd2efd074877a5439303e05a9d7a451fd5501cef2f Apr 22 17:38:03.516626 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:03.516582 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55f9bf9b9c-zrz2b" event={"ID":"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01","Type":"ContainerStarted","Data":"ed7d765b162d3b5bfb7057facf543458e4f7ad7fb9fb5333143c44c4df4f6e2c"} Apr 22 17:38:03.516626 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:03.516632 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55f9bf9b9c-zrz2b" event={"ID":"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01","Type":"ContainerStarted","Data":"b768818b935934d8f2d0eadd2efd074877a5439303e05a9d7a451fd5501cef2f"} Apr 22 17:38:03.518529 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:03.518499 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b454bf58f-nq8pc" event={"ID":"bf333590-705e-42e7-b12f-0ae205ffccfa","Type":"ContainerStarted","Data":"ebc57988ca28e2524ff0d80dbdccb5ab8ae5375907d5f0cde26f46928710e2c8"} Apr 22 17:38:03.520397 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:03.520362 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-68nx5" event={"ID":"0091f08a-b4fc-4921-b944-8ec2bf7919f7","Type":"ContainerStarted","Data":"c291d03d2538d03cbc1120025a93029f1edad8159c728ce7dee88d219090e30b"} Apr 22 17:38:03.520930 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:03.520858 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-68nx5" Apr 22 17:38:03.526740 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:03.526671 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-68nx5" Apr 22 17:38:03.539393 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:03.539340 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-55f9bf9b9c-zrz2b" podStartSLOduration=1.539323783 podStartE2EDuration="1.539323783s" podCreationTimestamp="2026-04-22 17:38:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:38:03.538324185 +0000 UTC m=+192.294408781" watchObservedRunningTime="2026-04-22 17:38:03.539323783 +0000 UTC m=+192.295408357" Apr 22 17:38:03.557986 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:03.557937 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-b454bf58f-nq8pc" podStartSLOduration=2.022908337 podStartE2EDuration="6.557924944s" podCreationTimestamp="2026-04-22 17:37:57 +0000 UTC" firstStartedPulling="2026-04-22 17:37:58.19887966 +0000 UTC m=+186.954964212" lastFinishedPulling="2026-04-22 17:38:02.733896235 +0000 UTC m=+191.489980819" observedRunningTime="2026-04-22 17:38:03.557064292 +0000 UTC m=+192.313148888" watchObservedRunningTime="2026-04-22 17:38:03.557924944 +0000 UTC m=+192.314009514" Apr 22 17:38:03.572504 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:03.572447 2539 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-68nx5" podStartSLOduration=2.63624767 podStartE2EDuration="6.572432s" podCreationTimestamp="2026-04-22 17:37:57 +0000 UTC" firstStartedPulling="2026-04-22 17:37:58.801295821 +0000 UTC m=+187.557380387" lastFinishedPulling="2026-04-22 17:38:02.737480123 +0000 UTC m=+191.493564717" observedRunningTime="2026-04-22 17:38:03.571814242 +0000 UTC m=+192.327898832" watchObservedRunningTime="2026-04-22 17:38:03.572432 +0000 UTC m=+192.328516576" Apr 22 17:38:04.526799 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:04.526759 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d9605e64-41c6-4052-ad0f-04c386fbe607","Type":"ContainerStarted","Data":"9526ad3fc3f695c96f0f176cdf798557f8b48ad9fe7234988ad1cf46398b295f"} Apr 22 17:38:05.536889 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:05.536809 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d9605e64-41c6-4052-ad0f-04c386fbe607","Type":"ContainerStarted","Data":"fa5cc2b4610201ce13dcb6b8a4db45cf826444dfe32126615ad4f188de006253"} Apr 22 17:38:05.536889 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:05.536858 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d9605e64-41c6-4052-ad0f-04c386fbe607","Type":"ContainerStarted","Data":"c904c836ae1bfe1bef936c1a8d489fbe53b9622e839fdb4d9f02b38182e3a23c"} Apr 22 17:38:05.536889 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:05.536872 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d9605e64-41c6-4052-ad0f-04c386fbe607","Type":"ContainerStarted","Data":"346e8f255c9bcf32488e7e3d79f359cbbdceccda98e236c28a8feb6c234928fa"} Apr 22 17:38:05.536889 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:05.536886 2539 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d9605e64-41c6-4052-ad0f-04c386fbe607","Type":"ContainerStarted","Data":"4e61cb4057ae82991620cedf09fb72a6f35de591cff33e5845f6d7553e9cbc74"} Apr 22 17:38:06.544425 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:06.544385 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d9605e64-41c6-4052-ad0f-04c386fbe607","Type":"ContainerStarted","Data":"0c00981be21dc2ce3d61e0284288c108bd085577469926cda483ac3cacd7a67c"} Apr 22 17:38:06.575778 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:06.575682 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=5.235736039 podStartE2EDuration="13.575665936s" podCreationTimestamp="2026-04-22 17:37:53 +0000 UTC" firstStartedPulling="2026-04-22 17:37:57.965514526 +0000 UTC m=+186.721599077" lastFinishedPulling="2026-04-22 17:38:06.305444419 +0000 UTC m=+195.061528974" observedRunningTime="2026-04-22 17:38:06.572989559 +0000 UTC m=+195.329074145" watchObservedRunningTime="2026-04-22 17:38:06.575665936 +0000 UTC m=+195.331750509" Apr 22 17:38:06.952709 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:06.952664 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh" podUID="02ac9a15-99e8-4bc0-bb94-79c3df1bdde4" containerName="registry" containerID="cri-o://f02f551fc8b2648dd243e1e47390e03e39380ac696159e78e83d3343145d3625" gracePeriod=30 Apr 22 17:38:07.242786 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:07.242761 2539 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh" Apr 22 17:38:07.404462 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:07.404425 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-bound-sa-token\") pod \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\" (UID: \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\") " Apr 22 17:38:07.404638 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:07.404483 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-ca-trust-extracted\") pod \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\" (UID: \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\") " Apr 22 17:38:07.404638 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:07.404508 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-installation-pull-secrets\") pod \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\" (UID: \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\") " Apr 22 17:38:07.404638 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:07.404546 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-trusted-ca\") pod \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\" (UID: \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\") " Apr 22 17:38:07.404638 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:07.404594 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-registry-certificates\") pod \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\" (UID: \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\") " Apr 22 
17:38:07.404852 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:07.404646 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-image-registry-private-configuration\") pod \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\" (UID: \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\") " Apr 22 17:38:07.404852 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:07.404676 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sx5k\" (UniqueName: \"kubernetes.io/projected/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-kube-api-access-9sx5k\") pod \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\" (UID: \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\") " Apr 22 17:38:07.404852 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:07.404721 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-registry-tls\") pod \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\" (UID: \"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4\") " Apr 22 17:38:07.405097 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:07.405058 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "02ac9a15-99e8-4bc0-bb94-79c3df1bdde4" (UID: "02ac9a15-99e8-4bc0-bb94-79c3df1bdde4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:38:07.405330 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:07.405298 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "02ac9a15-99e8-4bc0-bb94-79c3df1bdde4" (UID: "02ac9a15-99e8-4bc0-bb94-79c3df1bdde4"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:38:07.407266 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:07.407237 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "02ac9a15-99e8-4bc0-bb94-79c3df1bdde4" (UID: "02ac9a15-99e8-4bc0-bb94-79c3df1bdde4"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:38:07.407369 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:07.407303 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "02ac9a15-99e8-4bc0-bb94-79c3df1bdde4" (UID: "02ac9a15-99e8-4bc0-bb94-79c3df1bdde4"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:38:07.407369 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:07.407303 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "02ac9a15-99e8-4bc0-bb94-79c3df1bdde4" (UID: "02ac9a15-99e8-4bc0-bb94-79c3df1bdde4"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:38:07.407466 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:07.407368 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "02ac9a15-99e8-4bc0-bb94-79c3df1bdde4" (UID: "02ac9a15-99e8-4bc0-bb94-79c3df1bdde4"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:38:07.408002 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:07.407959 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-kube-api-access-9sx5k" (OuterVolumeSpecName: "kube-api-access-9sx5k") pod "02ac9a15-99e8-4bc0-bb94-79c3df1bdde4" (UID: "02ac9a15-99e8-4bc0-bb94-79c3df1bdde4"). InnerVolumeSpecName "kube-api-access-9sx5k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:38:07.416148 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:07.416122 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "02ac9a15-99e8-4bc0-bb94-79c3df1bdde4" (UID: "02ac9a15-99e8-4bc0-bb94-79c3df1bdde4"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:38:07.505874 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:07.505784 2539 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-trusted-ca\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\"" Apr 22 17:38:07.505874 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:07.505825 2539 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-registry-certificates\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\"" Apr 22 17:38:07.505874 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:07.505843 2539 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-image-registry-private-configuration\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\"" Apr 22 
17:38:07.505874 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:07.505863 2539 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9sx5k\" (UniqueName: \"kubernetes.io/projected/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-kube-api-access-9sx5k\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\"" Apr 22 17:38:07.505874 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:07.505874 2539 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-registry-tls\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\"" Apr 22 17:38:07.506203 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:07.505884 2539 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-bound-sa-token\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\"" Apr 22 17:38:07.506203 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:07.505893 2539 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-ca-trust-extracted\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\"" Apr 22 17:38:07.506203 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:07.505917 2539 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4-installation-pull-secrets\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\"" Apr 22 17:38:07.549118 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:07.549076 2539 generic.go:358] "Generic (PLEG): container finished" podID="02ac9a15-99e8-4bc0-bb94-79c3df1bdde4" containerID="f02f551fc8b2648dd243e1e47390e03e39380ac696159e78e83d3343145d3625" exitCode=0 Apr 22 17:38:07.549568 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:07.549139 2539 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh" Apr 22 17:38:07.549568 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:07.549162 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh" event={"ID":"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4","Type":"ContainerDied","Data":"f02f551fc8b2648dd243e1e47390e03e39380ac696159e78e83d3343145d3625"} Apr 22 17:38:07.549568 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:07.549201 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6c98b6bc6c-j57rh" event={"ID":"02ac9a15-99e8-4bc0-bb94-79c3df1bdde4","Type":"ContainerDied","Data":"f5f5d843190c30c7afe880422d38d322343608f0331a164ba68d950f6638588b"} Apr 22 17:38:07.549568 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:07.549219 2539 scope.go:117] "RemoveContainer" containerID="f02f551fc8b2648dd243e1e47390e03e39380ac696159e78e83d3343145d3625" Apr 22 17:38:07.561301 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:07.559545 2539 scope.go:117] "RemoveContainer" containerID="f02f551fc8b2648dd243e1e47390e03e39380ac696159e78e83d3343145d3625" Apr 22 17:38:07.561301 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:38:07.561034 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f02f551fc8b2648dd243e1e47390e03e39380ac696159e78e83d3343145d3625\": container with ID starting with f02f551fc8b2648dd243e1e47390e03e39380ac696159e78e83d3343145d3625 not found: ID does not exist" containerID="f02f551fc8b2648dd243e1e47390e03e39380ac696159e78e83d3343145d3625" Apr 22 17:38:07.561301 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:07.561072 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f02f551fc8b2648dd243e1e47390e03e39380ac696159e78e83d3343145d3625"} err="failed to get container status 
\"f02f551fc8b2648dd243e1e47390e03e39380ac696159e78e83d3343145d3625\": rpc error: code = NotFound desc = could not find container \"f02f551fc8b2648dd243e1e47390e03e39380ac696159e78e83d3343145d3625\": container with ID starting with f02f551fc8b2648dd243e1e47390e03e39380ac696159e78e83d3343145d3625 not found: ID does not exist" Apr 22 17:38:07.572463 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:07.572437 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6c98b6bc6c-j57rh"] Apr 22 17:38:07.575362 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:07.575336 2539 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6c98b6bc6c-j57rh"] Apr 22 17:38:07.845822 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:07.845741 2539 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02ac9a15-99e8-4bc0-bb94-79c3df1bdde4" path="/var/lib/kubelet/pods/02ac9a15-99e8-4bc0-bb94-79c3df1bdde4/volumes" Apr 22 17:38:08.054620 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:08.054576 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-b454bf58f-nq8pc" Apr 22 17:38:08.054823 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:08.054743 2539 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-b454bf58f-nq8pc" Apr 22 17:38:08.061296 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:08.061266 2539 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-b454bf58f-nq8pc" Apr 22 17:38:08.557033 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:08.557006 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-b454bf58f-nq8pc" Apr 22 17:38:13.081441 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:13.081407 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-console/console-55f9bf9b9c-zrz2b" Apr 22 17:38:13.081882 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:13.081455 2539 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-55f9bf9b9c-zrz2b" Apr 22 17:38:13.086169 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:13.086146 2539 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-55f9bf9b9c-zrz2b" Apr 22 17:38:13.570412 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:13.570368 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-55f9bf9b9c-zrz2b" Apr 22 17:38:13.616961 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:13.616933 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-b454bf58f-nq8pc"] Apr 22 17:38:30.619417 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:30.619385 2539 generic.go:358] "Generic (PLEG): container finished" podID="65f4e6af-46be-4594-b3fb-54dd4fac761b" containerID="b3a346e683256e4f9df9c048d56b4cdcca3d93ec4b78de61f1471b3c560f1522" exitCode=0 Apr 22 17:38:30.619749 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:30.619459 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-t45p4" event={"ID":"65f4e6af-46be-4594-b3fb-54dd4fac761b","Type":"ContainerDied","Data":"b3a346e683256e4f9df9c048d56b4cdcca3d93ec4b78de61f1471b3c560f1522"} Apr 22 17:38:30.619789 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:30.619764 2539 scope.go:117] "RemoveContainer" containerID="b3a346e683256e4f9df9c048d56b4cdcca3d93ec4b78de61f1471b3c560f1522" Apr 22 17:38:31.623666 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:31.623634 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-t45p4" 
event={"ID":"65f4e6af-46be-4594-b3fb-54dd4fac761b","Type":"ContainerStarted","Data":"db6901761ade38dd8a6a5f01b4862af60a674af535d993859fd7667bddf6c395"} Apr 22 17:38:38.635748 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:38.635685 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-b454bf58f-nq8pc" podUID="bf333590-705e-42e7-b12f-0ae205ffccfa" containerName="console" containerID="cri-o://ebc57988ca28e2524ff0d80dbdccb5ab8ae5375907d5f0cde26f46928710e2c8" gracePeriod=15 Apr 22 17:38:38.914266 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:38.914243 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-b454bf58f-nq8pc_bf333590-705e-42e7-b12f-0ae205ffccfa/console/0.log" Apr 22 17:38:38.914368 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:38.914307 2539 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b454bf58f-nq8pc" Apr 22 17:38:38.965661 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:38.965632 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf333590-705e-42e7-b12f-0ae205ffccfa-trusted-ca-bundle\") pod \"bf333590-705e-42e7-b12f-0ae205ffccfa\" (UID: \"bf333590-705e-42e7-b12f-0ae205ffccfa\") " Apr 22 17:38:38.965796 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:38.965671 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf333590-705e-42e7-b12f-0ae205ffccfa-service-ca\") pod \"bf333590-705e-42e7-b12f-0ae205ffccfa\" (UID: \"bf333590-705e-42e7-b12f-0ae205ffccfa\") " Apr 22 17:38:38.965796 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:38.965697 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/bf333590-705e-42e7-b12f-0ae205ffccfa-oauth-serving-cert\") pod \"bf333590-705e-42e7-b12f-0ae205ffccfa\" (UID: \"bf333590-705e-42e7-b12f-0ae205ffccfa\") " Apr 22 17:38:38.965796 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:38.965746 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bf333590-705e-42e7-b12f-0ae205ffccfa-console-config\") pod \"bf333590-705e-42e7-b12f-0ae205ffccfa\" (UID: \"bf333590-705e-42e7-b12f-0ae205ffccfa\") " Apr 22 17:38:38.965944 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:38.965860 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf333590-705e-42e7-b12f-0ae205ffccfa-console-serving-cert\") pod \"bf333590-705e-42e7-b12f-0ae205ffccfa\" (UID: \"bf333590-705e-42e7-b12f-0ae205ffccfa\") " Apr 22 17:38:38.965999 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:38.965941 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-865vl\" (UniqueName: \"kubernetes.io/projected/bf333590-705e-42e7-b12f-0ae205ffccfa-kube-api-access-865vl\") pod \"bf333590-705e-42e7-b12f-0ae205ffccfa\" (UID: \"bf333590-705e-42e7-b12f-0ae205ffccfa\") " Apr 22 17:38:38.965999 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:38.965967 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bf333590-705e-42e7-b12f-0ae205ffccfa-console-oauth-config\") pod \"bf333590-705e-42e7-b12f-0ae205ffccfa\" (UID: \"bf333590-705e-42e7-b12f-0ae205ffccfa\") " Apr 22 17:38:38.966177 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:38.966153 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf333590-705e-42e7-b12f-0ae205ffccfa-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod 
"bf333590-705e-42e7-b12f-0ae205ffccfa" (UID: "bf333590-705e-42e7-b12f-0ae205ffccfa"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:38:38.966270 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:38.966183 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf333590-705e-42e7-b12f-0ae205ffccfa-service-ca" (OuterVolumeSpecName: "service-ca") pod "bf333590-705e-42e7-b12f-0ae205ffccfa" (UID: "bf333590-705e-42e7-b12f-0ae205ffccfa"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:38:38.966270 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:38.966198 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf333590-705e-42e7-b12f-0ae205ffccfa-console-config" (OuterVolumeSpecName: "console-config") pod "bf333590-705e-42e7-b12f-0ae205ffccfa" (UID: "bf333590-705e-42e7-b12f-0ae205ffccfa"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:38:38.966270 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:38.966208 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf333590-705e-42e7-b12f-0ae205ffccfa-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "bf333590-705e-42e7-b12f-0ae205ffccfa" (UID: "bf333590-705e-42e7-b12f-0ae205ffccfa"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:38:38.966395 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:38.966298 2539 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf333590-705e-42e7-b12f-0ae205ffccfa-trusted-ca-bundle\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\"" Apr 22 17:38:38.966395 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:38.966317 2539 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf333590-705e-42e7-b12f-0ae205ffccfa-service-ca\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\"" Apr 22 17:38:38.966395 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:38.966332 2539 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bf333590-705e-42e7-b12f-0ae205ffccfa-oauth-serving-cert\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\"" Apr 22 17:38:38.966395 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:38.966347 2539 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bf333590-705e-42e7-b12f-0ae205ffccfa-console-config\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\"" Apr 22 17:38:38.968158 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:38.968130 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf333590-705e-42e7-b12f-0ae205ffccfa-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "bf333590-705e-42e7-b12f-0ae205ffccfa" (UID: "bf333590-705e-42e7-b12f-0ae205ffccfa"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:38:38.968245 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:38.968190 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf333590-705e-42e7-b12f-0ae205ffccfa-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "bf333590-705e-42e7-b12f-0ae205ffccfa" (UID: "bf333590-705e-42e7-b12f-0ae205ffccfa"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:38:38.968245 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:38.968193 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf333590-705e-42e7-b12f-0ae205ffccfa-kube-api-access-865vl" (OuterVolumeSpecName: "kube-api-access-865vl") pod "bf333590-705e-42e7-b12f-0ae205ffccfa" (UID: "bf333590-705e-42e7-b12f-0ae205ffccfa"). InnerVolumeSpecName "kube-api-access-865vl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:38:39.066985 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:39.066940 2539 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf333590-705e-42e7-b12f-0ae205ffccfa-console-serving-cert\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\"" Apr 22 17:38:39.066985 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:39.066977 2539 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-865vl\" (UniqueName: \"kubernetes.io/projected/bf333590-705e-42e7-b12f-0ae205ffccfa-kube-api-access-865vl\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\"" Apr 22 17:38:39.066985 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:39.066988 2539 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bf333590-705e-42e7-b12f-0ae205ffccfa-console-oauth-config\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\"" Apr 22 17:38:39.646237 
ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:39.646208 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-b454bf58f-nq8pc_bf333590-705e-42e7-b12f-0ae205ffccfa/console/0.log" Apr 22 17:38:39.646708 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:39.646253 2539 generic.go:358] "Generic (PLEG): container finished" podID="bf333590-705e-42e7-b12f-0ae205ffccfa" containerID="ebc57988ca28e2524ff0d80dbdccb5ab8ae5375907d5f0cde26f46928710e2c8" exitCode=2 Apr 22 17:38:39.646708 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:39.646328 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b454bf58f-nq8pc" event={"ID":"bf333590-705e-42e7-b12f-0ae205ffccfa","Type":"ContainerDied","Data":"ebc57988ca28e2524ff0d80dbdccb5ab8ae5375907d5f0cde26f46928710e2c8"} Apr 22 17:38:39.646708 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:39.646337 2539 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b454bf58f-nq8pc" Apr 22 17:38:39.646708 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:39.646367 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b454bf58f-nq8pc" event={"ID":"bf333590-705e-42e7-b12f-0ae205ffccfa","Type":"ContainerDied","Data":"f8059d30dd1b67bde1c8eb718d4d10dc5d81bde7888438ffc77acfe5e0fe741c"} Apr 22 17:38:39.646708 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:39.646389 2539 scope.go:117] "RemoveContainer" containerID="ebc57988ca28e2524ff0d80dbdccb5ab8ae5375907d5f0cde26f46928710e2c8" Apr 22 17:38:39.654563 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:39.654545 2539 scope.go:117] "RemoveContainer" containerID="ebc57988ca28e2524ff0d80dbdccb5ab8ae5375907d5f0cde26f46928710e2c8" Apr 22 17:38:39.654823 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:38:39.654801 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ebc57988ca28e2524ff0d80dbdccb5ab8ae5375907d5f0cde26f46928710e2c8\": container with ID starting with ebc57988ca28e2524ff0d80dbdccb5ab8ae5375907d5f0cde26f46928710e2c8 not found: ID does not exist" containerID="ebc57988ca28e2524ff0d80dbdccb5ab8ae5375907d5f0cde26f46928710e2c8" Apr 22 17:38:39.654871 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:39.654831 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebc57988ca28e2524ff0d80dbdccb5ab8ae5375907d5f0cde26f46928710e2c8"} err="failed to get container status \"ebc57988ca28e2524ff0d80dbdccb5ab8ae5375907d5f0cde26f46928710e2c8\": rpc error: code = NotFound desc = could not find container \"ebc57988ca28e2524ff0d80dbdccb5ab8ae5375907d5f0cde26f46928710e2c8\": container with ID starting with ebc57988ca28e2524ff0d80dbdccb5ab8ae5375907d5f0cde26f46928710e2c8 not found: ID does not exist" Apr 22 17:38:39.667500 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:39.667474 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-b454bf58f-nq8pc"] Apr 22 17:38:39.671361 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:39.671341 2539 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-b454bf58f-nq8pc"] Apr 22 17:38:39.846662 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:39.846624 2539 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf333590-705e-42e7-b12f-0ae205ffccfa" path="/var/lib/kubelet/pods/bf333590-705e-42e7-b12f-0ae205ffccfa/volumes" Apr 22 17:38:41.654835 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:41.654805 2539 generic.go:358] "Generic (PLEG): container finished" podID="2f845b61-dcdc-4d78-b24e-77b96b4792a1" containerID="5a8d6856e762d67f240bd1c3f69d4c526b200aa4059b157b02121e4751791f96" exitCode=0 Apr 22 17:38:41.655241 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:41.654873 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gcd2q" event={"ID":"2f845b61-dcdc-4d78-b24e-77b96b4792a1","Type":"ContainerDied","Data":"5a8d6856e762d67f240bd1c3f69d4c526b200aa4059b157b02121e4751791f96"} Apr 22 17:38:41.655283 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:41.655246 2539 scope.go:117] "RemoveContainer" containerID="5a8d6856e762d67f240bd1c3f69d4c526b200aa4059b157b02121e4751791f96" Apr 22 17:38:42.661859 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:38:42.661822 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gcd2q" event={"ID":"2f845b61-dcdc-4d78-b24e-77b96b4792a1","Type":"ContainerStarted","Data":"8a91c15d94c2b01cbe5c3d78e2ee5a1932ea301ef75850d985e2835e5fd04da1"} Apr 22 17:39:02.580623 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:02.580575 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a15342ff-f78f-4d33-aed1-0e9c86dbdb15-metrics-certs\") pod \"network-metrics-daemon-8bxsz\" (UID: \"a15342ff-f78f-4d33-aed1-0e9c86dbdb15\") " pod="openshift-multus/network-metrics-daemon-8bxsz" Apr 22 17:39:02.583353 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:02.583326 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a15342ff-f78f-4d33-aed1-0e9c86dbdb15-metrics-certs\") pod \"network-metrics-daemon-8bxsz\" (UID: \"a15342ff-f78f-4d33-aed1-0e9c86dbdb15\") " pod="openshift-multus/network-metrics-daemon-8bxsz" Apr 22 17:39:02.745280 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:02.745247 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qs9jr\"" Apr 22 17:39:02.753252 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:02.753229 2539 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8bxsz" Apr 22 17:39:02.873555 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:02.873475 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8bxsz"] Apr 22 17:39:02.876310 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:39:02.876273 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda15342ff_f78f_4d33_aed1_0e9c86dbdb15.slice/crio-33cb3ee54afe7b2550d1eb2e7ec22cca6f653ce9de6d9e1c27d85c89af6a1648 WatchSource:0}: Error finding container 33cb3ee54afe7b2550d1eb2e7ec22cca6f653ce9de6d9e1c27d85c89af6a1648: Status 404 returned error can't find the container with id 33cb3ee54afe7b2550d1eb2e7ec22cca6f653ce9de6d9e1c27d85c89af6a1648 Apr 22 17:39:03.727013 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:03.726968 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8bxsz" event={"ID":"a15342ff-f78f-4d33-aed1-0e9c86dbdb15","Type":"ContainerStarted","Data":"33cb3ee54afe7b2550d1eb2e7ec22cca6f653ce9de6d9e1c27d85c89af6a1648"} Apr 22 17:39:04.731355 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:04.731319 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8bxsz" event={"ID":"a15342ff-f78f-4d33-aed1-0e9c86dbdb15","Type":"ContainerStarted","Data":"0156d2ed4038675565edcf061092cef0967681ef5ea9ea32799ac08162065c08"} Apr 22 17:39:04.731355 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:04.731358 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8bxsz" event={"ID":"a15342ff-f78f-4d33-aed1-0e9c86dbdb15","Type":"ContainerStarted","Data":"984044d15de25478017a74dff335ff53246b30310c81b61c9c0f6bb9c534f2cb"} Apr 22 17:39:04.750453 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:04.750410 2539 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-8bxsz" podStartSLOduration=252.804801358 podStartE2EDuration="4m13.750392823s" podCreationTimestamp="2026-04-22 17:34:51 +0000 UTC" firstStartedPulling="2026-04-22 17:39:02.878135696 +0000 UTC m=+251.634220250" lastFinishedPulling="2026-04-22 17:39:03.82372716 +0000 UTC m=+252.579811715" observedRunningTime="2026-04-22 17:39:04.749712811 +0000 UTC m=+253.505797386" watchObservedRunningTime="2026-04-22 17:39:04.750392823 +0000 UTC m=+253.506477390" Apr 22 17:39:09.630567 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:09.630528 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-75668dc9d-clpwt"] Apr 22 17:39:09.631148 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:09.630951 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="02ac9a15-99e8-4bc0-bb94-79c3df1bdde4" containerName="registry" Apr 22 17:39:09.631148 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:09.630972 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="02ac9a15-99e8-4bc0-bb94-79c3df1bdde4" containerName="registry" Apr 22 17:39:09.631148 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:09.630990 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf333590-705e-42e7-b12f-0ae205ffccfa" containerName="console" Apr 22 17:39:09.631148 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:09.630996 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf333590-705e-42e7-b12f-0ae205ffccfa" containerName="console" Apr 22 17:39:09.631148 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:09.631064 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="bf333590-705e-42e7-b12f-0ae205ffccfa" containerName="console" Apr 22 17:39:09.631148 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:09.631077 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="02ac9a15-99e8-4bc0-bb94-79c3df1bdde4" 
containerName="registry" Apr 22 17:39:09.634600 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:09.634570 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-75668dc9d-clpwt" Apr 22 17:39:09.649806 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:09.649782 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75668dc9d-clpwt"] Apr 22 17:39:09.733013 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:09.732987 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/beb72c82-6907-4043-bea9-b3a01212c1e8-console-oauth-config\") pod \"console-75668dc9d-clpwt\" (UID: \"beb72c82-6907-4043-bea9-b3a01212c1e8\") " pod="openshift-console/console-75668dc9d-clpwt" Apr 22 17:39:09.733146 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:09.733034 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/beb72c82-6907-4043-bea9-b3a01212c1e8-console-serving-cert\") pod \"console-75668dc9d-clpwt\" (UID: \"beb72c82-6907-4043-bea9-b3a01212c1e8\") " pod="openshift-console/console-75668dc9d-clpwt" Apr 22 17:39:09.733146 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:09.733051 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/beb72c82-6907-4043-bea9-b3a01212c1e8-console-config\") pod \"console-75668dc9d-clpwt\" (UID: \"beb72c82-6907-4043-bea9-b3a01212c1e8\") " pod="openshift-console/console-75668dc9d-clpwt" Apr 22 17:39:09.733146 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:09.733116 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5mzc\" (UniqueName: 
\"kubernetes.io/projected/beb72c82-6907-4043-bea9-b3a01212c1e8-kube-api-access-k5mzc\") pod \"console-75668dc9d-clpwt\" (UID: \"beb72c82-6907-4043-bea9-b3a01212c1e8\") " pod="openshift-console/console-75668dc9d-clpwt" Apr 22 17:39:09.733307 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:09.733156 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/beb72c82-6907-4043-bea9-b3a01212c1e8-oauth-serving-cert\") pod \"console-75668dc9d-clpwt\" (UID: \"beb72c82-6907-4043-bea9-b3a01212c1e8\") " pod="openshift-console/console-75668dc9d-clpwt" Apr 22 17:39:09.733307 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:09.733228 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/beb72c82-6907-4043-bea9-b3a01212c1e8-service-ca\") pod \"console-75668dc9d-clpwt\" (UID: \"beb72c82-6907-4043-bea9-b3a01212c1e8\") " pod="openshift-console/console-75668dc9d-clpwt" Apr 22 17:39:09.733307 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:09.733275 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/beb72c82-6907-4043-bea9-b3a01212c1e8-trusted-ca-bundle\") pod \"console-75668dc9d-clpwt\" (UID: \"beb72c82-6907-4043-bea9-b3a01212c1e8\") " pod="openshift-console/console-75668dc9d-clpwt" Apr 22 17:39:09.834561 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:09.834535 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/beb72c82-6907-4043-bea9-b3a01212c1e8-console-oauth-config\") pod \"console-75668dc9d-clpwt\" (UID: \"beb72c82-6907-4043-bea9-b3a01212c1e8\") " pod="openshift-console/console-75668dc9d-clpwt" Apr 22 17:39:09.834689 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:09.834579 
2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/beb72c82-6907-4043-bea9-b3a01212c1e8-console-serving-cert\") pod \"console-75668dc9d-clpwt\" (UID: \"beb72c82-6907-4043-bea9-b3a01212c1e8\") " pod="openshift-console/console-75668dc9d-clpwt" Apr 22 17:39:09.834689 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:09.834597 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/beb72c82-6907-4043-bea9-b3a01212c1e8-console-config\") pod \"console-75668dc9d-clpwt\" (UID: \"beb72c82-6907-4043-bea9-b3a01212c1e8\") " pod="openshift-console/console-75668dc9d-clpwt" Apr 22 17:39:09.834689 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:09.834626 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5mzc\" (UniqueName: \"kubernetes.io/projected/beb72c82-6907-4043-bea9-b3a01212c1e8-kube-api-access-k5mzc\") pod \"console-75668dc9d-clpwt\" (UID: \"beb72c82-6907-4043-bea9-b3a01212c1e8\") " pod="openshift-console/console-75668dc9d-clpwt" Apr 22 17:39:09.834689 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:09.834648 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/beb72c82-6907-4043-bea9-b3a01212c1e8-oauth-serving-cert\") pod \"console-75668dc9d-clpwt\" (UID: \"beb72c82-6907-4043-bea9-b3a01212c1e8\") " pod="openshift-console/console-75668dc9d-clpwt" Apr 22 17:39:09.834689 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:09.834672 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/beb72c82-6907-4043-bea9-b3a01212c1e8-service-ca\") pod \"console-75668dc9d-clpwt\" (UID: \"beb72c82-6907-4043-bea9-b3a01212c1e8\") " pod="openshift-console/console-75668dc9d-clpwt" Apr 22 17:39:09.834976 
ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:09.834695 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/beb72c82-6907-4043-bea9-b3a01212c1e8-trusted-ca-bundle\") pod \"console-75668dc9d-clpwt\" (UID: \"beb72c82-6907-4043-bea9-b3a01212c1e8\") " pod="openshift-console/console-75668dc9d-clpwt" Apr 22 17:39:09.835483 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:09.835459 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/beb72c82-6907-4043-bea9-b3a01212c1e8-console-config\") pod \"console-75668dc9d-clpwt\" (UID: \"beb72c82-6907-4043-bea9-b3a01212c1e8\") " pod="openshift-console/console-75668dc9d-clpwt" Apr 22 17:39:09.835585 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:09.835552 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/beb72c82-6907-4043-bea9-b3a01212c1e8-oauth-serving-cert\") pod \"console-75668dc9d-clpwt\" (UID: \"beb72c82-6907-4043-bea9-b3a01212c1e8\") " pod="openshift-console/console-75668dc9d-clpwt" Apr 22 17:39:09.835585 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:09.835569 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/beb72c82-6907-4043-bea9-b3a01212c1e8-service-ca\") pod \"console-75668dc9d-clpwt\" (UID: \"beb72c82-6907-4043-bea9-b3a01212c1e8\") " pod="openshift-console/console-75668dc9d-clpwt" Apr 22 17:39:09.835700 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:09.835619 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/beb72c82-6907-4043-bea9-b3a01212c1e8-trusted-ca-bundle\") pod \"console-75668dc9d-clpwt\" (UID: \"beb72c82-6907-4043-bea9-b3a01212c1e8\") " pod="openshift-console/console-75668dc9d-clpwt" Apr 22 
17:39:09.837117 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:09.837099 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/beb72c82-6907-4043-bea9-b3a01212c1e8-console-oauth-config\") pod \"console-75668dc9d-clpwt\" (UID: \"beb72c82-6907-4043-bea9-b3a01212c1e8\") " pod="openshift-console/console-75668dc9d-clpwt" Apr 22 17:39:09.837193 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:09.837157 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/beb72c82-6907-4043-bea9-b3a01212c1e8-console-serving-cert\") pod \"console-75668dc9d-clpwt\" (UID: \"beb72c82-6907-4043-bea9-b3a01212c1e8\") " pod="openshift-console/console-75668dc9d-clpwt" Apr 22 17:39:09.842538 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:09.842520 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5mzc\" (UniqueName: \"kubernetes.io/projected/beb72c82-6907-4043-bea9-b3a01212c1e8-kube-api-access-k5mzc\") pod \"console-75668dc9d-clpwt\" (UID: \"beb72c82-6907-4043-bea9-b3a01212c1e8\") " pod="openshift-console/console-75668dc9d-clpwt" Apr 22 17:39:09.944711 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:09.944656 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-75668dc9d-clpwt" Apr 22 17:39:10.059046 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:10.059022 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75668dc9d-clpwt"] Apr 22 17:39:10.061475 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:39:10.061447 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbeb72c82_6907_4043_bea9_b3a01212c1e8.slice/crio-1026883d6340f5317105da694c816f6ed520a3e5d31a86f57d3d2f11d0fe3223 WatchSource:0}: Error finding container 1026883d6340f5317105da694c816f6ed520a3e5d31a86f57d3d2f11d0fe3223: Status 404 returned error can't find the container with id 1026883d6340f5317105da694c816f6ed520a3e5d31a86f57d3d2f11d0fe3223 Apr 22 17:39:10.749972 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:10.749863 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75668dc9d-clpwt" event={"ID":"beb72c82-6907-4043-bea9-b3a01212c1e8","Type":"ContainerStarted","Data":"89f324c41130b9ae71ef5cdb1cf3ffa70ea768fcc332ef4eb20f48c55aac252f"} Apr 22 17:39:10.749972 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:10.749917 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75668dc9d-clpwt" event={"ID":"beb72c82-6907-4043-bea9-b3a01212c1e8","Type":"ContainerStarted","Data":"1026883d6340f5317105da694c816f6ed520a3e5d31a86f57d3d2f11d0fe3223"} Apr 22 17:39:10.768248 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:10.768191 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-75668dc9d-clpwt" podStartSLOduration=1.768174474 podStartE2EDuration="1.768174474s" podCreationTimestamp="2026-04-22 17:39:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:39:10.76652695 +0000 UTC m=+259.522611523" 
watchObservedRunningTime="2026-04-22 17:39:10.768174474 +0000 UTC m=+259.524259050" Apr 22 17:39:13.189141 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:13.189103 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 17:39:13.189625 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:13.189529 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d9605e64-41c6-4052-ad0f-04c386fbe607" containerName="alertmanager" containerID="cri-o://9526ad3fc3f695c96f0f176cdf798557f8b48ad9fe7234988ad1cf46398b295f" gracePeriod=120 Apr 22 17:39:13.189625 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:13.189598 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d9605e64-41c6-4052-ad0f-04c386fbe607" containerName="kube-rbac-proxy-metric" containerID="cri-o://fa5cc2b4610201ce13dcb6b8a4db45cf826444dfe32126615ad4f188de006253" gracePeriod=120 Apr 22 17:39:13.189737 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:13.189623 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d9605e64-41c6-4052-ad0f-04c386fbe607" containerName="config-reloader" containerID="cri-o://4e61cb4057ae82991620cedf09fb72a6f35de591cff33e5845f6d7553e9cbc74" gracePeriod=120 Apr 22 17:39:13.189737 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:13.189653 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d9605e64-41c6-4052-ad0f-04c386fbe607" containerName="kube-rbac-proxy" containerID="cri-o://c904c836ae1bfe1bef936c1a8d489fbe53b9622e839fdb4d9f02b38182e3a23c" gracePeriod=120 Apr 22 17:39:13.189737 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:13.189666 2539 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-monitoring/alertmanager-main-0" podUID="d9605e64-41c6-4052-ad0f-04c386fbe607" containerName="prom-label-proxy" containerID="cri-o://0c00981be21dc2ce3d61e0284288c108bd085577469926cda483ac3cacd7a67c" gracePeriod=120 Apr 22 17:39:13.189737 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:13.189624 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d9605e64-41c6-4052-ad0f-04c386fbe607" containerName="kube-rbac-proxy-web" containerID="cri-o://346e8f255c9bcf32488e7e3d79f359cbbdceccda98e236c28a8feb6c234928fa" gracePeriod=120 Apr 22 17:39:13.763708 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:13.763672 2539 generic.go:358] "Generic (PLEG): container finished" podID="d9605e64-41c6-4052-ad0f-04c386fbe607" containerID="0c00981be21dc2ce3d61e0284288c108bd085577469926cda483ac3cacd7a67c" exitCode=0 Apr 22 17:39:13.763708 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:13.763701 2539 generic.go:358] "Generic (PLEG): container finished" podID="d9605e64-41c6-4052-ad0f-04c386fbe607" containerID="fa5cc2b4610201ce13dcb6b8a4db45cf826444dfe32126615ad4f188de006253" exitCode=0 Apr 22 17:39:13.763708 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:13.763707 2539 generic.go:358] "Generic (PLEG): container finished" podID="d9605e64-41c6-4052-ad0f-04c386fbe607" containerID="c904c836ae1bfe1bef936c1a8d489fbe53b9622e839fdb4d9f02b38182e3a23c" exitCode=0 Apr 22 17:39:13.763708 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:13.763712 2539 generic.go:358] "Generic (PLEG): container finished" podID="d9605e64-41c6-4052-ad0f-04c386fbe607" containerID="4e61cb4057ae82991620cedf09fb72a6f35de591cff33e5845f6d7553e9cbc74" exitCode=0 Apr 22 17:39:13.763708 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:13.763717 2539 generic.go:358] "Generic (PLEG): container finished" podID="d9605e64-41c6-4052-ad0f-04c386fbe607" containerID="9526ad3fc3f695c96f0f176cdf798557f8b48ad9fe7234988ad1cf46398b295f" exitCode=0 Apr 22 
17:39:13.764019 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:13.763741 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d9605e64-41c6-4052-ad0f-04c386fbe607","Type":"ContainerDied","Data":"0c00981be21dc2ce3d61e0284288c108bd085577469926cda483ac3cacd7a67c"} Apr 22 17:39:13.764019 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:13.763773 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d9605e64-41c6-4052-ad0f-04c386fbe607","Type":"ContainerDied","Data":"fa5cc2b4610201ce13dcb6b8a4db45cf826444dfe32126615ad4f188de006253"} Apr 22 17:39:13.764019 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:13.763783 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d9605e64-41c6-4052-ad0f-04c386fbe607","Type":"ContainerDied","Data":"c904c836ae1bfe1bef936c1a8d489fbe53b9622e839fdb4d9f02b38182e3a23c"} Apr 22 17:39:13.764019 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:13.763793 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d9605e64-41c6-4052-ad0f-04c386fbe607","Type":"ContainerDied","Data":"4e61cb4057ae82991620cedf09fb72a6f35de591cff33e5845f6d7553e9cbc74"} Apr 22 17:39:13.764019 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:13.763802 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d9605e64-41c6-4052-ad0f-04c386fbe607","Type":"ContainerDied","Data":"9526ad3fc3f695c96f0f176cdf798557f8b48ad9fe7234988ad1cf46398b295f"} Apr 22 17:39:14.424443 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.424423 2539 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.473874 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.473848 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-secret-alertmanager-main-tls\") pod \"d9605e64-41c6-4052-ad0f-04c386fbe607\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " Apr 22 17:39:14.474025 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.473912 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d9605e64-41c6-4052-ad0f-04c386fbe607-metrics-client-ca\") pod \"d9605e64-41c6-4052-ad0f-04c386fbe607\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " Apr 22 17:39:14.474025 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.473942 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-web-config\") pod \"d9605e64-41c6-4052-ad0f-04c386fbe607\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " Apr 22 17:39:14.474025 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.473975 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-secret-alertmanager-kube-rbac-proxy\") pod \"d9605e64-41c6-4052-ad0f-04c386fbe607\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " Apr 22 17:39:14.474183 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.474037 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d9605e64-41c6-4052-ad0f-04c386fbe607-alertmanager-main-db\") pod \"d9605e64-41c6-4052-ad0f-04c386fbe607\" (UID: 
\"d9605e64-41c6-4052-ad0f-04c386fbe607\") " Apr 22 17:39:14.474183 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.474062 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d9605e64-41c6-4052-ad0f-04c386fbe607-config-out\") pod \"d9605e64-41c6-4052-ad0f-04c386fbe607\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " Apr 22 17:39:14.474183 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.474094 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-cluster-tls-config\") pod \"d9605e64-41c6-4052-ad0f-04c386fbe607\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " Apr 22 17:39:14.474183 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.474129 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-secret-alertmanager-kube-rbac-proxy-metric\") pod \"d9605e64-41c6-4052-ad0f-04c386fbe607\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " Apr 22 17:39:14.474183 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.474163 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-config-volume\") pod \"d9605e64-41c6-4052-ad0f-04c386fbe607\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " Apr 22 17:39:14.474419 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.474187 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d9605e64-41c6-4052-ad0f-04c386fbe607-tls-assets\") pod \"d9605e64-41c6-4052-ad0f-04c386fbe607\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " Apr 22 17:39:14.474419 ip-10-0-132-165 
kubenswrapper[2539]: I0422 17:39:14.474225 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9605e64-41c6-4052-ad0f-04c386fbe607-alertmanager-trusted-ca-bundle\") pod \"d9605e64-41c6-4052-ad0f-04c386fbe607\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " Apr 22 17:39:14.474419 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.474262 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7klj\" (UniqueName: \"kubernetes.io/projected/d9605e64-41c6-4052-ad0f-04c386fbe607-kube-api-access-f7klj\") pod \"d9605e64-41c6-4052-ad0f-04c386fbe607\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " Apr 22 17:39:14.474419 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.474294 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-secret-alertmanager-kube-rbac-proxy-web\") pod \"d9605e64-41c6-4052-ad0f-04c386fbe607\" (UID: \"d9605e64-41c6-4052-ad0f-04c386fbe607\") " Apr 22 17:39:14.474419 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.474329 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9605e64-41c6-4052-ad0f-04c386fbe607-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "d9605e64-41c6-4052-ad0f-04c386fbe607" (UID: "d9605e64-41c6-4052-ad0f-04c386fbe607"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:39:14.474656 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.474542 2539 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d9605e64-41c6-4052-ad0f-04c386fbe607-metrics-client-ca\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\"" Apr 22 17:39:14.475136 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.474894 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9605e64-41c6-4052-ad0f-04c386fbe607-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "d9605e64-41c6-4052-ad0f-04c386fbe607" (UID: "d9605e64-41c6-4052-ad0f-04c386fbe607"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:39:14.475526 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.475475 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9605e64-41c6-4052-ad0f-04c386fbe607-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "d9605e64-41c6-4052-ad0f-04c386fbe607" (UID: "d9605e64-41c6-4052-ad0f-04c386fbe607"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:39:14.477717 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.477680 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9605e64-41c6-4052-ad0f-04c386fbe607-config-out" (OuterVolumeSpecName: "config-out") pod "d9605e64-41c6-4052-ad0f-04c386fbe607" (UID: "d9605e64-41c6-4052-ad0f-04c386fbe607"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:39:14.478484 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.478440 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "d9605e64-41c6-4052-ad0f-04c386fbe607" (UID: "d9605e64-41c6-4052-ad0f-04c386fbe607"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:39:14.478590 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.478531 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-config-volume" (OuterVolumeSpecName: "config-volume") pod "d9605e64-41c6-4052-ad0f-04c386fbe607" (UID: "d9605e64-41c6-4052-ad0f-04c386fbe607"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:39:14.478779 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.478753 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "d9605e64-41c6-4052-ad0f-04c386fbe607" (UID: "d9605e64-41c6-4052-ad0f-04c386fbe607"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:39:14.478863 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.478785 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9605e64-41c6-4052-ad0f-04c386fbe607-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "d9605e64-41c6-4052-ad0f-04c386fbe607" (UID: "d9605e64-41c6-4052-ad0f-04c386fbe607"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:39:14.478863 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.478794 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "d9605e64-41c6-4052-ad0f-04c386fbe607" (UID: "d9605e64-41c6-4052-ad0f-04c386fbe607"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:39:14.479082 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.479052 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9605e64-41c6-4052-ad0f-04c386fbe607-kube-api-access-f7klj" (OuterVolumeSpecName: "kube-api-access-f7klj") pod "d9605e64-41c6-4052-ad0f-04c386fbe607" (UID: "d9605e64-41c6-4052-ad0f-04c386fbe607"). InnerVolumeSpecName "kube-api-access-f7klj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:39:14.479627 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.479601 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "d9605e64-41c6-4052-ad0f-04c386fbe607" (UID: "d9605e64-41c6-4052-ad0f-04c386fbe607"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:39:14.483121 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.482953 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "d9605e64-41c6-4052-ad0f-04c386fbe607" (UID: "d9605e64-41c6-4052-ad0f-04c386fbe607"). InnerVolumeSpecName "cluster-tls-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:39:14.489637 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.489611 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-web-config" (OuterVolumeSpecName: "web-config") pod "d9605e64-41c6-4052-ad0f-04c386fbe607" (UID: "d9605e64-41c6-4052-ad0f-04c386fbe607"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:39:14.575264 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.575204 2539 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-secret-alertmanager-main-tls\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\"" Apr 22 17:39:14.575264 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.575229 2539 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-web-config\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\"" Apr 22 17:39:14.575264 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.575240 2539 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\"" Apr 22 17:39:14.575264 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.575250 2539 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d9605e64-41c6-4052-ad0f-04c386fbe607-alertmanager-main-db\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\"" Apr 22 17:39:14.575264 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.575261 2539 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/d9605e64-41c6-4052-ad0f-04c386fbe607-config-out\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\"" Apr 22 17:39:14.575485 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.575269 2539 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-cluster-tls-config\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\"" Apr 22 17:39:14.575485 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.575279 2539 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\"" Apr 22 17:39:14.575485 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.575288 2539 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-config-volume\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\"" Apr 22 17:39:14.575485 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.575297 2539 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d9605e64-41c6-4052-ad0f-04c386fbe607-tls-assets\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\"" Apr 22 17:39:14.575485 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.575304 2539 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d9605e64-41c6-4052-ad0f-04c386fbe607-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\"" Apr 22 17:39:14.575485 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.575313 2539 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f7klj\" (UniqueName: 
\"kubernetes.io/projected/d9605e64-41c6-4052-ad0f-04c386fbe607-kube-api-access-f7klj\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\"" Apr 22 17:39:14.575485 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.575322 2539 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d9605e64-41c6-4052-ad0f-04c386fbe607-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\"" Apr 22 17:39:14.769668 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.769635 2539 generic.go:358] "Generic (PLEG): container finished" podID="d9605e64-41c6-4052-ad0f-04c386fbe607" containerID="346e8f255c9bcf32488e7e3d79f359cbbdceccda98e236c28a8feb6c234928fa" exitCode=0 Apr 22 17:39:14.769791 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.769729 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d9605e64-41c6-4052-ad0f-04c386fbe607","Type":"ContainerDied","Data":"346e8f255c9bcf32488e7e3d79f359cbbdceccda98e236c28a8feb6c234928fa"} Apr 22 17:39:14.769791 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.769755 2539 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.769791 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.769775 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d9605e64-41c6-4052-ad0f-04c386fbe607","Type":"ContainerDied","Data":"d4936663e35508deed327e06bddece70bcc87e00d0594566afce7bd0f410432c"} Apr 22 17:39:14.769888 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.769797 2539 scope.go:117] "RemoveContainer" containerID="0c00981be21dc2ce3d61e0284288c108bd085577469926cda483ac3cacd7a67c" Apr 22 17:39:14.777256 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.777232 2539 scope.go:117] "RemoveContainer" containerID="fa5cc2b4610201ce13dcb6b8a4db45cf826444dfe32126615ad4f188de006253" Apr 22 17:39:14.783944 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.783926 2539 scope.go:117] "RemoveContainer" containerID="c904c836ae1bfe1bef936c1a8d489fbe53b9622e839fdb4d9f02b38182e3a23c" Apr 22 17:39:14.790021 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.790003 2539 scope.go:117] "RemoveContainer" containerID="346e8f255c9bcf32488e7e3d79f359cbbdceccda98e236c28a8feb6c234928fa" Apr 22 17:39:14.794178 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.794154 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 17:39:14.796864 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.796844 2539 scope.go:117] "RemoveContainer" containerID="4e61cb4057ae82991620cedf09fb72a6f35de591cff33e5845f6d7553e9cbc74" Apr 22 17:39:14.800463 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.800443 2539 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 17:39:14.803463 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.803448 2539 scope.go:117] "RemoveContainer" containerID="9526ad3fc3f695c96f0f176cdf798557f8b48ad9fe7234988ad1cf46398b295f" Apr 22 17:39:14.809839 
ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.809824 2539 scope.go:117] "RemoveContainer" containerID="c318ae6a026cdd1faf7cec43ab5491bf174c99f93b0c7e37f3b8d682b987c5ee" Apr 22 17:39:14.816203 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.816185 2539 scope.go:117] "RemoveContainer" containerID="0c00981be21dc2ce3d61e0284288c108bd085577469926cda483ac3cacd7a67c" Apr 22 17:39:14.816426 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:39:14.816399 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c00981be21dc2ce3d61e0284288c108bd085577469926cda483ac3cacd7a67c\": container with ID starting with 0c00981be21dc2ce3d61e0284288c108bd085577469926cda483ac3cacd7a67c not found: ID does not exist" containerID="0c00981be21dc2ce3d61e0284288c108bd085577469926cda483ac3cacd7a67c" Apr 22 17:39:14.816465 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.816436 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c00981be21dc2ce3d61e0284288c108bd085577469926cda483ac3cacd7a67c"} err="failed to get container status \"0c00981be21dc2ce3d61e0284288c108bd085577469926cda483ac3cacd7a67c\": rpc error: code = NotFound desc = could not find container \"0c00981be21dc2ce3d61e0284288c108bd085577469926cda483ac3cacd7a67c\": container with ID starting with 0c00981be21dc2ce3d61e0284288c108bd085577469926cda483ac3cacd7a67c not found: ID does not exist" Apr 22 17:39:14.816465 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.816454 2539 scope.go:117] "RemoveContainer" containerID="fa5cc2b4610201ce13dcb6b8a4db45cf826444dfe32126615ad4f188de006253" Apr 22 17:39:14.816675 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:39:14.816661 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa5cc2b4610201ce13dcb6b8a4db45cf826444dfe32126615ad4f188de006253\": container with ID starting with 
fa5cc2b4610201ce13dcb6b8a4db45cf826444dfe32126615ad4f188de006253 not found: ID does not exist" containerID="fa5cc2b4610201ce13dcb6b8a4db45cf826444dfe32126615ad4f188de006253" Apr 22 17:39:14.816714 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.816678 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa5cc2b4610201ce13dcb6b8a4db45cf826444dfe32126615ad4f188de006253"} err="failed to get container status \"fa5cc2b4610201ce13dcb6b8a4db45cf826444dfe32126615ad4f188de006253\": rpc error: code = NotFound desc = could not find container \"fa5cc2b4610201ce13dcb6b8a4db45cf826444dfe32126615ad4f188de006253\": container with ID starting with fa5cc2b4610201ce13dcb6b8a4db45cf826444dfe32126615ad4f188de006253 not found: ID does not exist" Apr 22 17:39:14.816714 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.816691 2539 scope.go:117] "RemoveContainer" containerID="c904c836ae1bfe1bef936c1a8d489fbe53b9622e839fdb4d9f02b38182e3a23c" Apr 22 17:39:14.816885 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:39:14.816870 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c904c836ae1bfe1bef936c1a8d489fbe53b9622e839fdb4d9f02b38182e3a23c\": container with ID starting with c904c836ae1bfe1bef936c1a8d489fbe53b9622e839fdb4d9f02b38182e3a23c not found: ID does not exist" containerID="c904c836ae1bfe1bef936c1a8d489fbe53b9622e839fdb4d9f02b38182e3a23c" Apr 22 17:39:14.817028 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.816889 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c904c836ae1bfe1bef936c1a8d489fbe53b9622e839fdb4d9f02b38182e3a23c"} err="failed to get container status \"c904c836ae1bfe1bef936c1a8d489fbe53b9622e839fdb4d9f02b38182e3a23c\": rpc error: code = NotFound desc = could not find container \"c904c836ae1bfe1bef936c1a8d489fbe53b9622e839fdb4d9f02b38182e3a23c\": container with ID starting with 
c904c836ae1bfe1bef936c1a8d489fbe53b9622e839fdb4d9f02b38182e3a23c not found: ID does not exist" Apr 22 17:39:14.817028 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.816914 2539 scope.go:117] "RemoveContainer" containerID="346e8f255c9bcf32488e7e3d79f359cbbdceccda98e236c28a8feb6c234928fa" Apr 22 17:39:14.817138 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:39:14.817118 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"346e8f255c9bcf32488e7e3d79f359cbbdceccda98e236c28a8feb6c234928fa\": container with ID starting with 346e8f255c9bcf32488e7e3d79f359cbbdceccda98e236c28a8feb6c234928fa not found: ID does not exist" containerID="346e8f255c9bcf32488e7e3d79f359cbbdceccda98e236c28a8feb6c234928fa" Apr 22 17:39:14.817170 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.817143 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"346e8f255c9bcf32488e7e3d79f359cbbdceccda98e236c28a8feb6c234928fa"} err="failed to get container status \"346e8f255c9bcf32488e7e3d79f359cbbdceccda98e236c28a8feb6c234928fa\": rpc error: code = NotFound desc = could not find container \"346e8f255c9bcf32488e7e3d79f359cbbdceccda98e236c28a8feb6c234928fa\": container with ID starting with 346e8f255c9bcf32488e7e3d79f359cbbdceccda98e236c28a8feb6c234928fa not found: ID does not exist" Apr 22 17:39:14.817170 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.817158 2539 scope.go:117] "RemoveContainer" containerID="4e61cb4057ae82991620cedf09fb72a6f35de591cff33e5845f6d7553e9cbc74" Apr 22 17:39:14.817389 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:39:14.817368 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e61cb4057ae82991620cedf09fb72a6f35de591cff33e5845f6d7553e9cbc74\": container with ID starting with 4e61cb4057ae82991620cedf09fb72a6f35de591cff33e5845f6d7553e9cbc74 not found: ID does not exist" 
containerID="4e61cb4057ae82991620cedf09fb72a6f35de591cff33e5845f6d7553e9cbc74" Apr 22 17:39:14.817444 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.817392 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e61cb4057ae82991620cedf09fb72a6f35de591cff33e5845f6d7553e9cbc74"} err="failed to get container status \"4e61cb4057ae82991620cedf09fb72a6f35de591cff33e5845f6d7553e9cbc74\": rpc error: code = NotFound desc = could not find container \"4e61cb4057ae82991620cedf09fb72a6f35de591cff33e5845f6d7553e9cbc74\": container with ID starting with 4e61cb4057ae82991620cedf09fb72a6f35de591cff33e5845f6d7553e9cbc74 not found: ID does not exist" Apr 22 17:39:14.817444 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.817404 2539 scope.go:117] "RemoveContainer" containerID="9526ad3fc3f695c96f0f176cdf798557f8b48ad9fe7234988ad1cf46398b295f" Apr 22 17:39:14.817617 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:39:14.817601 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9526ad3fc3f695c96f0f176cdf798557f8b48ad9fe7234988ad1cf46398b295f\": container with ID starting with 9526ad3fc3f695c96f0f176cdf798557f8b48ad9fe7234988ad1cf46398b295f not found: ID does not exist" containerID="9526ad3fc3f695c96f0f176cdf798557f8b48ad9fe7234988ad1cf46398b295f" Apr 22 17:39:14.817657 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.817621 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9526ad3fc3f695c96f0f176cdf798557f8b48ad9fe7234988ad1cf46398b295f"} err="failed to get container status \"9526ad3fc3f695c96f0f176cdf798557f8b48ad9fe7234988ad1cf46398b295f\": rpc error: code = NotFound desc = could not find container \"9526ad3fc3f695c96f0f176cdf798557f8b48ad9fe7234988ad1cf46398b295f\": container with ID starting with 9526ad3fc3f695c96f0f176cdf798557f8b48ad9fe7234988ad1cf46398b295f not found: ID does not exist" Apr 22 
17:39:14.817657 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.817633 2539 scope.go:117] "RemoveContainer" containerID="c318ae6a026cdd1faf7cec43ab5491bf174c99f93b0c7e37f3b8d682b987c5ee" Apr 22 17:39:14.817844 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:39:14.817829 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c318ae6a026cdd1faf7cec43ab5491bf174c99f93b0c7e37f3b8d682b987c5ee\": container with ID starting with c318ae6a026cdd1faf7cec43ab5491bf174c99f93b0c7e37f3b8d682b987c5ee not found: ID does not exist" containerID="c318ae6a026cdd1faf7cec43ab5491bf174c99f93b0c7e37f3b8d682b987c5ee" Apr 22 17:39:14.817886 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.817847 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c318ae6a026cdd1faf7cec43ab5491bf174c99f93b0c7e37f3b8d682b987c5ee"} err="failed to get container status \"c318ae6a026cdd1faf7cec43ab5491bf174c99f93b0c7e37f3b8d682b987c5ee\": rpc error: code = NotFound desc = could not find container \"c318ae6a026cdd1faf7cec43ab5491bf174c99f93b0c7e37f3b8d682b987c5ee\": container with ID starting with c318ae6a026cdd1faf7cec43ab5491bf174c99f93b0c7e37f3b8d682b987c5ee not found: ID does not exist" Apr 22 17:39:14.823756 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.823737 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 17:39:14.824051 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.824026 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9605e64-41c6-4052-ad0f-04c386fbe607" containerName="init-config-reloader" Apr 22 17:39:14.824105 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.824054 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9605e64-41c6-4052-ad0f-04c386fbe607" containerName="init-config-reloader" Apr 22 17:39:14.824105 ip-10-0-132-165 kubenswrapper[2539]: I0422 
17:39:14.824063 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9605e64-41c6-4052-ad0f-04c386fbe607" containerName="kube-rbac-proxy-web" Apr 22 17:39:14.824105 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.824071 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9605e64-41c6-4052-ad0f-04c386fbe607" containerName="kube-rbac-proxy-web" Apr 22 17:39:14.824105 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.824084 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9605e64-41c6-4052-ad0f-04c386fbe607" containerName="config-reloader" Apr 22 17:39:14.824105 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.824089 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9605e64-41c6-4052-ad0f-04c386fbe607" containerName="config-reloader" Apr 22 17:39:14.824105 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.824099 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9605e64-41c6-4052-ad0f-04c386fbe607" containerName="prom-label-proxy" Apr 22 17:39:14.824105 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.824105 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9605e64-41c6-4052-ad0f-04c386fbe607" containerName="prom-label-proxy" Apr 22 17:39:14.824368 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.824112 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9605e64-41c6-4052-ad0f-04c386fbe607" containerName="kube-rbac-proxy-metric" Apr 22 17:39:14.824368 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.824117 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9605e64-41c6-4052-ad0f-04c386fbe607" containerName="kube-rbac-proxy-metric" Apr 22 17:39:14.824368 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.824127 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9605e64-41c6-4052-ad0f-04c386fbe607" containerName="alertmanager" Apr 22 
17:39:14.824368 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.824132 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9605e64-41c6-4052-ad0f-04c386fbe607" containerName="alertmanager" Apr 22 17:39:14.824368 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.824138 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9605e64-41c6-4052-ad0f-04c386fbe607" containerName="kube-rbac-proxy" Apr 22 17:39:14.824368 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.824146 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9605e64-41c6-4052-ad0f-04c386fbe607" containerName="kube-rbac-proxy" Apr 22 17:39:14.824368 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.824204 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="d9605e64-41c6-4052-ad0f-04c386fbe607" containerName="kube-rbac-proxy-metric" Apr 22 17:39:14.824368 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.824213 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="d9605e64-41c6-4052-ad0f-04c386fbe607" containerName="kube-rbac-proxy-web" Apr 22 17:39:14.824368 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.824221 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="d9605e64-41c6-4052-ad0f-04c386fbe607" containerName="alertmanager" Apr 22 17:39:14.824368 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.824229 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="d9605e64-41c6-4052-ad0f-04c386fbe607" containerName="prom-label-proxy" Apr 22 17:39:14.824368 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.824237 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="d9605e64-41c6-4052-ad0f-04c386fbe607" containerName="kube-rbac-proxy" Apr 22 17:39:14.824368 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.824245 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="d9605e64-41c6-4052-ad0f-04c386fbe607" containerName="config-reloader" Apr 22 
17:39:14.828847 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.828806 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.830920 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.830881 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 17:39:14.830920 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.830894 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 17:39:14.831099 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.830926 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 17:39:14.831099 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.830916 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 17:39:14.831099 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.831078 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-vln98\"" Apr 22 17:39:14.831249 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.831102 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 17:39:14.831403 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.831382 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 17:39:14.831491 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.831423 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 17:39:14.831491 
ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.831474 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 17:39:14.836465 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.836444 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 17:39:14.839410 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.839393 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 17:39:14.876604 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.876581 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/46c18525-4d26-4b0e-ba66-434b189645d1-config-out\") pod \"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.876687 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.876617 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/46c18525-4d26-4b0e-ba66-434b189645d1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.876687 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.876646 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/46c18525-4d26-4b0e-ba66-434b189645d1-config-volume\") pod \"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.876687 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.876665 2539 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/46c18525-4d26-4b0e-ba66-434b189645d1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.876795 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.876691 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/46c18525-4d26-4b0e-ba66-434b189645d1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.876795 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.876717 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/46c18525-4d26-4b0e-ba66-434b189645d1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.876795 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.876739 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/46c18525-4d26-4b0e-ba66-434b189645d1-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.876795 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.876753 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46c18525-4d26-4b0e-ba66-434b189645d1-alertmanager-trusted-ca-bundle\") pod 
\"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.876795 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.876774 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/46c18525-4d26-4b0e-ba66-434b189645d1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.876982 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.876846 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/46c18525-4d26-4b0e-ba66-434b189645d1-web-config\") pod \"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.876982 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.876890 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/46c18525-4d26-4b0e-ba66-434b189645d1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.876982 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.876963 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/46c18525-4d26-4b0e-ba66-434b189645d1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.877081 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.876985 2539 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6t9v\" (UniqueName: \"kubernetes.io/projected/46c18525-4d26-4b0e-ba66-434b189645d1-kube-api-access-t6t9v\") pod \"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.978120 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.978085 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/46c18525-4d26-4b0e-ba66-434b189645d1-config-out\") pod \"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.978219 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.978130 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/46c18525-4d26-4b0e-ba66-434b189645d1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.978219 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.978159 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/46c18525-4d26-4b0e-ba66-434b189645d1-config-volume\") pod \"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.978219 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.978184 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/46c18525-4d26-4b0e-ba66-434b189645d1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.978324 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.978216 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/46c18525-4d26-4b0e-ba66-434b189645d1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.978324 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.978244 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/46c18525-4d26-4b0e-ba66-434b189645d1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.978324 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.978277 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/46c18525-4d26-4b0e-ba66-434b189645d1-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.978453 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.978392 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46c18525-4d26-4b0e-ba66-434b189645d1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.978453 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.978434 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/46c18525-4d26-4b0e-ba66-434b189645d1-metrics-client-ca\") pod 
\"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.978564 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.978481 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/46c18525-4d26-4b0e-ba66-434b189645d1-web-config\") pod \"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.978564 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.978551 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/46c18525-4d26-4b0e-ba66-434b189645d1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.978664 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.978596 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/46c18525-4d26-4b0e-ba66-434b189645d1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.978664 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.978623 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t6t9v\" (UniqueName: \"kubernetes.io/projected/46c18525-4d26-4b0e-ba66-434b189645d1-kube-api-access-t6t9v\") pod \"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.980703 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.979994 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/46c18525-4d26-4b0e-ba66-434b189645d1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.980882 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.980661 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/46c18525-4d26-4b0e-ba66-434b189645d1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.981261 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.981209 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/46c18525-4d26-4b0e-ba66-434b189645d1-config-volume\") pod \"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.981261 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.981241 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/46c18525-4d26-4b0e-ba66-434b189645d1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.981408 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.981337 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/46c18525-4d26-4b0e-ba66-434b189645d1-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.981589 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.981566 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" 
(UniqueName: \"kubernetes.io/empty-dir/46c18525-4d26-4b0e-ba66-434b189645d1-config-out\") pod \"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.981691 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.981631 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/46c18525-4d26-4b0e-ba66-434b189645d1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.981691 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.981673 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/46c18525-4d26-4b0e-ba66-434b189645d1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.981857 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.981836 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/46c18525-4d26-4b0e-ba66-434b189645d1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.981895 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.981835 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/46c18525-4d26-4b0e-ba66-434b189645d1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:39:14.983179 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.983150 2539 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/46c18525-4d26-4b0e-ba66-434b189645d1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:39:14.983254 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.983234 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/46c18525-4d26-4b0e-ba66-434b189645d1-web-config\") pod \"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:39:14.986327 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:14.986311 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6t9v\" (UniqueName: \"kubernetes.io/projected/46c18525-4d26-4b0e-ba66-434b189645d1-kube-api-access-t6t9v\") pod \"alertmanager-main-0\" (UID: \"46c18525-4d26-4b0e-ba66-434b189645d1\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:39:15.138876 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:15.138827 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:39:15.266600 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:15.266575 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 17:39:15.268714 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:39:15.268689 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46c18525_4d26_4b0e_ba66_434b189645d1.slice/crio-bd56094d1166b1796eed33b70f8030e9ce56f90559c89fa3ebda647acf67d0a3 WatchSource:0}: Error finding container bd56094d1166b1796eed33b70f8030e9ce56f90559c89fa3ebda647acf67d0a3: Status 404 returned error can't find the container with id bd56094d1166b1796eed33b70f8030e9ce56f90559c89fa3ebda647acf67d0a3
Apr 22 17:39:15.773921 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:15.773874 2539 generic.go:358] "Generic (PLEG): container finished" podID="46c18525-4d26-4b0e-ba66-434b189645d1" containerID="6ba570e0bbeee35482cff74b0c66126683d7fe4743d7804571767d5f6b3826ae" exitCode=0
Apr 22 17:39:15.774374 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:15.773959 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"46c18525-4d26-4b0e-ba66-434b189645d1","Type":"ContainerDied","Data":"6ba570e0bbeee35482cff74b0c66126683d7fe4743d7804571767d5f6b3826ae"}
Apr 22 17:39:15.774374 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:15.773996 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"46c18525-4d26-4b0e-ba66-434b189645d1","Type":"ContainerStarted","Data":"bd56094d1166b1796eed33b70f8030e9ce56f90559c89fa3ebda647acf67d0a3"}
Apr 22 17:39:15.846689 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:15.846644 2539 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9605e64-41c6-4052-ad0f-04c386fbe607" path="/var/lib/kubelet/pods/d9605e64-41c6-4052-ad0f-04c386fbe607/volumes"
Apr 22 17:39:16.781129 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:16.781094 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"46c18525-4d26-4b0e-ba66-434b189645d1","Type":"ContainerStarted","Data":"0fc933c534248952b74211f6e92028abf3f64ceba48993527dba977b1231c72f"}
Apr 22 17:39:16.781129 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:16.781128 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"46c18525-4d26-4b0e-ba66-434b189645d1","Type":"ContainerStarted","Data":"e7564b31b8c5b6fa8e52e4a6239a7440a3c0c2a423a5dc1728a59a37b34c14ea"}
Apr 22 17:39:16.781129 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:16.781137 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"46c18525-4d26-4b0e-ba66-434b189645d1","Type":"ContainerStarted","Data":"ef467abd5dcaf0eb8f14538843288dc72474fb3ed19d645d59ec39dd56de72f6"}
Apr 22 17:39:16.781694 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:16.781146 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"46c18525-4d26-4b0e-ba66-434b189645d1","Type":"ContainerStarted","Data":"635e52b584af7c6213aa0070304ce56c89e756ad6ba8e38a2744c1be73e0c7ef"}
Apr 22 17:39:16.781694 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:16.781154 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"46c18525-4d26-4b0e-ba66-434b189645d1","Type":"ContainerStarted","Data":"97c5ea600390cb154a0e847bc456d31c8cb4aaaa8c81575327ff9585aad411e0"}
Apr 22 17:39:16.781694 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:16.781161 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"46c18525-4d26-4b0e-ba66-434b189645d1","Type":"ContainerStarted","Data":"63484466cd5dd923a6f15d3543fd196a0f009c32af83a1a81776a0bf147a7ea1"}
Apr 22 17:39:16.810145 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:16.810096 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.810081056 podStartE2EDuration="2.810081056s" podCreationTimestamp="2026-04-22 17:39:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:39:16.808378114 +0000 UTC m=+265.564462687" watchObservedRunningTime="2026-04-22 17:39:16.810081056 +0000 UTC m=+265.566165629"
Apr 22 17:39:19.945476 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:19.945430 2539 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-75668dc9d-clpwt"
Apr 22 17:39:19.945476 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:19.945481 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-75668dc9d-clpwt"
Apr 22 17:39:19.950198 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:19.950175 2539 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-75668dc9d-clpwt"
Apr 22 17:39:20.796319 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:20.796292 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-75668dc9d-clpwt"
Apr 22 17:39:20.842801 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:20.842768 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-55f9bf9b9c-zrz2b"]
Apr 22 17:39:45.864308 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:45.864233 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-55f9bf9b9c-zrz2b" podUID="2beb1ad6-4fd9-482f-830b-c3d7dadc7c01" containerName="console" containerID="cri-o://ed7d765b162d3b5bfb7057facf543458e4f7ad7fb9fb5333143c44c4df4f6e2c" gracePeriod=15
Apr 22 17:39:46.102615 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:46.102593 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55f9bf9b9c-zrz2b_2beb1ad6-4fd9-482f-830b-c3d7dadc7c01/console/0.log"
Apr 22 17:39:46.102728 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:46.102650 2539 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55f9bf9b9c-zrz2b"
Apr 22 17:39:46.235261 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:46.235228 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-console-config\") pod \"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01\" (UID: \"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01\") "
Apr 22 17:39:46.235391 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:46.235277 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvx7f\" (UniqueName: \"kubernetes.io/projected/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-kube-api-access-zvx7f\") pod \"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01\" (UID: \"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01\") "
Apr 22 17:39:46.235391 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:46.235295 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-trusted-ca-bundle\") pod \"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01\" (UID: \"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01\") "
Apr 22 17:39:46.235391 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:46.235316 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-oauth-serving-cert\") pod \"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01\" (UID: \"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01\") "
Apr 22 17:39:46.235391 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:46.235336 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-service-ca\") pod \"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01\" (UID: \"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01\") "
Apr 22 17:39:46.235391 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:46.235363 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-console-serving-cert\") pod \"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01\" (UID: \"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01\") "
Apr 22 17:39:46.235391 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:46.235390 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-console-oauth-config\") pod \"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01\" (UID: \"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01\") "
Apr 22 17:39:46.235779 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:46.235750 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-console-config" (OuterVolumeSpecName: "console-config") pod "2beb1ad6-4fd9-482f-830b-c3d7dadc7c01" (UID: "2beb1ad6-4fd9-482f-830b-c3d7dadc7c01"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:39:46.235945 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:46.235782 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "2beb1ad6-4fd9-482f-830b-c3d7dadc7c01" (UID: "2beb1ad6-4fd9-482f-830b-c3d7dadc7c01"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:39:46.235945 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:46.235788 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-service-ca" (OuterVolumeSpecName: "service-ca") pod "2beb1ad6-4fd9-482f-830b-c3d7dadc7c01" (UID: "2beb1ad6-4fd9-482f-830b-c3d7dadc7c01"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:39:46.235945 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:46.235795 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "2beb1ad6-4fd9-482f-830b-c3d7dadc7c01" (UID: "2beb1ad6-4fd9-482f-830b-c3d7dadc7c01"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:39:46.237363 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:46.237335 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "2beb1ad6-4fd9-482f-830b-c3d7dadc7c01" (UID: "2beb1ad6-4fd9-482f-830b-c3d7dadc7c01"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:39:46.237444 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:46.237412 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "2beb1ad6-4fd9-482f-830b-c3d7dadc7c01" (UID: "2beb1ad6-4fd9-482f-830b-c3d7dadc7c01"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:39:46.237485 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:46.237435 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-kube-api-access-zvx7f" (OuterVolumeSpecName: "kube-api-access-zvx7f") pod "2beb1ad6-4fd9-482f-830b-c3d7dadc7c01" (UID: "2beb1ad6-4fd9-482f-830b-c3d7dadc7c01"). InnerVolumeSpecName "kube-api-access-zvx7f". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 17:39:46.336622 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:46.336598 2539 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-console-config\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\""
Apr 22 17:39:46.336622 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:46.336620 2539 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zvx7f\" (UniqueName: \"kubernetes.io/projected/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-kube-api-access-zvx7f\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\""
Apr 22 17:39:46.336743 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:46.336630 2539 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-trusted-ca-bundle\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\""
Apr 22 17:39:46.336743 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:46.336639 2539 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-oauth-serving-cert\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\""
Apr 22 17:39:46.336743 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:46.336648 2539 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-service-ca\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\""
Apr 22 17:39:46.336743 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:46.336656 2539 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-console-serving-cert\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\""
Apr 22 17:39:46.336743 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:46.336665 2539 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01-console-oauth-config\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\""
Apr 22 17:39:46.870479 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:46.870455 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55f9bf9b9c-zrz2b_2beb1ad6-4fd9-482f-830b-c3d7dadc7c01/console/0.log"
Apr 22 17:39:46.870856 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:46.870494 2539 generic.go:358] "Generic (PLEG): container finished" podID="2beb1ad6-4fd9-482f-830b-c3d7dadc7c01" containerID="ed7d765b162d3b5bfb7057facf543458e4f7ad7fb9fb5333143c44c4df4f6e2c" exitCode=2
Apr 22 17:39:46.870856 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:46.870543 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55f9bf9b9c-zrz2b" event={"ID":"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01","Type":"ContainerDied","Data":"ed7d765b162d3b5bfb7057facf543458e4f7ad7fb9fb5333143c44c4df4f6e2c"}
Apr 22 17:39:46.870856 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:46.870570 2539 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55f9bf9b9c-zrz2b"
Apr 22 17:39:46.870856 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:46.870587 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55f9bf9b9c-zrz2b" event={"ID":"2beb1ad6-4fd9-482f-830b-c3d7dadc7c01","Type":"ContainerDied","Data":"b768818b935934d8f2d0eadd2efd074877a5439303e05a9d7a451fd5501cef2f"}
Apr 22 17:39:46.870856 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:46.870604 2539 scope.go:117] "RemoveContainer" containerID="ed7d765b162d3b5bfb7057facf543458e4f7ad7fb9fb5333143c44c4df4f6e2c"
Apr 22 17:39:46.879554 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:46.879536 2539 scope.go:117] "RemoveContainer" containerID="ed7d765b162d3b5bfb7057facf543458e4f7ad7fb9fb5333143c44c4df4f6e2c"
Apr 22 17:39:46.879811 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:39:46.879794 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed7d765b162d3b5bfb7057facf543458e4f7ad7fb9fb5333143c44c4df4f6e2c\": container with ID starting with ed7d765b162d3b5bfb7057facf543458e4f7ad7fb9fb5333143c44c4df4f6e2c not found: ID does not exist" containerID="ed7d765b162d3b5bfb7057facf543458e4f7ad7fb9fb5333143c44c4df4f6e2c"
Apr 22 17:39:46.879855 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:46.879817 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed7d765b162d3b5bfb7057facf543458e4f7ad7fb9fb5333143c44c4df4f6e2c"} err="failed to get container status \"ed7d765b162d3b5bfb7057facf543458e4f7ad7fb9fb5333143c44c4df4f6e2c\": rpc error: code = NotFound desc = could not find container \"ed7d765b162d3b5bfb7057facf543458e4f7ad7fb9fb5333143c44c4df4f6e2c\": container with ID starting with ed7d765b162d3b5bfb7057facf543458e4f7ad7fb9fb5333143c44c4df4f6e2c not found: ID does not exist"
Apr 22 17:39:46.891075 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:46.891046 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-55f9bf9b9c-zrz2b"]
Apr 22 17:39:46.894360 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:46.894333 2539 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-55f9bf9b9c-zrz2b"]
Apr 22 17:39:47.846014 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:47.845975 2539 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2beb1ad6-4fd9-482f-830b-c3d7dadc7c01" path="/var/lib/kubelet/pods/2beb1ad6-4fd9-482f-830b-c3d7dadc7c01/volumes"
Apr 22 17:39:51.728126 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:51.728097 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lmv9z_c64def61-a93f-45f9-a89f-c469f37561b6/console-operator/1.log"
Apr 22 17:39:51.728604 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:51.728285 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lmv9z_c64def61-a93f-45f9-a89f-c469f37561b6/console-operator/1.log"
Apr 22 17:39:51.742274 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:39:51.742251 2539 kubelet.go:1628] "Image garbage collection succeeded"
Apr 22 17:40:32.445525 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:32.445482 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-tk6sb"]
Apr 22 17:40:32.446069 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:32.445944 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2beb1ad6-4fd9-482f-830b-c3d7dadc7c01" containerName="console"
Apr 22 17:40:32.446069 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:32.445962 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="2beb1ad6-4fd9-482f-830b-c3d7dadc7c01" containerName="console"
Apr 22 17:40:32.446069 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:32.446040 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="2beb1ad6-4fd9-482f-830b-c3d7dadc7c01" containerName="console"
Apr 22 17:40:32.448074 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:32.448053 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-tk6sb"
Apr 22 17:40:32.450207 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:32.450185 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 22 17:40:32.450805 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:32.450785 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-9llf4\""
Apr 22 17:40:32.450805 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:32.450798 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 22 17:40:32.450924 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:32.450856 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 22 17:40:32.461862 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:32.461842 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-tk6sb"]
Apr 22 17:40:32.602530 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:32.602495 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/46c00965-40df-4923-a7fd-57982c652e52-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-tk6sb\" (UID: \"46c00965-40df-4923-a7fd-57982c652e52\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-tk6sb"
Apr 22 17:40:32.602696 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:32.602554 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p47pf\" (UniqueName: \"kubernetes.io/projected/46c00965-40df-4923-a7fd-57982c652e52-kube-api-access-p47pf\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-tk6sb\" (UID: \"46c00965-40df-4923-a7fd-57982c652e52\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-tk6sb"
Apr 22 17:40:32.703474 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:32.703440 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p47pf\" (UniqueName: \"kubernetes.io/projected/46c00965-40df-4923-a7fd-57982c652e52-kube-api-access-p47pf\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-tk6sb\" (UID: \"46c00965-40df-4923-a7fd-57982c652e52\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-tk6sb"
Apr 22 17:40:32.703632 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:32.703525 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/46c00965-40df-4923-a7fd-57982c652e52-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-tk6sb\" (UID: \"46c00965-40df-4923-a7fd-57982c652e52\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-tk6sb"
Apr 22 17:40:32.705821 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:32.705799 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/46c00965-40df-4923-a7fd-57982c652e52-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-tk6sb\" (UID: \"46c00965-40df-4923-a7fd-57982c652e52\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-tk6sb"
Apr 22 17:40:32.711339 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:32.711311 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p47pf\" (UniqueName: \"kubernetes.io/projected/46c00965-40df-4923-a7fd-57982c652e52-kube-api-access-p47pf\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-tk6sb\" (UID: \"46c00965-40df-4923-a7fd-57982c652e52\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-tk6sb"
Apr 22 17:40:32.758185 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:32.758146 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-tk6sb"
Apr 22 17:40:32.901579 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:32.901551 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-tk6sb"]
Apr 22 17:40:32.904409 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:40:32.904381 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46c00965_40df_4923_a7fd_57982c652e52.slice/crio-823f782c8f84efb70456b727cf24a0a47fcf1d0064f6c44634689dee523fc345 WatchSource:0}: Error finding container 823f782c8f84efb70456b727cf24a0a47fcf1d0064f6c44634689dee523fc345: Status 404 returned error can't find the container with id 823f782c8f84efb70456b727cf24a0a47fcf1d0064f6c44634689dee523fc345
Apr 22 17:40:32.906060 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:32.906039 2539 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 17:40:33.000894 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:33.000817 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-tk6sb" event={"ID":"46c00965-40df-4923-a7fd-57982c652e52","Type":"ContainerStarted","Data":"823f782c8f84efb70456b727cf24a0a47fcf1d0064f6c44634689dee523fc345"}
Apr 22 17:40:37.018263 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:37.018222 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-tk6sb" event={"ID":"46c00965-40df-4923-a7fd-57982c652e52","Type":"ContainerStarted","Data":"e767dc7def87759b66b3e9985a2aa7c3bac28587013f852d362a22f1cfb86b25"}
Apr 22 17:40:37.018628 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:37.018447 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-tk6sb"
Apr 22 17:40:37.043855 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:37.043806 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-tk6sb" podStartSLOduration=1.444394768 podStartE2EDuration="5.04379417s" podCreationTimestamp="2026-04-22 17:40:32 +0000 UTC" firstStartedPulling="2026-04-22 17:40:32.906165962 +0000 UTC m=+341.662250515" lastFinishedPulling="2026-04-22 17:40:36.505565361 +0000 UTC m=+345.261649917" observedRunningTime="2026-04-22 17:40:37.042560136 +0000 UTC m=+345.798644708" watchObservedRunningTime="2026-04-22 17:40:37.04379417 +0000 UTC m=+345.799878742"
Apr 22 17:40:37.610661 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:37.610627 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-qwc6h"]
Apr 22 17:40:37.613713 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:37.613695 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-qwc6h"
Apr 22 17:40:37.616005 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:37.615983 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 22 17:40:37.616119 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:37.615988 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-jc8ss\""
Apr 22 17:40:37.616170 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:37.616128 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 22 17:40:37.624543 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:37.624525 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-qwc6h"]
Apr 22 17:40:37.753758 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:37.753735 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e64ae358-cbf7-4408-957a-1249589a2d8a-certificates\") pod \"keda-admission-cf49989db-qwc6h\" (UID: \"e64ae358-cbf7-4408-957a-1249589a2d8a\") " pod="openshift-keda/keda-admission-cf49989db-qwc6h"
Apr 22 17:40:37.753893 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:37.753856 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9bz9\" (UniqueName: \"kubernetes.io/projected/e64ae358-cbf7-4408-957a-1249589a2d8a-kube-api-access-n9bz9\") pod \"keda-admission-cf49989db-qwc6h\" (UID: \"e64ae358-cbf7-4408-957a-1249589a2d8a\") " pod="openshift-keda/keda-admission-cf49989db-qwc6h"
Apr 22 17:40:37.855255 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:37.855228 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n9bz9\" (UniqueName: \"kubernetes.io/projected/e64ae358-cbf7-4408-957a-1249589a2d8a-kube-api-access-n9bz9\") pod \"keda-admission-cf49989db-qwc6h\" (UID: \"e64ae358-cbf7-4408-957a-1249589a2d8a\") " pod="openshift-keda/keda-admission-cf49989db-qwc6h"
Apr 22 17:40:37.855386 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:37.855285 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e64ae358-cbf7-4408-957a-1249589a2d8a-certificates\") pod \"keda-admission-cf49989db-qwc6h\" (UID: \"e64ae358-cbf7-4408-957a-1249589a2d8a\") " pod="openshift-keda/keda-admission-cf49989db-qwc6h"
Apr 22 17:40:37.857829 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:37.857804 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e64ae358-cbf7-4408-957a-1249589a2d8a-certificates\") pod \"keda-admission-cf49989db-qwc6h\" (UID: \"e64ae358-cbf7-4408-957a-1249589a2d8a\") " pod="openshift-keda/keda-admission-cf49989db-qwc6h"
Apr 22 17:40:37.873890 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:37.873834 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9bz9\" (UniqueName: \"kubernetes.io/projected/e64ae358-cbf7-4408-957a-1249589a2d8a-kube-api-access-n9bz9\") pod \"keda-admission-cf49989db-qwc6h\" (UID: \"e64ae358-cbf7-4408-957a-1249589a2d8a\") " pod="openshift-keda/keda-admission-cf49989db-qwc6h"
Apr 22 17:40:37.923834 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:37.923807 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-qwc6h"
Apr 22 17:40:38.041790 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:38.041765 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-qwc6h"]
Apr 22 17:40:38.044418 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:40:38.044377 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode64ae358_cbf7_4408_957a_1249589a2d8a.slice/crio-4f168e0e33fb0057578d54b26e4789568032be592f16abee0835792c52d236a6 WatchSource:0}: Error finding container 4f168e0e33fb0057578d54b26e4789568032be592f16abee0835792c52d236a6: Status 404 returned error can't find the container with id 4f168e0e33fb0057578d54b26e4789568032be592f16abee0835792c52d236a6
Apr 22 17:40:39.026205 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:39.026168 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-qwc6h" event={"ID":"e64ae358-cbf7-4408-957a-1249589a2d8a","Type":"ContainerStarted","Data":"4f168e0e33fb0057578d54b26e4789568032be592f16abee0835792c52d236a6"}
Apr 22 17:40:40.030980 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:40.030941 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-qwc6h" event={"ID":"e64ae358-cbf7-4408-957a-1249589a2d8a","Type":"ContainerStarted","Data":"90c86fe63d04a8f4b4246c38a1403cac5941876b71fc326f85619d7d573d7525"}
Apr 22 17:40:40.031371 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:40.031082 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-qwc6h"
Apr 22 17:40:40.047629 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:40.047577 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-qwc6h" podStartSLOduration=1.4629194939999999 podStartE2EDuration="3.047560051s" podCreationTimestamp="2026-04-22 17:40:37 +0000 UTC" firstStartedPulling="2026-04-22 17:40:38.045661515 +0000 UTC m=+346.801746066" lastFinishedPulling="2026-04-22 17:40:39.630302066 +0000 UTC m=+348.386386623" observedRunningTime="2026-04-22 17:40:40.046530403 +0000 UTC m=+348.802614954" watchObservedRunningTime="2026-04-22 17:40:40.047560051 +0000 UTC m=+348.803644627"
Apr 22 17:40:58.024814 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:40:58.024778 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-tk6sb"
Apr 22 17:41:01.036474 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:41:01.036435 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-qwc6h"
Apr 22 17:41:45.125316 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:41:45.125283 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-84ffddfb66-54vsg"]
Apr 22 17:41:45.128627 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:41:45.128598 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84ffddfb66-54vsg"
Apr 22 17:41:45.130043 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:41:45.130017 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-cjzn9"]
Apr 22 17:41:45.131607 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:41:45.131585 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\""
Apr 22 17:41:45.131724 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:41:45.131626 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 22 17:41:45.132238 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:41:45.132219 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-6stqs\""
Apr 22 17:41:45.132238 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:41:45.132230 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 22 17:41:45.133175 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:41:45.133158 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-cjzn9"
Apr 22 17:41:45.135379 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:41:45.135356 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 22 17:41:45.135575 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:41:45.135557 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-5vzjj\""
Apr 22 17:41:45.138743 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:41:45.138715 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84ffddfb66-54vsg"]
Apr 22 17:41:45.143615 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:41:45.143584 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-cjzn9"]
Apr 22 17:41:45.210545 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:41:45.210507 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mpnx\" (UniqueName: \"kubernetes.io/projected/36c34fba-251f-4a5d-8bfe-a4dcb9d4016d-kube-api-access-5mpnx\") pod \"kserve-controller-manager-84ffddfb66-54vsg\" (UID: \"36c34fba-251f-4a5d-8bfe-a4dcb9d4016d\") " pod="kserve/kserve-controller-manager-84ffddfb66-54vsg"
Apr 22 17:41:45.210545 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:41:45.210550 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/36c34fba-251f-4a5d-8bfe-a4dcb9d4016d-cert\") pod \"kserve-controller-manager-84ffddfb66-54vsg\" (UID: \"36c34fba-251f-4a5d-8bfe-a4dcb9d4016d\") " pod="kserve/kserve-controller-manager-84ffddfb66-54vsg"
Apr 22 17:41:45.210806 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:41:45.210617 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc8g6\" (UniqueName: \"kubernetes.io/projected/ef8f9195-84bc-40db-8f31-73200f673df3-kube-api-access-bc8g6\") pod \"llmisvc-controller-manager-68cc5db7c4-cjzn9\" (UID: \"ef8f9195-84bc-40db-8f31-73200f673df3\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-cjzn9"
Apr 22 17:41:45.210806 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:41:45.210711 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef8f9195-84bc-40db-8f31-73200f673df3-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-cjzn9\" (UID: \"ef8f9195-84bc-40db-8f31-73200f673df3\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-cjzn9"
Apr 22 17:41:45.311535 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:41:45.311498 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5mpnx\" (UniqueName: \"kubernetes.io/projected/36c34fba-251f-4a5d-8bfe-a4dcb9d4016d-kube-api-access-5mpnx\") pod \"kserve-controller-manager-84ffddfb66-54vsg\" (UID: \"36c34fba-251f-4a5d-8bfe-a4dcb9d4016d\") " pod="kserve/kserve-controller-manager-84ffddfb66-54vsg"
Apr 22 17:41:45.311727 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:41:45.311548 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/36c34fba-251f-4a5d-8bfe-a4dcb9d4016d-cert\") pod \"kserve-controller-manager-84ffddfb66-54vsg\" (UID: \"36c34fba-251f-4a5d-8bfe-a4dcb9d4016d\") " pod="kserve/kserve-controller-manager-84ffddfb66-54vsg"
Apr 22 17:41:45.311727 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:41:45.311580 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bc8g6\" (UniqueName: \"kubernetes.io/projected/ef8f9195-84bc-40db-8f31-73200f673df3-kube-api-access-bc8g6\") pod \"llmisvc-controller-manager-68cc5db7c4-cjzn9\" (UID: \"ef8f9195-84bc-40db-8f31-73200f673df3\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-cjzn9"
Apr 22 17:41:45.311727 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:41:45.311631 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef8f9195-84bc-40db-8f31-73200f673df3-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-cjzn9\" (UID: \"ef8f9195-84bc-40db-8f31-73200f673df3\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-cjzn9" Apr 22 17:41:45.314051 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:41:45.314021 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/36c34fba-251f-4a5d-8bfe-a4dcb9d4016d-cert\") pod \"kserve-controller-manager-84ffddfb66-54vsg\" (UID: \"36c34fba-251f-4a5d-8bfe-a4dcb9d4016d\") " pod="kserve/kserve-controller-manager-84ffddfb66-54vsg" Apr 22 17:41:45.314167 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:41:45.314119 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef8f9195-84bc-40db-8f31-73200f673df3-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-cjzn9\" (UID: \"ef8f9195-84bc-40db-8f31-73200f673df3\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-cjzn9" Apr 22 17:41:45.319262 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:41:45.319235 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc8g6\" (UniqueName: \"kubernetes.io/projected/ef8f9195-84bc-40db-8f31-73200f673df3-kube-api-access-bc8g6\") pod \"llmisvc-controller-manager-68cc5db7c4-cjzn9\" (UID: \"ef8f9195-84bc-40db-8f31-73200f673df3\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-cjzn9" Apr 22 17:41:45.319364 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:41:45.319277 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mpnx\" (UniqueName: \"kubernetes.io/projected/36c34fba-251f-4a5d-8bfe-a4dcb9d4016d-kube-api-access-5mpnx\") pod \"kserve-controller-manager-84ffddfb66-54vsg\" (UID: 
\"36c34fba-251f-4a5d-8bfe-a4dcb9d4016d\") " pod="kserve/kserve-controller-manager-84ffddfb66-54vsg" Apr 22 17:41:45.441664 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:41:45.441570 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84ffddfb66-54vsg" Apr 22 17:41:45.449422 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:41:45.449388 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-cjzn9" Apr 22 17:41:45.573085 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:41:45.573054 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84ffddfb66-54vsg"] Apr 22 17:41:45.575984 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:41:45.575954 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36c34fba_251f_4a5d_8bfe_a4dcb9d4016d.slice/crio-60e377d6a61fb62575a3f06a2ee9f9b1eb5d334fdef16b42bdd8ef695bdd9f20 WatchSource:0}: Error finding container 60e377d6a61fb62575a3f06a2ee9f9b1eb5d334fdef16b42bdd8ef695bdd9f20: Status 404 returned error can't find the container with id 60e377d6a61fb62575a3f06a2ee9f9b1eb5d334fdef16b42bdd8ef695bdd9f20 Apr 22 17:41:45.595880 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:41:45.595850 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-cjzn9"] Apr 22 17:41:45.599188 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:41:45.599159 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podef8f9195_84bc_40db_8f31_73200f673df3.slice/crio-b6bfa3bb43cbf72ea0ba2e104e951ae310d2efa15266da01e4ada5f5199769ef WatchSource:0}: Error finding container b6bfa3bb43cbf72ea0ba2e104e951ae310d2efa15266da01e4ada5f5199769ef: Status 404 returned error can't find the container with id b6bfa3bb43cbf72ea0ba2e104e951ae310d2efa15266da01e4ada5f5199769ef 
Apr 22 17:41:46.248186 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:41:46.248148 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84ffddfb66-54vsg" event={"ID":"36c34fba-251f-4a5d-8bfe-a4dcb9d4016d","Type":"ContainerStarted","Data":"60e377d6a61fb62575a3f06a2ee9f9b1eb5d334fdef16b42bdd8ef695bdd9f20"} Apr 22 17:41:46.249654 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:41:46.249622 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-cjzn9" event={"ID":"ef8f9195-84bc-40db-8f31-73200f673df3","Type":"ContainerStarted","Data":"b6bfa3bb43cbf72ea0ba2e104e951ae310d2efa15266da01e4ada5f5199769ef"} Apr 22 17:41:49.261831 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:41:49.261792 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84ffddfb66-54vsg" event={"ID":"36c34fba-251f-4a5d-8bfe-a4dcb9d4016d","Type":"ContainerStarted","Data":"3917b6e259b6ce5200bc558549b0b5c5c499761897b20afebf1bc0b9cb3841f6"} Apr 22 17:41:49.262256 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:41:49.261864 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-84ffddfb66-54vsg" Apr 22 17:41:49.263099 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:41:49.263071 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-cjzn9" event={"ID":"ef8f9195-84bc-40db-8f31-73200f673df3","Type":"ContainerStarted","Data":"726e9a30f4168bf6f254ace8167f599b11666edc3dbff2d2397a3b701ef8ba3a"} Apr 22 17:41:49.263226 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:41:49.263191 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-cjzn9" Apr 22 17:41:49.278263 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:41:49.278225 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve/kserve-controller-manager-84ffddfb66-54vsg" podStartSLOduration=0.989590949 podStartE2EDuration="4.278212488s" podCreationTimestamp="2026-04-22 17:41:45 +0000 UTC" firstStartedPulling="2026-04-22 17:41:45.577755406 +0000 UTC m=+414.333839965" lastFinishedPulling="2026-04-22 17:41:48.866376949 +0000 UTC m=+417.622461504" observedRunningTime="2026-04-22 17:41:49.277684315 +0000 UTC m=+418.033768884" watchObservedRunningTime="2026-04-22 17:41:49.278212488 +0000 UTC m=+418.034297060" Apr 22 17:41:49.292390 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:41:49.292343 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-cjzn9" podStartSLOduration=0.989258705 podStartE2EDuration="4.292330351s" podCreationTimestamp="2026-04-22 17:41:45 +0000 UTC" firstStartedPulling="2026-04-22 17:41:45.600469533 +0000 UTC m=+414.356554084" lastFinishedPulling="2026-04-22 17:41:48.903541176 +0000 UTC m=+417.659625730" observedRunningTime="2026-04-22 17:41:49.291820237 +0000 UTC m=+418.047904809" watchObservedRunningTime="2026-04-22 17:41:49.292330351 +0000 UTC m=+418.048414925" Apr 22 17:42:20.268537 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:20.268448 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-cjzn9" Apr 22 17:42:20.271708 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:20.271688 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-84ffddfb66-54vsg" Apr 22 17:42:21.425528 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:21.425495 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-84ffddfb66-54vsg"] Apr 22 17:42:21.425958 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:21.425702 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-84ffddfb66-54vsg" 
podUID="36c34fba-251f-4a5d-8bfe-a4dcb9d4016d" containerName="manager" containerID="cri-o://3917b6e259b6ce5200bc558549b0b5c5c499761897b20afebf1bc0b9cb3841f6" gracePeriod=10 Apr 22 17:42:21.447568 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:21.447541 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-84ffddfb66-mxsvq"] Apr 22 17:42:21.450761 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:21.450745 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84ffddfb66-mxsvq" Apr 22 17:42:21.458682 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:21.458652 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84ffddfb66-mxsvq"] Apr 22 17:42:21.515477 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:21.515445 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt6jr\" (UniqueName: \"kubernetes.io/projected/6cf17ecc-ffab-4ad3-9b83-1f427318bee0-kube-api-access-pt6jr\") pod \"kserve-controller-manager-84ffddfb66-mxsvq\" (UID: \"6cf17ecc-ffab-4ad3-9b83-1f427318bee0\") " pod="kserve/kserve-controller-manager-84ffddfb66-mxsvq" Apr 22 17:42:21.515664 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:21.515493 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6cf17ecc-ffab-4ad3-9b83-1f427318bee0-cert\") pod \"kserve-controller-manager-84ffddfb66-mxsvq\" (UID: \"6cf17ecc-ffab-4ad3-9b83-1f427318bee0\") " pod="kserve/kserve-controller-manager-84ffddfb66-mxsvq" Apr 22 17:42:21.616121 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:21.616078 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pt6jr\" (UniqueName: \"kubernetes.io/projected/6cf17ecc-ffab-4ad3-9b83-1f427318bee0-kube-api-access-pt6jr\") pod 
\"kserve-controller-manager-84ffddfb66-mxsvq\" (UID: \"6cf17ecc-ffab-4ad3-9b83-1f427318bee0\") " pod="kserve/kserve-controller-manager-84ffddfb66-mxsvq" Apr 22 17:42:21.616297 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:21.616143 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6cf17ecc-ffab-4ad3-9b83-1f427318bee0-cert\") pod \"kserve-controller-manager-84ffddfb66-mxsvq\" (UID: \"6cf17ecc-ffab-4ad3-9b83-1f427318bee0\") " pod="kserve/kserve-controller-manager-84ffddfb66-mxsvq" Apr 22 17:42:21.618740 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:21.618712 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6cf17ecc-ffab-4ad3-9b83-1f427318bee0-cert\") pod \"kserve-controller-manager-84ffddfb66-mxsvq\" (UID: \"6cf17ecc-ffab-4ad3-9b83-1f427318bee0\") " pod="kserve/kserve-controller-manager-84ffddfb66-mxsvq" Apr 22 17:42:21.624110 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:21.624086 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt6jr\" (UniqueName: \"kubernetes.io/projected/6cf17ecc-ffab-4ad3-9b83-1f427318bee0-kube-api-access-pt6jr\") pod \"kserve-controller-manager-84ffddfb66-mxsvq\" (UID: \"6cf17ecc-ffab-4ad3-9b83-1f427318bee0\") " pod="kserve/kserve-controller-manager-84ffddfb66-mxsvq" Apr 22 17:42:21.667225 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:21.667198 2539 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84ffddfb66-54vsg" Apr 22 17:42:21.797453 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:21.797406 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-84ffddfb66-mxsvq" Apr 22 17:42:21.818423 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:21.818392 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mpnx\" (UniqueName: \"kubernetes.io/projected/36c34fba-251f-4a5d-8bfe-a4dcb9d4016d-kube-api-access-5mpnx\") pod \"36c34fba-251f-4a5d-8bfe-a4dcb9d4016d\" (UID: \"36c34fba-251f-4a5d-8bfe-a4dcb9d4016d\") " Apr 22 17:42:21.818570 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:21.818439 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/36c34fba-251f-4a5d-8bfe-a4dcb9d4016d-cert\") pod \"36c34fba-251f-4a5d-8bfe-a4dcb9d4016d\" (UID: \"36c34fba-251f-4a5d-8bfe-a4dcb9d4016d\") " Apr 22 17:42:21.820594 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:21.820563 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36c34fba-251f-4a5d-8bfe-a4dcb9d4016d-cert" (OuterVolumeSpecName: "cert") pod "36c34fba-251f-4a5d-8bfe-a4dcb9d4016d" (UID: "36c34fba-251f-4a5d-8bfe-a4dcb9d4016d"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:42:21.820689 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:21.820607 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36c34fba-251f-4a5d-8bfe-a4dcb9d4016d-kube-api-access-5mpnx" (OuterVolumeSpecName: "kube-api-access-5mpnx") pod "36c34fba-251f-4a5d-8bfe-a4dcb9d4016d" (UID: "36c34fba-251f-4a5d-8bfe-a4dcb9d4016d"). InnerVolumeSpecName "kube-api-access-5mpnx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:42:21.919481 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:21.919450 2539 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5mpnx\" (UniqueName: \"kubernetes.io/projected/36c34fba-251f-4a5d-8bfe-a4dcb9d4016d-kube-api-access-5mpnx\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\"" Apr 22 17:42:21.919481 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:21.919479 2539 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/36c34fba-251f-4a5d-8bfe-a4dcb9d4016d-cert\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\"" Apr 22 17:42:21.925235 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:21.925210 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84ffddfb66-mxsvq"] Apr 22 17:42:21.927575 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:42:21.927550 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cf17ecc_ffab_4ad3_9b83_1f427318bee0.slice/crio-79197bcd9af3d12513a7140dc0c47c1cfa5b52ce69e07fd66bfd485a639b8a03 WatchSource:0}: Error finding container 79197bcd9af3d12513a7140dc0c47c1cfa5b52ce69e07fd66bfd485a639b8a03: Status 404 returned error can't find the container with id 79197bcd9af3d12513a7140dc0c47c1cfa5b52ce69e07fd66bfd485a639b8a03 Apr 22 17:42:22.375658 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:22.375631 2539 generic.go:358] "Generic (PLEG): container finished" podID="36c34fba-251f-4a5d-8bfe-a4dcb9d4016d" containerID="3917b6e259b6ce5200bc558549b0b5c5c499761897b20afebf1bc0b9cb3841f6" exitCode=0 Apr 22 17:42:22.375788 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:22.375696 2539 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-84ffddfb66-54vsg" Apr 22 17:42:22.375788 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:22.375707 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84ffddfb66-54vsg" event={"ID":"36c34fba-251f-4a5d-8bfe-a4dcb9d4016d","Type":"ContainerDied","Data":"3917b6e259b6ce5200bc558549b0b5c5c499761897b20afebf1bc0b9cb3841f6"} Apr 22 17:42:22.375788 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:22.375746 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84ffddfb66-54vsg" event={"ID":"36c34fba-251f-4a5d-8bfe-a4dcb9d4016d","Type":"ContainerDied","Data":"60e377d6a61fb62575a3f06a2ee9f9b1eb5d334fdef16b42bdd8ef695bdd9f20"} Apr 22 17:42:22.375788 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:22.375764 2539 scope.go:117] "RemoveContainer" containerID="3917b6e259b6ce5200bc558549b0b5c5c499761897b20afebf1bc0b9cb3841f6" Apr 22 17:42:22.377150 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:22.377115 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84ffddfb66-mxsvq" event={"ID":"6cf17ecc-ffab-4ad3-9b83-1f427318bee0","Type":"ContainerStarted","Data":"79197bcd9af3d12513a7140dc0c47c1cfa5b52ce69e07fd66bfd485a639b8a03"} Apr 22 17:42:22.384085 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:22.384064 2539 scope.go:117] "RemoveContainer" containerID="3917b6e259b6ce5200bc558549b0b5c5c499761897b20afebf1bc0b9cb3841f6" Apr 22 17:42:22.384474 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:42:22.384442 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3917b6e259b6ce5200bc558549b0b5c5c499761897b20afebf1bc0b9cb3841f6\": container with ID starting with 3917b6e259b6ce5200bc558549b0b5c5c499761897b20afebf1bc0b9cb3841f6 not found: ID does not exist" containerID="3917b6e259b6ce5200bc558549b0b5c5c499761897b20afebf1bc0b9cb3841f6" Apr 22 
17:42:22.384596 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:22.384493 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3917b6e259b6ce5200bc558549b0b5c5c499761897b20afebf1bc0b9cb3841f6"} err="failed to get container status \"3917b6e259b6ce5200bc558549b0b5c5c499761897b20afebf1bc0b9cb3841f6\": rpc error: code = NotFound desc = could not find container \"3917b6e259b6ce5200bc558549b0b5c5c499761897b20afebf1bc0b9cb3841f6\": container with ID starting with 3917b6e259b6ce5200bc558549b0b5c5c499761897b20afebf1bc0b9cb3841f6 not found: ID does not exist" Apr 22 17:42:22.392740 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:22.392712 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-84ffddfb66-54vsg"] Apr 22 17:42:22.396255 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:22.396239 2539 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-84ffddfb66-54vsg"] Apr 22 17:42:23.381986 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:23.381947 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84ffddfb66-mxsvq" event={"ID":"6cf17ecc-ffab-4ad3-9b83-1f427318bee0","Type":"ContainerStarted","Data":"7e324052f8ac683dc2777a8da8a7c5df00535abaafb2446153ee0762ba7ba6c2"} Apr 22 17:42:23.382411 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:23.382209 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-84ffddfb66-mxsvq" Apr 22 17:42:23.400535 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:23.400483 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-84ffddfb66-mxsvq" podStartSLOduration=1.980768246 podStartE2EDuration="2.400467778s" podCreationTimestamp="2026-04-22 17:42:21 +0000 UTC" firstStartedPulling="2026-04-22 17:42:21.928783877 +0000 UTC m=+450.684868429" lastFinishedPulling="2026-04-22 
17:42:22.348483406 +0000 UTC m=+451.104567961" observedRunningTime="2026-04-22 17:42:23.398279936 +0000 UTC m=+452.154364509" watchObservedRunningTime="2026-04-22 17:42:23.400467778 +0000 UTC m=+452.156552350" Apr 22 17:42:23.846407 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:23.846373 2539 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36c34fba-251f-4a5d-8bfe-a4dcb9d4016d" path="/var/lib/kubelet/pods/36c34fba-251f-4a5d-8bfe-a4dcb9d4016d/volumes" Apr 22 17:42:54.390276 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:54.390244 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-84ffddfb66-mxsvq" Apr 22 17:42:55.222089 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:55.222057 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-xcwvx"] Apr 22 17:42:55.222430 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:55.222416 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36c34fba-251f-4a5d-8bfe-a4dcb9d4016d" containerName="manager" Apr 22 17:42:55.222483 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:55.222432 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="36c34fba-251f-4a5d-8bfe-a4dcb9d4016d" containerName="manager" Apr 22 17:42:55.222527 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:55.222492 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="36c34fba-251f-4a5d-8bfe-a4dcb9d4016d" containerName="manager" Apr 22 17:42:55.226759 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:55.226738 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-xcwvx" Apr 22 17:42:55.229439 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:55.229416 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-j8w29\"" Apr 22 17:42:55.229542 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:55.229463 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 22 17:42:55.244010 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:55.243987 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-xcwvx"] Apr 22 17:42:55.273240 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:55.273217 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9826387-b818-4918-879a-509f8283c2b3-cert\") pod \"odh-model-controller-696fc77849-xcwvx\" (UID: \"b9826387-b818-4918-879a-509f8283c2b3\") " pod="kserve/odh-model-controller-696fc77849-xcwvx" Apr 22 17:42:55.273339 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:55.273254 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft6vh\" (UniqueName: \"kubernetes.io/projected/b9826387-b818-4918-879a-509f8283c2b3-kube-api-access-ft6vh\") pod \"odh-model-controller-696fc77849-xcwvx\" (UID: \"b9826387-b818-4918-879a-509f8283c2b3\") " pod="kserve/odh-model-controller-696fc77849-xcwvx" Apr 22 17:42:55.373878 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:55.373854 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9826387-b818-4918-879a-509f8283c2b3-cert\") pod \"odh-model-controller-696fc77849-xcwvx\" (UID: \"b9826387-b818-4918-879a-509f8283c2b3\") " pod="kserve/odh-model-controller-696fc77849-xcwvx" Apr 22 17:42:55.373993 
ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:55.373886 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ft6vh\" (UniqueName: \"kubernetes.io/projected/b9826387-b818-4918-879a-509f8283c2b3-kube-api-access-ft6vh\") pod \"odh-model-controller-696fc77849-xcwvx\" (UID: \"b9826387-b818-4918-879a-509f8283c2b3\") " pod="kserve/odh-model-controller-696fc77849-xcwvx" Apr 22 17:42:55.374048 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:42:55.373991 2539 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 22 17:42:55.374048 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:42:55.374045 2539 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9826387-b818-4918-879a-509f8283c2b3-cert podName:b9826387-b818-4918-879a-509f8283c2b3 nodeName:}" failed. No retries permitted until 2026-04-22 17:42:55.874029861 +0000 UTC m=+484.630114417 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b9826387-b818-4918-879a-509f8283c2b3-cert") pod "odh-model-controller-696fc77849-xcwvx" (UID: "b9826387-b818-4918-879a-509f8283c2b3") : secret "odh-model-controller-webhook-cert" not found Apr 22 17:42:55.382335 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:55.382315 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft6vh\" (UniqueName: \"kubernetes.io/projected/b9826387-b818-4918-879a-509f8283c2b3-kube-api-access-ft6vh\") pod \"odh-model-controller-696fc77849-xcwvx\" (UID: \"b9826387-b818-4918-879a-509f8283c2b3\") " pod="kserve/odh-model-controller-696fc77849-xcwvx" Apr 22 17:42:55.876777 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:55.876750 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9826387-b818-4918-879a-509f8283c2b3-cert\") pod \"odh-model-controller-696fc77849-xcwvx\" (UID: \"b9826387-b818-4918-879a-509f8283c2b3\") " pod="kserve/odh-model-controller-696fc77849-xcwvx" Apr 22 17:42:55.879208 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:55.879185 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9826387-b818-4918-879a-509f8283c2b3-cert\") pod \"odh-model-controller-696fc77849-xcwvx\" (UID: \"b9826387-b818-4918-879a-509f8283c2b3\") " pod="kserve/odh-model-controller-696fc77849-xcwvx" Apr 22 17:42:56.138203 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:56.138139 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-xcwvx" Apr 22 17:42:56.255160 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:56.255118 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-xcwvx"] Apr 22 17:42:56.257398 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:42:56.257373 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9826387_b818_4918_879a_509f8283c2b3.slice/crio-08a369ccb082cad89179063eb7825d0f2cd73b2d52ae1b864dcf23ef26035ec5 WatchSource:0}: Error finding container 08a369ccb082cad89179063eb7825d0f2cd73b2d52ae1b864dcf23ef26035ec5: Status 404 returned error can't find the container with id 08a369ccb082cad89179063eb7825d0f2cd73b2d52ae1b864dcf23ef26035ec5 Apr 22 17:42:56.486838 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:56.486805 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-xcwvx" event={"ID":"b9826387-b818-4918-879a-509f8283c2b3","Type":"ContainerStarted","Data":"08a369ccb082cad89179063eb7825d0f2cd73b2d52ae1b864dcf23ef26035ec5"} Apr 22 17:42:59.501160 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:59.501129 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-xcwvx" event={"ID":"b9826387-b818-4918-879a-509f8283c2b3","Type":"ContainerStarted","Data":"c0337f813fea15ed95a799b196fbc9041fafcc5749d1fec35f144399d589d411"} Apr 22 17:42:59.501518 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:42:59.501252 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-xcwvx" Apr 22 17:43:03.907644 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:03.907574 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-xcwvx" podStartSLOduration=6.526922066 podStartE2EDuration="8.907557622s" 
podCreationTimestamp="2026-04-22 17:42:55 +0000 UTC" firstStartedPulling="2026-04-22 17:42:56.259114589 +0000 UTC m=+485.015199139" lastFinishedPulling="2026-04-22 17:42:58.63975014 +0000 UTC m=+487.395834695" observedRunningTime="2026-04-22 17:42:59.523574613 +0000 UTC m=+488.279659322" watchObservedRunningTime="2026-04-22 17:43:03.907557622 +0000 UTC m=+492.663642250" Apr 22 17:43:03.908515 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:03.908498 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7f5fbf9d9-dbvfp"] Apr 22 17:43:03.911826 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:03.911806 2539 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7f5fbf9d9-dbvfp" Apr 22 17:43:03.922873 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:03.922847 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f5fbf9d9-dbvfp"] Apr 22 17:43:03.941025 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:03.940996 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7b23f8da-7138-4b3a-9239-4eb613442800-console-oauth-config\") pod \"console-7f5fbf9d9-dbvfp\" (UID: \"7b23f8da-7138-4b3a-9239-4eb613442800\") " pod="openshift-console/console-7f5fbf9d9-dbvfp" Apr 22 17:43:03.941149 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:03.941045 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7b23f8da-7138-4b3a-9239-4eb613442800-console-config\") pod \"console-7f5fbf9d9-dbvfp\" (UID: \"7b23f8da-7138-4b3a-9239-4eb613442800\") " pod="openshift-console/console-7f5fbf9d9-dbvfp" Apr 22 17:43:03.941149 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:03.941063 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b23f8da-7138-4b3a-9239-4eb613442800-trusted-ca-bundle\") pod \"console-7f5fbf9d9-dbvfp\" (UID: \"7b23f8da-7138-4b3a-9239-4eb613442800\") " pod="openshift-console/console-7f5fbf9d9-dbvfp" Apr 22 17:43:03.941149 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:03.941080 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fgth\" (UniqueName: \"kubernetes.io/projected/7b23f8da-7138-4b3a-9239-4eb613442800-kube-api-access-8fgth\") pod \"console-7f5fbf9d9-dbvfp\" (UID: \"7b23f8da-7138-4b3a-9239-4eb613442800\") " pod="openshift-console/console-7f5fbf9d9-dbvfp" Apr 22 17:43:03.941149 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:03.941106 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b23f8da-7138-4b3a-9239-4eb613442800-console-serving-cert\") pod \"console-7f5fbf9d9-dbvfp\" (UID: \"7b23f8da-7138-4b3a-9239-4eb613442800\") " pod="openshift-console/console-7f5fbf9d9-dbvfp" Apr 22 17:43:03.941149 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:03.941135 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7b23f8da-7138-4b3a-9239-4eb613442800-oauth-serving-cert\") pod \"console-7f5fbf9d9-dbvfp\" (UID: \"7b23f8da-7138-4b3a-9239-4eb613442800\") " pod="openshift-console/console-7f5fbf9d9-dbvfp" Apr 22 17:43:03.941333 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:03.941181 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7b23f8da-7138-4b3a-9239-4eb613442800-service-ca\") pod \"console-7f5fbf9d9-dbvfp\" (UID: \"7b23f8da-7138-4b3a-9239-4eb613442800\") " pod="openshift-console/console-7f5fbf9d9-dbvfp" Apr 22 17:43:04.041771 
ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:04.041737 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7b23f8da-7138-4b3a-9239-4eb613442800-console-config\") pod \"console-7f5fbf9d9-dbvfp\" (UID: \"7b23f8da-7138-4b3a-9239-4eb613442800\") " pod="openshift-console/console-7f5fbf9d9-dbvfp" Apr 22 17:43:04.041771 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:04.041772 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b23f8da-7138-4b3a-9239-4eb613442800-trusted-ca-bundle\") pod \"console-7f5fbf9d9-dbvfp\" (UID: \"7b23f8da-7138-4b3a-9239-4eb613442800\") " pod="openshift-console/console-7f5fbf9d9-dbvfp" Apr 22 17:43:04.042010 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:04.041788 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8fgth\" (UniqueName: \"kubernetes.io/projected/7b23f8da-7138-4b3a-9239-4eb613442800-kube-api-access-8fgth\") pod \"console-7f5fbf9d9-dbvfp\" (UID: \"7b23f8da-7138-4b3a-9239-4eb613442800\") " pod="openshift-console/console-7f5fbf9d9-dbvfp" Apr 22 17:43:04.042010 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:04.041807 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b23f8da-7138-4b3a-9239-4eb613442800-console-serving-cert\") pod \"console-7f5fbf9d9-dbvfp\" (UID: \"7b23f8da-7138-4b3a-9239-4eb613442800\") " pod="openshift-console/console-7f5fbf9d9-dbvfp" Apr 22 17:43:04.042010 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:04.041833 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7b23f8da-7138-4b3a-9239-4eb613442800-oauth-serving-cert\") pod \"console-7f5fbf9d9-dbvfp\" (UID: \"7b23f8da-7138-4b3a-9239-4eb613442800\") " 
pod="openshift-console/console-7f5fbf9d9-dbvfp" Apr 22 17:43:04.042010 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:04.041932 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7b23f8da-7138-4b3a-9239-4eb613442800-service-ca\") pod \"console-7f5fbf9d9-dbvfp\" (UID: \"7b23f8da-7138-4b3a-9239-4eb613442800\") " pod="openshift-console/console-7f5fbf9d9-dbvfp" Apr 22 17:43:04.042230 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:04.042024 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7b23f8da-7138-4b3a-9239-4eb613442800-console-oauth-config\") pod \"console-7f5fbf9d9-dbvfp\" (UID: \"7b23f8da-7138-4b3a-9239-4eb613442800\") " pod="openshift-console/console-7f5fbf9d9-dbvfp" Apr 22 17:43:04.042599 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:04.042569 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7b23f8da-7138-4b3a-9239-4eb613442800-console-config\") pod \"console-7f5fbf9d9-dbvfp\" (UID: \"7b23f8da-7138-4b3a-9239-4eb613442800\") " pod="openshift-console/console-7f5fbf9d9-dbvfp" Apr 22 17:43:04.042729 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:04.042655 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7b23f8da-7138-4b3a-9239-4eb613442800-oauth-serving-cert\") pod \"console-7f5fbf9d9-dbvfp\" (UID: \"7b23f8da-7138-4b3a-9239-4eb613442800\") " pod="openshift-console/console-7f5fbf9d9-dbvfp" Apr 22 17:43:04.042729 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:04.042709 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7b23f8da-7138-4b3a-9239-4eb613442800-service-ca\") pod \"console-7f5fbf9d9-dbvfp\" (UID: 
\"7b23f8da-7138-4b3a-9239-4eb613442800\") " pod="openshift-console/console-7f5fbf9d9-dbvfp" Apr 22 17:43:04.042852 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:04.042765 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b23f8da-7138-4b3a-9239-4eb613442800-trusted-ca-bundle\") pod \"console-7f5fbf9d9-dbvfp\" (UID: \"7b23f8da-7138-4b3a-9239-4eb613442800\") " pod="openshift-console/console-7f5fbf9d9-dbvfp" Apr 22 17:43:04.044410 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:04.044388 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7b23f8da-7138-4b3a-9239-4eb613442800-console-oauth-config\") pod \"console-7f5fbf9d9-dbvfp\" (UID: \"7b23f8da-7138-4b3a-9239-4eb613442800\") " pod="openshift-console/console-7f5fbf9d9-dbvfp" Apr 22 17:43:04.044501 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:04.044400 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b23f8da-7138-4b3a-9239-4eb613442800-console-serving-cert\") pod \"console-7f5fbf9d9-dbvfp\" (UID: \"7b23f8da-7138-4b3a-9239-4eb613442800\") " pod="openshift-console/console-7f5fbf9d9-dbvfp" Apr 22 17:43:04.050475 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:04.050452 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fgth\" (UniqueName: \"kubernetes.io/projected/7b23f8da-7138-4b3a-9239-4eb613442800-kube-api-access-8fgth\") pod \"console-7f5fbf9d9-dbvfp\" (UID: \"7b23f8da-7138-4b3a-9239-4eb613442800\") " pod="openshift-console/console-7f5fbf9d9-dbvfp" Apr 22 17:43:04.224618 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:04.224584 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7f5fbf9d9-dbvfp" Apr 22 17:43:04.350290 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:04.350266 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f5fbf9d9-dbvfp"] Apr 22 17:43:04.352318 ip-10-0-132-165 kubenswrapper[2539]: W0422 17:43:04.352289 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b23f8da_7138_4b3a_9239_4eb613442800.slice/crio-8e7ab40c0d50cfbcd468a25ff4869b495a50183463928729fdb43ba1870550cb WatchSource:0}: Error finding container 8e7ab40c0d50cfbcd468a25ff4869b495a50183463928729fdb43ba1870550cb: Status 404 returned error can't find the container with id 8e7ab40c0d50cfbcd468a25ff4869b495a50183463928729fdb43ba1870550cb Apr 22 17:43:04.518801 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:04.518710 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f5fbf9d9-dbvfp" event={"ID":"7b23f8da-7138-4b3a-9239-4eb613442800","Type":"ContainerStarted","Data":"fc381a34001cdb5c429daaa78432dc53a3c5a579315ce900e564ead53a813919"} Apr 22 17:43:04.518801 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:04.518747 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f5fbf9d9-dbvfp" event={"ID":"7b23f8da-7138-4b3a-9239-4eb613442800","Type":"ContainerStarted","Data":"8e7ab40c0d50cfbcd468a25ff4869b495a50183463928729fdb43ba1870550cb"} Apr 22 17:43:04.537820 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:04.537767 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7f5fbf9d9-dbvfp" podStartSLOduration=1.537753061 podStartE2EDuration="1.537753061s" podCreationTimestamp="2026-04-22 17:43:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:43:04.536085524 +0000 UTC m=+493.292170096" 
watchObservedRunningTime="2026-04-22 17:43:04.537753061 +0000 UTC m=+493.293837634" Apr 22 17:43:10.507723 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:10.507688 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-xcwvx" Apr 22 17:43:14.225406 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:14.225356 2539 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7f5fbf9d9-dbvfp" Apr 22 17:43:14.225406 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:14.225403 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7f5fbf9d9-dbvfp" Apr 22 17:43:14.230284 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:14.230260 2539 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7f5fbf9d9-dbvfp" Apr 22 17:43:14.557122 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:14.557013 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7f5fbf9d9-dbvfp" Apr 22 17:43:14.601977 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:14.601938 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-75668dc9d-clpwt"] Apr 22 17:43:39.625047 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:39.624990 2539 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-75668dc9d-clpwt" podUID="beb72c82-6907-4043-bea9-b3a01212c1e8" containerName="console" containerID="cri-o://89f324c41130b9ae71ef5cdb1cf3ffa70ea768fcc332ef4eb20f48c55aac252f" gracePeriod=15 Apr 22 17:43:39.870173 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:39.870151 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-75668dc9d-clpwt_beb72c82-6907-4043-bea9-b3a01212c1e8/console/0.log" Apr 22 17:43:39.870285 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:39.870211 
2539 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-75668dc9d-clpwt" Apr 22 17:43:40.041149 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:40.041109 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5mzc\" (UniqueName: \"kubernetes.io/projected/beb72c82-6907-4043-bea9-b3a01212c1e8-kube-api-access-k5mzc\") pod \"beb72c82-6907-4043-bea9-b3a01212c1e8\" (UID: \"beb72c82-6907-4043-bea9-b3a01212c1e8\") " Apr 22 17:43:40.041313 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:40.041179 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/beb72c82-6907-4043-bea9-b3a01212c1e8-service-ca\") pod \"beb72c82-6907-4043-bea9-b3a01212c1e8\" (UID: \"beb72c82-6907-4043-bea9-b3a01212c1e8\") " Apr 22 17:43:40.041313 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:40.041267 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/beb72c82-6907-4043-bea9-b3a01212c1e8-console-oauth-config\") pod \"beb72c82-6907-4043-bea9-b3a01212c1e8\" (UID: \"beb72c82-6907-4043-bea9-b3a01212c1e8\") " Apr 22 17:43:40.041313 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:40.041299 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/beb72c82-6907-4043-bea9-b3a01212c1e8-oauth-serving-cert\") pod \"beb72c82-6907-4043-bea9-b3a01212c1e8\" (UID: \"beb72c82-6907-4043-bea9-b3a01212c1e8\") " Apr 22 17:43:40.041480 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:40.041331 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/beb72c82-6907-4043-bea9-b3a01212c1e8-console-serving-cert\") pod \"beb72c82-6907-4043-bea9-b3a01212c1e8\" (UID: 
\"beb72c82-6907-4043-bea9-b3a01212c1e8\") " Apr 22 17:43:40.041480 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:40.041365 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/beb72c82-6907-4043-bea9-b3a01212c1e8-console-config\") pod \"beb72c82-6907-4043-bea9-b3a01212c1e8\" (UID: \"beb72c82-6907-4043-bea9-b3a01212c1e8\") " Apr 22 17:43:40.041480 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:40.041421 2539 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/beb72c82-6907-4043-bea9-b3a01212c1e8-trusted-ca-bundle\") pod \"beb72c82-6907-4043-bea9-b3a01212c1e8\" (UID: \"beb72c82-6907-4043-bea9-b3a01212c1e8\") " Apr 22 17:43:40.041630 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:40.041550 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/beb72c82-6907-4043-bea9-b3a01212c1e8-service-ca" (OuterVolumeSpecName: "service-ca") pod "beb72c82-6907-4043-bea9-b3a01212c1e8" (UID: "beb72c82-6907-4043-bea9-b3a01212c1e8"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:43:40.041630 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:40.041618 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/beb72c82-6907-4043-bea9-b3a01212c1e8-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "beb72c82-6907-4043-bea9-b3a01212c1e8" (UID: "beb72c82-6907-4043-bea9-b3a01212c1e8"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:43:40.041827 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:40.041803 2539 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/beb72c82-6907-4043-bea9-b3a01212c1e8-service-ca\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\"" Apr 22 17:43:40.041827 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:40.041816 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/beb72c82-6907-4043-bea9-b3a01212c1e8-console-config" (OuterVolumeSpecName: "console-config") pod "beb72c82-6907-4043-bea9-b3a01212c1e8" (UID: "beb72c82-6907-4043-bea9-b3a01212c1e8"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:43:40.042021 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:40.041826 2539 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/beb72c82-6907-4043-bea9-b3a01212c1e8-oauth-serving-cert\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\"" Apr 22 17:43:40.042021 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:40.041865 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/beb72c82-6907-4043-bea9-b3a01212c1e8-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "beb72c82-6907-4043-bea9-b3a01212c1e8" (UID: "beb72c82-6907-4043-bea9-b3a01212c1e8"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:43:40.043605 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:40.043584 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beb72c82-6907-4043-bea9-b3a01212c1e8-kube-api-access-k5mzc" (OuterVolumeSpecName: "kube-api-access-k5mzc") pod "beb72c82-6907-4043-bea9-b3a01212c1e8" (UID: "beb72c82-6907-4043-bea9-b3a01212c1e8"). 
InnerVolumeSpecName "kube-api-access-k5mzc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:43:40.043800 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:40.043778 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beb72c82-6907-4043-bea9-b3a01212c1e8-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "beb72c82-6907-4043-bea9-b3a01212c1e8" (UID: "beb72c82-6907-4043-bea9-b3a01212c1e8"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:43:40.043848 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:40.043805 2539 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beb72c82-6907-4043-bea9-b3a01212c1e8-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "beb72c82-6907-4043-bea9-b3a01212c1e8" (UID: "beb72c82-6907-4043-bea9-b3a01212c1e8"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:43:40.142829 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:40.142807 2539 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/beb72c82-6907-4043-bea9-b3a01212c1e8-console-config\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\"" Apr 22 17:43:40.142829 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:40.142828 2539 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/beb72c82-6907-4043-bea9-b3a01212c1e8-trusted-ca-bundle\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\"" Apr 22 17:43:40.142973 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:40.142838 2539 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k5mzc\" (UniqueName: \"kubernetes.io/projected/beb72c82-6907-4043-bea9-b3a01212c1e8-kube-api-access-k5mzc\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath 
\"\"" Apr 22 17:43:40.142973 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:40.142847 2539 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/beb72c82-6907-4043-bea9-b3a01212c1e8-console-oauth-config\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\"" Apr 22 17:43:40.142973 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:40.142855 2539 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/beb72c82-6907-4043-bea9-b3a01212c1e8-console-serving-cert\") on node \"ip-10-0-132-165.ec2.internal\" DevicePath \"\"" Apr 22 17:43:40.644407 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:40.644381 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-75668dc9d-clpwt_beb72c82-6907-4043-bea9-b3a01212c1e8/console/0.log" Apr 22 17:43:40.644801 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:40.644420 2539 generic.go:358] "Generic (PLEG): container finished" podID="beb72c82-6907-4043-bea9-b3a01212c1e8" containerID="89f324c41130b9ae71ef5cdb1cf3ffa70ea768fcc332ef4eb20f48c55aac252f" exitCode=2 Apr 22 17:43:40.644801 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:40.644498 2539 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-75668dc9d-clpwt" Apr 22 17:43:40.644801 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:40.644505 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75668dc9d-clpwt" event={"ID":"beb72c82-6907-4043-bea9-b3a01212c1e8","Type":"ContainerDied","Data":"89f324c41130b9ae71ef5cdb1cf3ffa70ea768fcc332ef4eb20f48c55aac252f"} Apr 22 17:43:40.644801 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:40.644539 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75668dc9d-clpwt" event={"ID":"beb72c82-6907-4043-bea9-b3a01212c1e8","Type":"ContainerDied","Data":"1026883d6340f5317105da694c816f6ed520a3e5d31a86f57d3d2f11d0fe3223"} Apr 22 17:43:40.644801 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:40.644554 2539 scope.go:117] "RemoveContainer" containerID="89f324c41130b9ae71ef5cdb1cf3ffa70ea768fcc332ef4eb20f48c55aac252f" Apr 22 17:43:40.653064 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:40.653043 2539 scope.go:117] "RemoveContainer" containerID="89f324c41130b9ae71ef5cdb1cf3ffa70ea768fcc332ef4eb20f48c55aac252f" Apr 22 17:43:40.653432 ip-10-0-132-165 kubenswrapper[2539]: E0422 17:43:40.653397 2539 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89f324c41130b9ae71ef5cdb1cf3ffa70ea768fcc332ef4eb20f48c55aac252f\": container with ID starting with 89f324c41130b9ae71ef5cdb1cf3ffa70ea768fcc332ef4eb20f48c55aac252f not found: ID does not exist" containerID="89f324c41130b9ae71ef5cdb1cf3ffa70ea768fcc332ef4eb20f48c55aac252f" Apr 22 17:43:40.653529 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:40.653432 2539 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89f324c41130b9ae71ef5cdb1cf3ffa70ea768fcc332ef4eb20f48c55aac252f"} err="failed to get container status \"89f324c41130b9ae71ef5cdb1cf3ffa70ea768fcc332ef4eb20f48c55aac252f\": rpc error: code = 
NotFound desc = could not find container \"89f324c41130b9ae71ef5cdb1cf3ffa70ea768fcc332ef4eb20f48c55aac252f\": container with ID starting with 89f324c41130b9ae71ef5cdb1cf3ffa70ea768fcc332ef4eb20f48c55aac252f not found: ID does not exist" Apr 22 17:43:40.666940 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:40.666916 2539 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-75668dc9d-clpwt"] Apr 22 17:43:40.668982 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:40.668962 2539 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-75668dc9d-clpwt"] Apr 22 17:43:41.846747 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:43:41.846713 2539 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="beb72c82-6907-4043-bea9-b3a01212c1e8" path="/var/lib/kubelet/pods/beb72c82-6907-4043-bea9-b3a01212c1e8/volumes" Apr 22 17:44:51.754091 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:44:51.754056 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lmv9z_c64def61-a93f-45f9-a89f-c469f37561b6/console-operator/1.log" Apr 22 17:44:51.754675 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:44:51.754660 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lmv9z_c64def61-a93f-45f9-a89f-c469f37561b6/console-operator/1.log" Apr 22 17:49:51.775932 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:49:51.775827 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lmv9z_c64def61-a93f-45f9-a89f-c469f37561b6/console-operator/1.log" Apr 22 17:49:51.777590 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:49:51.777566 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lmv9z_c64def61-a93f-45f9-a89f-c469f37561b6/console-operator/1.log" Apr 22 17:54:51.811219 ip-10-0-132-165 
kubenswrapper[2539]: I0422 17:54:51.811191 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lmv9z_c64def61-a93f-45f9-a89f-c469f37561b6/console-operator/1.log" Apr 22 17:54:51.812354 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:54:51.812330 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lmv9z_c64def61-a93f-45f9-a89f-c469f37561b6/console-operator/1.log" Apr 22 17:59:51.834986 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:59:51.834949 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lmv9z_c64def61-a93f-45f9-a89f-c469f37561b6/console-operator/1.log" Apr 22 17:59:51.835850 ip-10-0-132-165 kubenswrapper[2539]: I0422 17:59:51.835832 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lmv9z_c64def61-a93f-45f9-a89f-c469f37561b6/console-operator/1.log" Apr 22 18:04:51.858325 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:04:51.858296 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lmv9z_c64def61-a93f-45f9-a89f-c469f37561b6/console-operator/1.log" Apr 22 18:04:51.864922 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:04:51.864873 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lmv9z_c64def61-a93f-45f9-a89f-c469f37561b6/console-operator/1.log" Apr 22 18:09:51.883708 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:09:51.883676 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lmv9z_c64def61-a93f-45f9-a89f-c469f37561b6/console-operator/1.log" Apr 22 18:09:51.888031 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:09:51.888008 2539 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lmv9z_c64def61-a93f-45f9-a89f-c469f37561b6/console-operator/1.log" Apr 22 18:14:51.905218 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:14:51.905092 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lmv9z_c64def61-a93f-45f9-a89f-c469f37561b6/console-operator/1.log" Apr 22 18:14:51.909896 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:14:51.909875 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lmv9z_c64def61-a93f-45f9-a89f-c469f37561b6/console-operator/1.log" Apr 22 18:19:51.929813 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:19:51.929694 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lmv9z_c64def61-a93f-45f9-a89f-c469f37561b6/console-operator/1.log" Apr 22 18:19:51.935398 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:19:51.935378 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lmv9z_c64def61-a93f-45f9-a89f-c469f37561b6/console-operator/1.log" Apr 22 18:24:51.951281 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:24:51.951257 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lmv9z_c64def61-a93f-45f9-a89f-c469f37561b6/console-operator/1.log" Apr 22 18:24:51.957281 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:24:51.957248 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lmv9z_c64def61-a93f-45f9-a89f-c469f37561b6/console-operator/1.log" Apr 22 18:29:51.973539 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:29:51.973410 2539 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lmv9z_c64def61-a93f-45f9-a89f-c469f37561b6/console-operator/1.log" Apr 22 18:29:51.981924 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:29:51.981882 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lmv9z_c64def61-a93f-45f9-a89f-c469f37561b6/console-operator/1.log" Apr 22 18:34:51.995029 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:34:51.994922 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lmv9z_c64def61-a93f-45f9-a89f-c469f37561b6/console-operator/1.log" Apr 22 18:34:52.007448 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:34:52.007426 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lmv9z_c64def61-a93f-45f9-a89f-c469f37561b6/console-operator/1.log" Apr 22 18:39:52.019745 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:39:52.019625 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lmv9z_c64def61-a93f-45f9-a89f-c469f37561b6/console-operator/1.log" Apr 22 18:39:52.030075 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:39:52.030057 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lmv9z_c64def61-a93f-45f9-a89f-c469f37561b6/console-operator/1.log" Apr 22 18:44:09.118282 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:09.118245 2539 ???:1] "http: TLS handshake error from 10.0.132.165:35784: EOF" Apr 22 18:44:09.129813 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:09.129770 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-mnmvf_e5741809-3555-46d2-99ca-58fd55e0acdc/global-pull-secret-syncer/0.log" Apr 22 18:44:09.276857 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:09.276827 2539 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-h2ghv_aa88e93d-3982-4f87-8dc7-cef2900ab3a3/konnectivity-agent/0.log" Apr 22 18:44:09.407435 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:09.407356 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-165.ec2.internal_136192d3e6d2f42c0abb843d3e675799/haproxy/0.log" Apr 22 18:44:13.038824 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:13.038791 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_46c18525-4d26-4b0e-ba66-434b189645d1/alertmanager/0.log" Apr 22 18:44:13.074799 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:13.074774 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_46c18525-4d26-4b0e-ba66-434b189645d1/config-reloader/0.log" Apr 22 18:44:13.105369 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:13.105341 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_46c18525-4d26-4b0e-ba66-434b189645d1/kube-rbac-proxy-web/0.log" Apr 22 18:44:13.127714 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:13.127690 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_46c18525-4d26-4b0e-ba66-434b189645d1/kube-rbac-proxy/0.log" Apr 22 18:44:13.152069 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:13.152038 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_46c18525-4d26-4b0e-ba66-434b189645d1/kube-rbac-proxy-metric/0.log" Apr 22 18:44:13.178682 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:13.178659 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_46c18525-4d26-4b0e-ba66-434b189645d1/prom-label-proxy/0.log" Apr 22 18:44:13.207096 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:13.207068 2539 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_46c18525-4d26-4b0e-ba66-434b189645d1/init-config-reloader/0.log" Apr 22 18:44:13.298556 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:13.298477 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-6mhcz_95eeecd5-1c23-4cde-b52e-0531263ac4bd/kube-state-metrics/0.log" Apr 22 18:44:13.317124 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:13.317099 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-6mhcz_95eeecd5-1c23-4cde-b52e-0531263ac4bd/kube-rbac-proxy-main/0.log" Apr 22 18:44:13.338005 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:13.337975 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-6mhcz_95eeecd5-1c23-4cde-b52e-0531263ac4bd/kube-rbac-proxy-self/0.log" Apr 22 18:44:13.395415 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:13.395384 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-68nx5_0091f08a-b4fc-4921-b944-8ec2bf7919f7/monitoring-plugin/0.log" Apr 22 18:44:13.425283 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:13.425251 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7xrr6_715f3b4b-86d6-4cef-adfe-80103d666f2e/node-exporter/0.log" Apr 22 18:44:13.451925 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:13.451886 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7xrr6_715f3b4b-86d6-4cef-adfe-80103d666f2e/kube-rbac-proxy/0.log" Apr 22 18:44:13.473640 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:13.473615 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7xrr6_715f3b4b-86d6-4cef-adfe-80103d666f2e/init-textfile/0.log" Apr 22 18:44:15.783683 ip-10-0-132-165 
kubenswrapper[2539]: I0422 18:44:15.783654 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lmv9z_c64def61-a93f-45f9-a89f-c469f37561b6/console-operator/1.log" Apr 22 18:44:15.793963 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:15.793931 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lmv9z_c64def61-a93f-45f9-a89f-c469f37561b6/console-operator/2.log" Apr 22 18:44:16.154317 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:16.154238 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7f5fbf9d9-dbvfp_7b23f8da-7138-4b3a-9239-4eb613442800/console/0.log" Apr 22 18:44:16.160718 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:16.160692 2539 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-cfxdp/perf-node-gather-daemonset-m5kks"] Apr 22 18:44:16.161070 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:16.161052 2539 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="beb72c82-6907-4043-bea9-b3a01212c1e8" containerName="console" Apr 22 18:44:16.161070 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:16.161067 2539 state_mem.go:107] "Deleted CPUSet assignment" podUID="beb72c82-6907-4043-bea9-b3a01212c1e8" containerName="console" Apr 22 18:44:16.161150 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:16.161132 2539 memory_manager.go:356] "RemoveStaleState removing state" podUID="beb72c82-6907-4043-bea9-b3a01212c1e8" containerName="console" Apr 22 18:44:16.164050 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:16.164029 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-m5kks" Apr 22 18:44:16.167439 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:16.167420 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-cfxdp\"/\"openshift-service-ca.crt\"" Apr 22 18:44:16.168294 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:16.168274 2539 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-cfxdp\"/\"kube-root-ca.crt\"" Apr 22 18:44:16.168399 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:16.168279 2539 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-cfxdp\"/\"default-dockercfg-m8sx6\"" Apr 22 18:44:16.173775 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:16.173752 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-cfxdp/perf-node-gather-daemonset-m5kks"] Apr 22 18:44:16.211499 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:16.211466 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-whsmr_2e5aae8b-ae1d-46c1-a0ae-77cb6bbe014f/download-server/0.log" Apr 22 18:44:16.242849 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:16.242797 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/585d3c01-2067-41c2-add6-67f7048ebbfc-sys\") pod \"perf-node-gather-daemonset-m5kks\" (UID: \"585d3c01-2067-41c2-add6-67f7048ebbfc\") " pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-m5kks" Apr 22 18:44:16.242849 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:16.242851 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/585d3c01-2067-41c2-add6-67f7048ebbfc-podres\") pod \"perf-node-gather-daemonset-m5kks\" (UID: 
\"585d3c01-2067-41c2-add6-67f7048ebbfc\") " pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-m5kks" Apr 22 18:44:16.243103 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:16.242992 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/585d3c01-2067-41c2-add6-67f7048ebbfc-proc\") pod \"perf-node-gather-daemonset-m5kks\" (UID: \"585d3c01-2067-41c2-add6-67f7048ebbfc\") " pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-m5kks" Apr 22 18:44:16.243103 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:16.243051 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs466\" (UniqueName: \"kubernetes.io/projected/585d3c01-2067-41c2-add6-67f7048ebbfc-kube-api-access-vs466\") pod \"perf-node-gather-daemonset-m5kks\" (UID: \"585d3c01-2067-41c2-add6-67f7048ebbfc\") " pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-m5kks" Apr 22 18:44:16.243185 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:16.243112 2539 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/585d3c01-2067-41c2-add6-67f7048ebbfc-lib-modules\") pod \"perf-node-gather-daemonset-m5kks\" (UID: \"585d3c01-2067-41c2-add6-67f7048ebbfc\") " pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-m5kks" Apr 22 18:44:16.344001 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:16.343956 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/585d3c01-2067-41c2-add6-67f7048ebbfc-sys\") pod \"perf-node-gather-daemonset-m5kks\" (UID: \"585d3c01-2067-41c2-add6-67f7048ebbfc\") " pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-m5kks" Apr 22 18:44:16.344001 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:16.343998 2539 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/585d3c01-2067-41c2-add6-67f7048ebbfc-podres\") pod \"perf-node-gather-daemonset-m5kks\" (UID: \"585d3c01-2067-41c2-add6-67f7048ebbfc\") " pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-m5kks" Apr 22 18:44:16.344260 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:16.344058 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/585d3c01-2067-41c2-add6-67f7048ebbfc-proc\") pod \"perf-node-gather-daemonset-m5kks\" (UID: \"585d3c01-2067-41c2-add6-67f7048ebbfc\") " pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-m5kks" Apr 22 18:44:16.344260 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:16.344083 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/585d3c01-2067-41c2-add6-67f7048ebbfc-sys\") pod \"perf-node-gather-daemonset-m5kks\" (UID: \"585d3c01-2067-41c2-add6-67f7048ebbfc\") " pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-m5kks" Apr 22 18:44:16.344260 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:16.344113 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vs466\" (UniqueName: \"kubernetes.io/projected/585d3c01-2067-41c2-add6-67f7048ebbfc-kube-api-access-vs466\") pod \"perf-node-gather-daemonset-m5kks\" (UID: \"585d3c01-2067-41c2-add6-67f7048ebbfc\") " pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-m5kks" Apr 22 18:44:16.344260 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:16.344142 2539 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/585d3c01-2067-41c2-add6-67f7048ebbfc-lib-modules\") pod \"perf-node-gather-daemonset-m5kks\" (UID: \"585d3c01-2067-41c2-add6-67f7048ebbfc\") " pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-m5kks" 
Apr 22 18:44:16.344260 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:16.344177 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/585d3c01-2067-41c2-add6-67f7048ebbfc-proc\") pod \"perf-node-gather-daemonset-m5kks\" (UID: \"585d3c01-2067-41c2-add6-67f7048ebbfc\") " pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-m5kks" Apr 22 18:44:16.344260 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:16.344229 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/585d3c01-2067-41c2-add6-67f7048ebbfc-podres\") pod \"perf-node-gather-daemonset-m5kks\" (UID: \"585d3c01-2067-41c2-add6-67f7048ebbfc\") " pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-m5kks" Apr 22 18:44:16.344475 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:16.344293 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/585d3c01-2067-41c2-add6-67f7048ebbfc-lib-modules\") pod \"perf-node-gather-daemonset-m5kks\" (UID: \"585d3c01-2067-41c2-add6-67f7048ebbfc\") " pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-m5kks" Apr 22 18:44:16.353197 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:16.353169 2539 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs466\" (UniqueName: \"kubernetes.io/projected/585d3c01-2067-41c2-add6-67f7048ebbfc-kube-api-access-vs466\") pod \"perf-node-gather-daemonset-m5kks\" (UID: \"585d3c01-2067-41c2-add6-67f7048ebbfc\") " pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-m5kks" Apr 22 18:44:16.475913 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:16.475872 2539 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-m5kks" Apr 22 18:44:16.596932 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:16.596874 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-bk65x_26cdbfea-d567-4835-9ba0-b68b5e295c7a/volume-data-source-validator/0.log" Apr 22 18:44:16.597567 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:16.597546 2539 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-cfxdp/perf-node-gather-daemonset-m5kks"] Apr 22 18:44:16.600148 ip-10-0-132-165 kubenswrapper[2539]: W0422 18:44:16.600117 2539 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod585d3c01_2067_41c2_add6_67f7048ebbfc.slice/crio-b66efabcaa2aa462b3a79dfd505c19302f69d8bc116776e755a2aa5c78e34b7e WatchSource:0}: Error finding container b66efabcaa2aa462b3a79dfd505c19302f69d8bc116776e755a2aa5c78e34b7e: Status 404 returned error can't find the container with id b66efabcaa2aa462b3a79dfd505c19302f69d8bc116776e755a2aa5c78e34b7e Apr 22 18:44:16.601618 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:16.601600 2539 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:44:17.271269 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:17.271240 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9z52f_34451fd1-aa5f-4cd7-b1e2-0ef6344977dc/dns/0.log" Apr 22 18:44:17.306638 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:17.306608 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9z52f_34451fd1-aa5f-4cd7-b1e2-0ef6344977dc/kube-rbac-proxy/0.log" Apr 22 18:44:17.499882 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:17.499843 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-m5kks" 
event={"ID":"585d3c01-2067-41c2-add6-67f7048ebbfc","Type":"ContainerStarted","Data":"2e46c1ab5bf07513db80a570621f793635e0d5d54cd40b5d164da3666040ad11"} Apr 22 18:44:17.499882 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:17.499883 2539 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-m5kks" event={"ID":"585d3c01-2067-41c2-add6-67f7048ebbfc","Type":"ContainerStarted","Data":"b66efabcaa2aa462b3a79dfd505c19302f69d8bc116776e755a2aa5c78e34b7e"} Apr 22 18:44:17.500114 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:17.499978 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-m5kks" Apr 22 18:44:17.527041 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:17.526927 2539 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-m5kks" podStartSLOduration=1.5268853789999999 podStartE2EDuration="1.526885379s" podCreationTimestamp="2026-04-22 18:44:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:44:17.526141046 +0000 UTC m=+4166.282225621" watchObservedRunningTime="2026-04-22 18:44:17.526885379 +0000 UTC m=+4166.282969952" Apr 22 18:44:17.669162 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:17.669129 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-dm5wr_011dd50c-8a4c-425d-bb7b-86836dfc52f7/dns-node-resolver/0.log" Apr 22 18:44:18.337203 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:18.337175 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-gxs6f_ba45db53-69e8-40c3-8090-98739842b87e/node-ca/0.log" Apr 22 18:44:19.235802 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:19.235771 2539 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_router-default-7b7cbdcdbb-cllk8_961e6502-070a-4f99-9ed6-a2a445aefbb3/router/0.log" Apr 22 18:44:19.678870 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:19.678789 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-s6r9h_04320b3d-bb85-4a09-ab7d-5fdc01962b73/serve-healthcheck-canary/0.log" Apr 22 18:44:20.061736 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:20.061704 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-t45p4_65f4e6af-46be-4594-b3fb-54dd4fac761b/insights-operator/0.log" Apr 22 18:44:20.063222 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:20.063198 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-t45p4_65f4e6af-46be-4594-b3fb-54dd4fac761b/insights-operator/1.log" Apr 22 18:44:20.246632 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:20.246599 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xqflj_ca32af5f-5277-45b3-86a5-1640613de6a4/kube-rbac-proxy/0.log" Apr 22 18:44:20.300314 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:20.300276 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xqflj_ca32af5f-5277-45b3-86a5-1640613de6a4/exporter/0.log" Apr 22 18:44:20.349385 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:20.349302 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xqflj_ca32af5f-5277-45b3-86a5-1640613de6a4/extractor/0.log" Apr 22 18:44:22.376510 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:22.376477 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-84ffddfb66-mxsvq_6cf17ecc-ffab-4ad3-9b83-1f427318bee0/manager/0.log" Apr 22 18:44:22.398179 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:22.398145 
2539 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-cjzn9_ef8f9195-84bc-40db-8f31-73200f673df3/manager/0.log" Apr 22 18:44:22.900016 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:22.899979 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-xcwvx_b9826387-b818-4918-879a-509f8283c2b3/manager/0.log" Apr 22 18:44:23.513325 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:23.513296 2539 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-cfxdp/perf-node-gather-daemonset-m5kks" Apr 22 18:44:27.462721 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:27.462681 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-gcd2q_2f845b61-dcdc-4d78-b24e-77b96b4792a1/kube-storage-version-migrator-operator/1.log" Apr 22 18:44:27.464243 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:27.464210 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-gcd2q_2f845b61-dcdc-4d78-b24e-77b96b4792a1/kube-storage-version-migrator-operator/0.log" Apr 22 18:44:28.809577 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:28.809546 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6gk5m_24bafee9-4454-456f-b2aa-04131f945624/kube-multus-additional-cni-plugins/0.log" Apr 22 18:44:28.848318 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:28.848290 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6gk5m_24bafee9-4454-456f-b2aa-04131f945624/egress-router-binary-copy/0.log" Apr 22 18:44:28.884770 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:28.884738 2539 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6gk5m_24bafee9-4454-456f-b2aa-04131f945624/cni-plugins/0.log" Apr 22 18:44:28.931177 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:28.931151 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6gk5m_24bafee9-4454-456f-b2aa-04131f945624/bond-cni-plugin/0.log" Apr 22 18:44:28.972497 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:28.972466 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6gk5m_24bafee9-4454-456f-b2aa-04131f945624/routeoverride-cni/0.log" Apr 22 18:44:29.027875 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:29.027852 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6gk5m_24bafee9-4454-456f-b2aa-04131f945624/whereabouts-cni-bincopy/0.log" Apr 22 18:44:29.063632 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:29.063548 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6gk5m_24bafee9-4454-456f-b2aa-04131f945624/whereabouts-cni/0.log" Apr 22 18:44:29.390449 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:29.390364 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d4jd9_f0f702a5-e6ca-4251-9925-a6fc437042f8/kube-multus/0.log" Apr 22 18:44:29.488205 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:29.488177 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-8bxsz_a15342ff-f78f-4d33-aed1-0e9c86dbdb15/network-metrics-daemon/0.log" Apr 22 18:44:29.533044 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:29.533007 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-8bxsz_a15342ff-f78f-4d33-aed1-0e9c86dbdb15/kube-rbac-proxy/0.log" Apr 22 18:44:30.559646 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:30.559610 
2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dtf75_a69dead2-c622-4a13-a5b4-5367b68c10a8/ovn-controller/0.log" Apr 22 18:44:30.623348 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:30.623317 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dtf75_a69dead2-c622-4a13-a5b4-5367b68c10a8/ovn-acl-logging/0.log" Apr 22 18:44:30.667960 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:30.667926 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dtf75_a69dead2-c622-4a13-a5b4-5367b68c10a8/kube-rbac-proxy-node/0.log" Apr 22 18:44:30.712957 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:30.712925 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dtf75_a69dead2-c622-4a13-a5b4-5367b68c10a8/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 18:44:30.758461 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:30.758429 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dtf75_a69dead2-c622-4a13-a5b4-5367b68c10a8/northd/0.log" Apr 22 18:44:30.804749 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:30.804725 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dtf75_a69dead2-c622-4a13-a5b4-5367b68c10a8/nbdb/0.log" Apr 22 18:44:30.844520 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:30.844442 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dtf75_a69dead2-c622-4a13-a5b4-5367b68c10a8/sbdb/0.log" Apr 22 18:44:31.003941 ip-10-0-132-165 kubenswrapper[2539]: I0422 18:44:31.003893 2539 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-dtf75_a69dead2-c622-4a13-a5b4-5367b68c10a8/ovnkube-controller/0.log"