Apr 22 15:08:46.022754 ip-10-0-141-167 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 15:08:46.471704 ip-10-0-141-167 kubenswrapper[2559]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 15:08:46.471704 ip-10-0-141-167 kubenswrapper[2559]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 15:08:46.471704 ip-10-0-141-167 kubenswrapper[2559]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 15:08:46.471704 ip-10-0-141-167 kubenswrapper[2559]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 15:08:46.471704 ip-10-0-141-167 kubenswrapper[2559]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 15:08:46.472545 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.472448 2559 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 15:08:46.474741 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474726 2559 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 15:08:46.474776 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474742 2559 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
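The deprecation notices above all point at the config file named by the kubelet's --config flag (here /etc/kubernetes/kubelet.conf). A minimal sketch of the equivalent KubeletConfiguration stanzas; the field names are from the kubelet.config.k8s.io/v1beta1 API, but the values shown (apart from the CRI-O socket logged below) are illustrative, not this node's actual configuration:

```yaml
# Sketch of config-file equivalents for the deprecated flags above.
# Assumed values are marked; this is not the node's real kubelet.conf.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock   # replaces --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec   # replaces --volume-plugin-dir (path assumed)
systemReserved:            # replaces --system-reserved (values assumed)
  cpu: 500m
  memory: 1Gi
evictionHard:              # replaces --minimum-container-ttl-duration (threshold assumed)
  memory.available: 100Mi
```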
Apr 22 15:08:46.474776 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474748 2559 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 15:08:46.474776 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474752 2559 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 15:08:46.474776 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474755 2559 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 15:08:46.474776 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474758 2559 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 15:08:46.474776 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474761 2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 15:08:46.474776 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474763 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 15:08:46.474776 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474766 2559 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 15:08:46.474776 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474769 2559 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 15:08:46.474776 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474771 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 15:08:46.474776 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474774 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 15:08:46.474776 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474777 2559 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 15:08:46.474776 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474780 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 15:08:46.474776 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474783 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 15:08:46.475100 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474786 2559 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 15:08:46.475100 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474788 2559 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 15:08:46.475100 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474791 2559 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 15:08:46.475100 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474793 2559 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 15:08:46.475100 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474796 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 15:08:46.475100 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474803 2559 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 15:08:46.475100 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474806 2559 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 15:08:46.475100 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474808 2559 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 15:08:46.475100 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474811 2559 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 15:08:46.475100 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474813 2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 15:08:46.475100 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474816 2559 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 15:08:46.475100 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474818 2559 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 15:08:46.475100 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474821 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 15:08:46.475100 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474823 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 15:08:46.475100 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474826 2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 15:08:46.475100 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474828 2559 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 15:08:46.475100 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474831 2559 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 15:08:46.475100 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474834 2559 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 15:08:46.475100 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474836 2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 15:08:46.475100 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474839 2559 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 15:08:46.475596 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474841 2559 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 15:08:46.475596 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474844 2559 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 15:08:46.475596 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474848 2559 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 15:08:46.475596 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474851 2559 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 15:08:46.475596 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474854 2559 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 15:08:46.475596 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474857 2559 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 15:08:46.475596 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474860 2559 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 15:08:46.475596 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474862 2559 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 15:08:46.475596 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474865 2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 15:08:46.475596 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474868 2559 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 15:08:46.475596 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474870 2559 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 15:08:46.475596 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474873 2559 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 15:08:46.475596 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474876 2559 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 15:08:46.475596 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474879 2559 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 15:08:46.475596 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474882 2559 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 15:08:46.475596 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474884 2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 15:08:46.475596 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474887 2559 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 15:08:46.475596 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474890 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 15:08:46.475596 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474892 2559 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 15:08:46.475596 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474894 2559 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 15:08:46.476141 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474897 2559 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 15:08:46.476141 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474899 2559 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 15:08:46.476141 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474902 2559 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 15:08:46.476141 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474905 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 15:08:46.476141 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474908 2559 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 15:08:46.476141 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474910 2559 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 15:08:46.476141 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474913 2559 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 15:08:46.476141 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474916 2559 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 15:08:46.476141 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474918 2559 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 15:08:46.476141 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474922 2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 15:08:46.476141 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474924 2559 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 15:08:46.476141 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474927 2559 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 15:08:46.476141 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474930 2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 15:08:46.476141 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474933 2559 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 15:08:46.476141 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474935 2559 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 15:08:46.476141 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474938 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 15:08:46.476141 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474941 2559 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 15:08:46.476141 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474943 2559 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 15:08:46.476141 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474946 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 15:08:46.476141 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474948 2559 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 15:08:46.476629 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474951 2559 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 15:08:46.476629 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474953 2559 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 15:08:46.476629 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474956 2559 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 15:08:46.476629 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474958 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 15:08:46.476629 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474961 2559 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 15:08:46.476629 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474965 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 15:08:46.476629 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474968 2559 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 15:08:46.476629 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474970 2559 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 15:08:46.476629 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474973 2559 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 15:08:46.476629 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474976 2559 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 15:08:46.476629 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.474978 2559 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 15:08:46.476629 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475373 2559 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 15:08:46.476629 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475378 2559 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 15:08:46.476629 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475381 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 15:08:46.476629 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475385 2559 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 15:08:46.476629 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475388 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 15:08:46.476629 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475391 2559 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 15:08:46.476629 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475394 2559 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 15:08:46.476629 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475397 2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 15:08:46.476629 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475400 2559 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 15:08:46.477113 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475403 2559 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 15:08:46.477113 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475406 2559 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 15:08:46.477113 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475408 2559 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 15:08:46.477113 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475417 2559 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 15:08:46.477113 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475420 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 15:08:46.477113 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475422 2559 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 15:08:46.477113 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475425 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 15:08:46.477113 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475428 2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 15:08:46.477113 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475430 2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 15:08:46.477113 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475433 2559 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 15:08:46.477113 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475435 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 15:08:46.477113 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475438 2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 15:08:46.477113 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475441 2559 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 15:08:46.477113 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475443 2559 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 15:08:46.477113 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475446 2559 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 15:08:46.477113 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475448 2559 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 15:08:46.477113 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475451 2559 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 15:08:46.477113 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475454 2559 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 15:08:46.477113 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475457 2559 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 15:08:46.477113 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475459 2559 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 15:08:46.477612 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475462 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 15:08:46.477612 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475464 2559 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 15:08:46.477612 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475467 2559 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 15:08:46.477612 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475469 2559 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 15:08:46.477612 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475472 2559 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 15:08:46.477612 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475475 2559 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 15:08:46.477612 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475479 2559 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 15:08:46.477612 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475496 2559 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 15:08:46.477612 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475500 2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 15:08:46.477612 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475502 2559 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 15:08:46.477612 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475505 2559 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 15:08:46.477612 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475508 2559 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 15:08:46.477612 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475511 2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 15:08:46.477612 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475513 2559 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 15:08:46.477612 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475516 2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 15:08:46.477612 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475518 2559 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 15:08:46.477612 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475522 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 15:08:46.477612 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475524 2559 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 15:08:46.477612 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475528 2559 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 15:08:46.477612 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475532 2559 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 15:08:46.478090 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475535 2559 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 15:08:46.478090 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475538 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 15:08:46.478090 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475540 2559 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 15:08:46.478090 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475543 2559 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 15:08:46.478090 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475546 2559 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 15:08:46.478090 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475549 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 15:08:46.478090 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475552 2559 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 15:08:46.478090 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475554 2559 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 15:08:46.478090 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475557 2559 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 15:08:46.478090 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475560 2559 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 15:08:46.478090 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475563 2559 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 15:08:46.478090 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475567 2559 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 15:08:46.478090 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475569 2559 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 15:08:46.478090 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475572 2559 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 15:08:46.478090 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475575 2559 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 15:08:46.478090 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475578 2559 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 15:08:46.478090 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475580 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 15:08:46.478090 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475583 2559 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 15:08:46.478090 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475585 2559 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 15:08:46.478090 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475588 2559 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 15:08:46.478582 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475590 2559 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 15:08:46.478582 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475593 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 15:08:46.478582 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475595 2559 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 15:08:46.478582 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475598 2559 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 15:08:46.478582 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475601 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 15:08:46.478582 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475604 2559 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 15:08:46.478582 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475607 2559 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 15:08:46.478582 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475609 2559 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 15:08:46.478582 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475612 2559 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 15:08:46.478582 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475615 2559 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 15:08:46.478582 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475618 2559 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 15:08:46.478582 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475620 2559 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 15:08:46.478582 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475623 2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 15:08:46.478582 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475625 2559 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 15:08:46.478582 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475628 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 15:08:46.478582 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475630 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 15:08:46.478582 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.475633 2559 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 15:08:46.478582 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476456 2559 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 15:08:46.478582 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476464 2559 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 15:08:46.478582 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476472 2559 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 15:08:46.479062 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476477 2559 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 15:08:46.479062 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476481 2559 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 15:08:46.479062 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476497 2559 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 15:08:46.479062 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476502 2559 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 15:08:46.479062 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476507 2559 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 15:08:46.479062 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476510 2559 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 15:08:46.479062 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476513 2559 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 15:08:46.479062 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476517 2559 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 15:08:46.479062 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476520 2559 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 15:08:46.479062 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476523 2559 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 15:08:46.479062 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476526 2559 flags.go:64] FLAG: --cgroup-root=""
Apr 22 15:08:46.479062 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476530 2559 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 15:08:46.479062 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476533 2559 flags.go:64] FLAG: --client-ca-file=""
Apr 22 15:08:46.479062 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476535 2559 flags.go:64] FLAG: --cloud-config=""
Apr 22 15:08:46.479062 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476538 2559 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 15:08:46.479062 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476541 2559 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 15:08:46.479062 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476545 2559 flags.go:64] FLAG: --cluster-domain=""
Apr 22 15:08:46.479062 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476548 2559 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 15:08:46.479062 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476552 2559 flags.go:64] FLAG: --config-dir=""
Apr 22 15:08:46.479062 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476555 2559 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 15:08:46.479062 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476558 2559 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 15:08:46.479062 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476562 2559 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 15:08:46.479062 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476566 2559 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 15:08:46.479062 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476569 2559 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 15:08:46.479673 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476572 2559 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 15:08:46.479673 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476575 2559 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 15:08:46.479673 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476578 2559 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 15:08:46.479673 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476581 2559 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 15:08:46.479673 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476584 2559 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 15:08:46.479673 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476587 2559 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 15:08:46.479673 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476592 2559 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 15:08:46.479673 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476595 2559 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 15:08:46.479673 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476599 2559 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 15:08:46.479673 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476602 2559 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 15:08:46.479673 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476605 2559 flags.go:64] FLAG: --enable-server="true"
Apr 22 15:08:46.479673 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476608 2559 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 15:08:46.479673 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476613 2559 flags.go:64] FLAG: --event-burst="100"
Apr 22 15:08:46.479673 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476616 2559 flags.go:64] FLAG: --event-qps="50"
Apr 22 15:08:46.479673 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476619 2559 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 15:08:46.479673 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476623 2559 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 15:08:46.479673 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476626 2559 flags.go:64] FLAG: --eviction-hard=""
Apr 22 15:08:46.479673 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476630 2559 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 15:08:46.479673 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476633 2559 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 15:08:46.479673 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476636 2559 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 15:08:46.479673 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476639 2559 flags.go:64] FLAG: --eviction-soft=""
Apr 22 15:08:46.479673 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476642 2559 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 15:08:46.479673 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476645 2559 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 15:08:46.479673 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476648 2559 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 15:08:46.479673 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476651 2559 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 15:08:46.480268 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476654 2559 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 15:08:46.480268 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476657 2559 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 15:08:46.480268 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476659 2559 flags.go:64] FLAG: --feature-gates=""
Apr 22 15:08:46.480268 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476663 2559 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 15:08:46.480268 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476667 2559 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 15:08:46.480268 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476670 2559 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 22 15:08:46.480268 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476673 2559 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 22 15:08:46.480268 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476676 2559 flags.go:64] FLAG: --healthz-port="10248"
Apr 22 15:08:46.480268 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476679 2559 flags.go:64] FLAG: --help="false"
Apr 22 15:08:46.480268 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476682 2559 flags.go:64] FLAG: --hostname-override="ip-10-0-141-167.ec2.internal"
Apr 22 15:08:46.480268 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476685 2559 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 22 15:08:46.480268 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476688 2559 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 22 15:08:46.480268 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476691 2559 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 22 15:08:46.480268 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476695 2559 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 22 15:08:46.480268 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476698 2559 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 22 15:08:46.480268 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476702 2559 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 22 15:08:46.480268 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476705 2559 flags.go:64] FLAG: --image-service-endpoint=""
Apr 22 15:08:46.480268 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476721 2559 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 22 15:08:46.480268 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476725 2559 flags.go:64] FLAG: --kube-api-burst="100"
Apr 22 15:08:46.480268 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476729 2559 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 22 15:08:46.480268 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476732 2559 flags.go:64] FLAG: --kube-api-qps="50"
Apr 22 15:08:46.480268 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476735 2559 flags.go:64] FLAG: --kube-reserved=""
Apr 22 15:08:46.480268 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476738 2559 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 22 15:08:46.480268 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476741 2559 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 22 15:08:46.480891 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476744 2559 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 22 15:08:46.480891 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476747 2559 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 22 15:08:46.480891 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476750 2559 flags.go:64] FLAG: --lock-file=""
Apr 22 15:08:46.480891 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476753 2559 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 22 15:08:46.480891 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476756 2559 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 22 15:08:46.480891 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476759 2559 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 22 15:08:46.480891 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476764 2559 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 22 15:08:46.480891 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476767 2559 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 22 15:08:46.480891 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476770 2559 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 22 15:08:46.480891 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476772 2559 flags.go:64] FLAG: --logging-format="text"
Apr 22 15:08:46.480891 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476775 2559 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 22 15:08:46.480891 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476779 2559 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 22 15:08:46.480891 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476782 2559 flags.go:64] FLAG: --manifest-url=""
Apr 22 15:08:46.480891 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476785 2559 flags.go:64] FLAG: --manifest-url-header=""
Apr 22 15:08:46.480891 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476791 2559 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 22 15:08:46.480891 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476795 2559 flags.go:64] FLAG: --max-open-files="1000000"
Apr 22 15:08:46.480891 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476800 2559 flags.go:64] FLAG: --max-pods="110"
Apr 22 15:08:46.480891 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476803 2559 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 22 15:08:46.480891 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476806 2559 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 22 15:08:46.480891 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476809 2559 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 22 15:08:46.480891 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476812 2559 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 22 15:08:46.480891 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476815 2559 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 22 15:08:46.480891 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476818 2559 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 22 15:08:46.480891 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476822 2559 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 22 15:08:46.480891 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476829 2559 flags.go:64] FLAG: --node-status-max-images="50"
Apr 22 15:08:46.481496 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476832 2559 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 22 15:08:46.481496 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476845 2559 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 22 15:08:46.481496 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476849 2559 flags.go:64] FLAG: --pod-cidr=""
Apr 22 15:08:46.481496 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476852 2559 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 22 15:08:46.481496 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476858 2559 flags.go:64] FLAG: --pod-manifest-path=""
Apr 22 15:08:46.481496 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476861 2559 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 22 15:08:46.481496 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476865 2559 flags.go:64] FLAG: --pods-per-core="0"
Apr 22 15:08:46.481496 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476868 2559 flags.go:64] FLAG: --port="10250"
Apr 22 15:08:46.481496 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476871 2559 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 22 15:08:46.481496 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476873 2559 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0c886a5c28eecf28b"
Apr 22 15:08:46.481496 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476877 2559 flags.go:64] FLAG: --qos-reserved=""
Apr 22 15:08:46.481496 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476880 2559 flags.go:64] FLAG: --read-only-port="10255"
Apr 22 15:08:46.481496 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476883 2559 flags.go:64] FLAG: --register-node="true"
Apr 22 15:08:46.481496 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476886 2559 flags.go:64] FLAG: --register-schedulable="true"
Apr 22 15:08:46.481496 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476888 2559 flags.go:64] FLAG: --register-with-taints=""
Apr 22 15:08:46.481496 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476892 2559 flags.go:64] FLAG: --registry-burst="10"
Apr 22 15:08:46.481496 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476895 2559 flags.go:64] FLAG: --registry-qps="5"
Apr 22 15:08:46.481496 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476898 2559 flags.go:64] FLAG: --reserved-cpus=""
Apr 22 15:08:46.481496 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476901 2559 flags.go:64] FLAG: --reserved-memory=""
Apr 22 15:08:46.481496 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476905 2559 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 22 15:08:46.481496 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476908 2559 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 22 15:08:46.481496 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476911 2559 flags.go:64] FLAG: --rotate-certificates="false"
Apr 22 15:08:46.481496 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476914 2559 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 22 15:08:46.481496 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476917 2559 flags.go:64] FLAG: --runonce="false"
Apr 22 15:08:46.481496 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476920 2559 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 22 15:08:46.482151 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476923 2559 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 22 15:08:46.482151 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476927 2559 flags.go:64] FLAG: --seccomp-default="false"
Apr 22 15:08:46.482151 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476929 2559 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 22 15:08:46.482151 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476932 2559 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 22 15:08:46.482151 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476935 2559 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 22 15:08:46.482151 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476939 2559 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 22 15:08:46.482151 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476942 2559 flags.go:64] FLAG: --storage-driver-password="root"
Apr 22 15:08:46.482151 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476945 2559 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 22 15:08:46.482151 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476948 2559 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 22 15:08:46.482151 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476950 2559 flags.go:64] FLAG: --storage-driver-user="root"
Apr 22 15:08:46.482151 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476953 2559 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 22 15:08:46.482151 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476956 2559 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 22 15:08:46.482151 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476959 2559 flags.go:64] FLAG: --system-cgroups=""
Apr 22 15:08:46.482151 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476962 2559 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 22 15:08:46.482151 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476968 2559 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 22 15:08:46.482151 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476971 2559 flags.go:64] FLAG: --tls-cert-file=""
Apr 22 15:08:46.482151 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476974 2559 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 22 15:08:46.482151 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476977 2559 flags.go:64] FLAG: --tls-min-version=""
Apr 22 15:08:46.482151 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476981 2559 flags.go:64] FLAG: --tls-private-key-file=""
Apr 22 15:08:46.482151 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476984 2559 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 22 15:08:46.482151 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476986 2559 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 22 15:08:46.482151 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476989 2559 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 22 15:08:46.482151 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476992 2559 flags.go:64] FLAG: --v="2"
Apr 22 15:08:46.482151 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.476997 2559 flags.go:64] FLAG: --version="false"
Apr 22 15:08:46.482151 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.477001 2559 flags.go:64] FLAG: --vmodule=""
Apr 22 15:08:46.482775 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.477004 2559 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 22 15:08:46.482775 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.477008 2559 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 22 15:08:46.482775 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477096 2559 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 15:08:46.482775 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477102 2559 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 15:08:46.482775 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477106 2559 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 15:08:46.482775 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477109 2559 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 15:08:46.482775 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477113 2559 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 15:08:46.482775 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477116 2559 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 15:08:46.482775 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477119 2559 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 15:08:46.482775 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477122 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 15:08:46.482775 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477125 2559 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 15:08:46.482775 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477128 2559 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 15:08:46.482775 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477134 2559 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 15:08:46.482775 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477136 2559 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 15:08:46.482775 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477139 2559 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 15:08:46.482775 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477141 2559 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 15:08:46.482775 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477144 2559 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 15:08:46.482775 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477147 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 15:08:46.482775 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477149 2559 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 15:08:46.482775 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477152 2559 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 15:08:46.483269 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477155 2559 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 15:08:46.483269 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477158 2559 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 15:08:46.483269 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477160 2559 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 15:08:46.483269 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477163 2559 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 15:08:46.483269 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477165 2559 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 15:08:46.483269 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477168 2559 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 15:08:46.483269 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477170 2559 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 15:08:46.483269 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477173 2559 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 15:08:46.483269 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477176 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 15:08:46.483269 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477178 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 15:08:46.483269 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477181 2559 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 15:08:46.483269 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477183 2559 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 15:08:46.483269 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477187 2559 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 15:08:46.483269 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477191 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 15:08:46.483269 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477193 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 15:08:46.483269 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477198 2559 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 15:08:46.483269 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477200 2559 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 15:08:46.483269 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477203 2559 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 15:08:46.483269 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477206 2559 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 15:08:46.483816 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477213 2559 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 15:08:46.483816 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477215 2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 15:08:46.483816 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477218 2559 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 15:08:46.483816 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477220 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 15:08:46.483816 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477223 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 15:08:46.483816 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477227 2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 15:08:46.483816 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477230 2559 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 15:08:46.483816 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477232 2559 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 15:08:46.483816 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477235 2559 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 15:08:46.483816 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477238 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 15:08:46.483816 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477240 2559 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 15:08:46.483816 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477243 2559 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 15:08:46.483816 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477245 2559 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 15:08:46.483816 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477248 2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 15:08:46.483816 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477251 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 15:08:46.483816 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477253 2559 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 15:08:46.483816 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477255 2559 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 15:08:46.483816 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477258 2559 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 15:08:46.483816 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477260 2559 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 15:08:46.483816 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477263 2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 15:08:46.484339 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477265 2559 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 15:08:46.484339 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477268 2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 15:08:46.484339 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477271 2559 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 15:08:46.484339 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477273 2559 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 15:08:46.484339 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477276 2559 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 15:08:46.484339 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477278 2559 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 15:08:46.484339 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477281 2559 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 15:08:46.484339 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477283 2559 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 15:08:46.484339 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477287 2559 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 15:08:46.484339 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477291 2559 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 15:08:46.484339 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477294 2559 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 15:08:46.484339 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477297 2559 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 15:08:46.484339 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477300 2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 15:08:46.484339 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477308 2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 15:08:46.484339 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477311 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 15:08:46.484339 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477313 2559 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 15:08:46.484339 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477316 2559 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 15:08:46.484339 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477320 2559 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 15:08:46.484339 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477323 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 15:08:46.484339 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477325 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 15:08:46.484839 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477328 2559 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 15:08:46.484839 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477330 2559 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 15:08:46.484839 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477333 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 15:08:46.484839 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477335 2559 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 15:08:46.484839 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477338 2559 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 15:08:46.484839 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477340 2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 15:08:46.484839 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477343 2559 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 15:08:46.484839 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477346 2559 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 15:08:46.484839 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.477348 2559 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 15:08:46.484839 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.477937 2559 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 15:08:46.485102 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.484880 2559 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 15:08:46.485102 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.484897 2559 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 15:08:46.485102 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.484948 2559 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 15:08:46.485102 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.484953 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 15:08:46.485102 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.484956 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 15:08:46.485102 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.484959 2559 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 15:08:46.485102 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.484962 2559 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 15:08:46.485102 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.484965 2559 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 15:08:46.485102 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.484968 2559 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 15:08:46.485102 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.484971 2559 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 15:08:46.485102 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.484973 2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 15:08:46.485102 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.484976 2559 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 15:08:46.485102 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.484979 2559 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 15:08:46.485102 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.484982 2559 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 15:08:46.485102 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.484985 2559 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 15:08:46.485102 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.484988 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 15:08:46.485102 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.484991 2559 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 15:08:46.485102 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.484994 2559 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 15:08:46.485102 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.484996 2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 15:08:46.485102 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.484999 2559 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 15:08:46.485607 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485001 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 15:08:46.485607 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485004 2559 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 15:08:46.485607 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485006 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 15:08:46.485607 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485010 2559 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 15:08:46.485607 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485012 2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 15:08:46.485607 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485015 2559 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 15:08:46.485607 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485017 2559 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 15:08:46.485607 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485020 2559 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 15:08:46.485607 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485023 2559 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 15:08:46.485607 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485026 2559 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 15:08:46.485607 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485028 2559 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 15:08:46.485607 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485031 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 15:08:46.485607 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485034 2559 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 15:08:46.485607 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485037 2559 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 15:08:46.485607 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485040 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 15:08:46.485607 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485043 2559 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 15:08:46.485607 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485047 2559 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 15:08:46.485607 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485051 2559 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 15:08:46.485607 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485054 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 15:08:46.485607 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485057 2559 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 15:08:46.486092 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485059 2559 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 15:08:46.486092 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485062 2559 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 15:08:46.486092 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485064 2559 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 15:08:46.486092 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485067 2559 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 15:08:46.486092 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485069 2559 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 15:08:46.486092 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485072 2559 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 15:08:46.486092 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485074 2559 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 15:08:46.486092 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485077 2559 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 15:08:46.486092 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485081 2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 15:08:46.486092 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485085 2559 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 15:08:46.486092 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485089 2559 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 15:08:46.486092 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485092 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 15:08:46.486092 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485095 2559 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 15:08:46.486092 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485098 2559 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 15:08:46.486092 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485101 2559 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 15:08:46.486092 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485104 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 15:08:46.486092 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485106 2559 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 15:08:46.486092 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485110 2559 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 15:08:46.486092 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485113 2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 15:08:46.486569 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485115 2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 15:08:46.486569 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485118 2559 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 15:08:46.486569 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485121 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 15:08:46.486569 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485123 2559 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 15:08:46.486569 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485126 2559 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 15:08:46.486569 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485129 2559 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 15:08:46.486569 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485132 2559 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 15:08:46.486569 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485134 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 15:08:46.486569 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485137 2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 15:08:46.486569 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485140 2559 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 15:08:46.486569 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485143 2559 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 15:08:46.486569 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485146 2559 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 15:08:46.486569 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485148 2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 15:08:46.486569 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485151 2559 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 15:08:46.486569 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485154 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 15:08:46.486569 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485156 2559 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 15:08:46.486569 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485159 2559 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 15:08:46.486569 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485161 2559 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 15:08:46.486569 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485164 2559 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 15:08:46.486569 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485167 2559 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 15:08:46.487061 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485170 2559 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 15:08:46.487061 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485173 2559 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 15:08:46.487061 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485176 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 15:08:46.487061 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485178 2559 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 15:08:46.487061 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485181 2559 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 15:08:46.487061 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485184 2559 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 15:08:46.487061 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485186 2559 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 15:08:46.487061 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485189 2559 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 15:08:46.487061 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485191 2559 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 15:08:46.487061 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.485197 2559 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 15:08:46.487061 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485310 2559 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 15:08:46.487061 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485315 2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 15:08:46.487061 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485318 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 15:08:46.487061 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485321 2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 15:08:46.487061 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485324 2559 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 15:08:46.487436 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485327 2559 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 15:08:46.487436 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485330 2559 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 15:08:46.487436 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485333 2559 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 15:08:46.487436 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485336 2559 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 15:08:46.487436 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485338 2559 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 15:08:46.487436 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485341 2559 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 15:08:46.487436 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485344 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 15:08:46.487436 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485347 2559 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 15:08:46.487436 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485349 2559 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 15:08:46.487436 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485352 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 15:08:46.487436 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485355 2559 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 15:08:46.487436 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485357 2559 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 15:08:46.487436 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485360 2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 15:08:46.487436 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485362 2559 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 15:08:46.487436 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485365 2559 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 15:08:46.487436 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485367 2559 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 15:08:46.487436 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485370 2559 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 15:08:46.487436 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485372 2559 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 15:08:46.487436 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485375 2559 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 15:08:46.487436 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485377 2559 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 15:08:46.487941 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485380 2559 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 15:08:46.487941 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485383 2559 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 15:08:46.487941 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485386 2559 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 15:08:46.487941 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485388 2559 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 15:08:46.487941 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485391 2559 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 15:08:46.487941 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485393 2559 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 15:08:46.487941 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485396 2559 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 15:08:46.487941 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485399 2559 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 15:08:46.487941 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485401 2559 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 15:08:46.487941 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485403 2559 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 15:08:46.487941 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485406 2559 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 15:08:46.487941 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485408 2559 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 15:08:46.487941 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485411 2559 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 15:08:46.487941 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485413 2559 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 15:08:46.487941 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485416 2559 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 15:08:46.487941 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485418 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 15:08:46.487941 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485421 2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 15:08:46.487941 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485423 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 15:08:46.487941 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485426 2559 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 15:08:46.487941 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485429 2559 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 15:08:46.488416 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485433 2559 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 15:08:46.488416 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485436 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 15:08:46.488416 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485438 2559 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 15:08:46.488416 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485441 2559 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 15:08:46.488416 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485443 2559 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 15:08:46.488416 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485446 2559 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 15:08:46.488416 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485448 2559 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 15:08:46.488416 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485451 2559 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 15:08:46.488416 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485453 2559 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 15:08:46.488416 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485456 2559 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 15:08:46.488416 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485458 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 15:08:46.488416 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485461 2559 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 15:08:46.488416 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485464 2559 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 15:08:46.488416 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485469 2559 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 15:08:46.488416 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485472 2559 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 15:08:46.488416 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485474 2559 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 15:08:46.488416 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485477 2559 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 15:08:46.488416 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485479 2559 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 15:08:46.488416 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485481 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 15:08:46.488416 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485500 2559 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 15:08:46.488919 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485505 2559 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 15:08:46.488919 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485508 2559 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 15:08:46.488919 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485513 2559 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 15:08:46.488919 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485516 2559 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 15:08:46.488919 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485518 2559 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 15:08:46.488919 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485521 2559 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 15:08:46.488919 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485524 2559 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 15:08:46.488919 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485527 2559 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 15:08:46.488919 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485529 2559 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 15:08:46.488919 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485533 2559 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 15:08:46.488919 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485535 2559 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 15:08:46.488919 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485539 2559 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 15:08:46.488919 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485541 2559 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 15:08:46.488919 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485544 2559 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 15:08:46.488919 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485546 2559 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 15:08:46.488919 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485549 2559 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 15:08:46.488919 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485551 2559 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 15:08:46.488919 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485554 2559 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 15:08:46.488919 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485558 2559 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 15:08:46.489379 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485561 2559 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 15:08:46.489379 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:46.485564 2559 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 15:08:46.489379 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.485568 2559 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 15:08:46.489379 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.486286 2559 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 15:08:46.489379 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.488309 2559 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 15:08:46.489379 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.489217 2559 server.go:1019] "Starting client certificate rotation"
Apr 22 15:08:46.489379 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.489314 2559 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 15:08:46.489982 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.489969 2559 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 15:08:46.513959 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.513942 2559 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 15:08:46.521635 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.521606 2559 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 15:08:46.536322 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.536300 2559 log.go:25] "Validated CRI v1 runtime API"
Apr 22 15:08:46.541300 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.541284 2559 log.go:25] "Validated CRI v1 image API"
Apr 22 15:08:46.542594 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.542573 2559 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 15:08:46.546985 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.546966 2559 fs.go:135] Filesystem UUIDs: map[170be377-b798-428e-b4cc-2419a1aa701c:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 a4beff2e-27f2-44ac-8cae-2619564454b2:/dev/nvme0n1p3]
Apr 22 15:08:46.547045 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.546986 2559 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 15:08:46.550338 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.550321 2559 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 15:08:46.552630 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.552503 2559 manager.go:217] Machine: {Timestamp:2026-04-22 15:08:46.550670892 +0000 UTC m=+0.410552198 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099064 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec29bf45a81d2d2dcbffddac7159d39f SystemUUID:ec29bf45-a81d-2d2d-cbff-ddac7159d39f BootID:0b1ccd26-915f-4c39-aed8-b6f23521e552 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:8f:c3:37:a3:cd Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:8f:c3:37:a3:cd Speed:0 Mtu:9001} {Name:ovs-system MacAddress:66:73:51:e8:6d:83 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 15:08:46.552630 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.552624 2559 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 15:08:46.552743 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.552700 2559 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 15:08:46.555310 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.555285 2559 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 15:08:46.555451 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.555312 2559 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-141-167.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 15:08:46.555513 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.555460 2559 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 15:08:46.555513 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.555468 2559 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 15:08:46.555565 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.555480 2559 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 15:08:46.556273 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.556263 2559 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 15:08:46.557136 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.557126 2559 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 15:08:46.557279 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.557270 2559 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 22 15:08:46.560083 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.560072 2559 kubelet.go:491] "Attempting to sync node with API server"
Apr 22 15:08:46.560119 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.560087 2559 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 22 15:08:46.560119 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.560105 2559 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 22 15:08:46.560119 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.560118 2559 kubelet.go:397] "Adding apiserver pod source"
Apr 22 15:08:46.560202 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.560127 2559 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 22 15:08:46.561196 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.561185 2559 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 15:08:46.561234 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.561204 2559 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 15:08:46.563879 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.563863 2559 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 22 15:08:46.565108 ip-10-0-141-167
kubenswrapper[2559]: I0422 15:08:46.565096 2559 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 15:08:46.566884 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.566870 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 15:08:46.566965 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.566888 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 15:08:46.566965 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.566895 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 15:08:46.566965 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.566900 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 15:08:46.566965 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.566911 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 15:08:46.566965 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.566917 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 15:08:46.566965 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.566922 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 15:08:46.566965 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.566928 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 15:08:46.566965 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.566934 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 15:08:46.566965 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.566940 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 15:08:46.566965 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.566955 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 
15:08:46.566965 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.566964 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 15:08:46.567771 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.567761 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 15:08:46.567771 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.567771 2559 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 15:08:46.572283 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.572261 2559 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 15:08:46.572381 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.572319 2559 server.go:1295] "Started kubelet" Apr 22 15:08:46.573357 ip-10-0-141-167 systemd[1]: Started Kubernetes Kubelet. Apr 22 15:08:46.573530 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.573455 2559 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 15:08:46.573916 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.573730 2559 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 15:08:46.573916 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.573913 2559 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 15:08:46.574953 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.574939 2559 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 15:08:46.576812 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.576797 2559 server.go:317] "Adding debug handlers to kubelet server" Apr 22 15:08:46.579545 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.579528 2559 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-141-167.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster 
scope Apr 22 15:08:46.579545 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:46.579533 2559 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-141-167.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 15:08:46.579656 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:46.579624 2559 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 15:08:46.579820 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.579801 2559 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 15:08:46.580332 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.580314 2559 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 15:08:46.581204 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:46.581183 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-167.ec2.internal\" not found" Apr 22 15:08:46.581546 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.581516 2559 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 15:08:46.581990 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.581971 2559 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 15:08:46.582080 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.581994 2559 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 15:08:46.582168 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.582155 2559 reconstruct.go:97] "Volume reconstruction finished" Apr 22 15:08:46.582226 ip-10-0-141-167 kubenswrapper[2559]: I0422 
15:08:46.582169 2559 reconciler.go:26] "Reconciler: start to sync state" Apr 22 15:08:46.582767 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.582751 2559 factory.go:55] Registering systemd factory Apr 22 15:08:46.582851 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.582801 2559 factory.go:223] Registration of the systemd container factory successfully Apr 22 15:08:46.582903 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:46.582873 2559 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-141-167.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 22 15:08:46.583164 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:46.583143 2559 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 15:08:46.583434 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.583420 2559 factory.go:153] Registering CRI-O factory Apr 22 15:08:46.583531 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.583437 2559 factory.go:223] Registration of the crio container factory successfully Apr 22 15:08:46.583531 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.583500 2559 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 15:08:46.583616 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.583528 2559 factory.go:103] Registering Raw factory Apr 22 15:08:46.583700 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.583550 2559 
manager.go:1196] Started watching for new ooms in manager Apr 22 15:08:46.584014 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:46.583011 2559 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-167.ec2.internal.18a8b6546433b925 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-167.ec2.internal,UID:ip-10-0-141-167.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-141-167.ec2.internal,},FirstTimestamp:2026-04-22 15:08:46.572280101 +0000 UTC m=+0.432161414,LastTimestamp:2026-04-22 15:08:46.572280101 +0000 UTC m=+0.432161414,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-167.ec2.internal,}" Apr 22 15:08:46.584143 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.584047 2559 manager.go:319] Starting recovery of all containers Apr 22 15:08:46.593629 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.593606 2559 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-276x9" Apr 22 15:08:46.594234 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.594219 2559 manager.go:324] Recovery completed Apr 22 15:08:46.595690 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:46.595663 2559 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 22 15:08:46.599941 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.599926 2559 kubelet_node_status.go:413] "Setting node annotation to 
enable volume controller attach/detach" Apr 22 15:08:46.604734 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.604720 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-167.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:08:46.604795 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.604746 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-167.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:08:46.604795 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.604756 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-167.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:08:46.605234 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.605218 2559 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 15:08:46.605234 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.605232 2559 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 15:08:46.605331 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.605248 2559 state_mem.go:36] "Initialized new in-memory state store" Apr 22 15:08:46.607069 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:46.607005 2559 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-167.ec2.internal.18a8b6546622ed4f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-167.ec2.internal,UID:ip-10-0-141-167.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-141-167.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-141-167.ec2.internal,},FirstTimestamp:2026-04-22 15:08:46.604733775 +0000 UTC m=+0.464615081,LastTimestamp:2026-04-22 15:08:46.604733775 +0000 
UTC m=+0.464615081,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-167.ec2.internal,}" Apr 22 15:08:46.607627 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.607610 2559 policy_none.go:49] "None policy: Start" Apr 22 15:08:46.607627 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.607627 2559 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 15:08:46.607725 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.607637 2559 state_mem.go:35] "Initializing new in-memory state store" Apr 22 15:08:46.613192 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.613171 2559 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-276x9" Apr 22 15:08:46.650579 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.646038 2559 manager.go:341] "Starting Device Plugin manager" Apr 22 15:08:46.650579 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:46.646064 2559 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 15:08:46.650579 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.646074 2559 server.go:85] "Starting device plugin registration server" Apr 22 15:08:46.650579 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.646258 2559 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 15:08:46.650579 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.646267 2559 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 15:08:46.650579 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.646374 2559 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 15:08:46.650579 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.646458 2559 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) 
starts" Apr 22 15:08:46.650579 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.646467 2559 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 15:08:46.650579 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:46.646923 2559 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 15:08:46.650579 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:46.646981 2559 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-167.ec2.internal\" not found" Apr 22 15:08:46.706631 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.706604 2559 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 15:08:46.707785 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.707761 2559 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 15:08:46.707785 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.707788 2559 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 15:08:46.707916 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.707805 2559 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 22 15:08:46.707916 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.707811 2559 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 22 15:08:46.707916 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:46.707842 2559 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 22 15:08:46.711694 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.711674 2559 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 15:08:46.747198 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.747154 2559 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 15:08:46.747972 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.747958 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-167.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 15:08:46.748049 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.747986 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-167.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 15:08:46.748049 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.748000 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-167.ec2.internal" event="NodeHasSufficientPID"
Apr 22 15:08:46.748049 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.748027 2559 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-167.ec2.internal"
Apr 22 15:08:46.760767 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.760751 2559 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-167.ec2.internal"
Apr 22 15:08:46.760812 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:46.760769 2559 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-141-167.ec2.internal\": node \"ip-10-0-141-167.ec2.internal\" not found"
Apr 22 15:08:46.794619 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:46.794600 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-167.ec2.internal\" not found"
Apr 22 15:08:46.807902 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.807876 2559 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-167.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-141-167.ec2.internal"]
Apr 22 15:08:46.807983 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.807947 2559 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 15:08:46.808691 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.808678 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-167.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 15:08:46.808754 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.808702 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-167.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 15:08:46.808754 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.808712 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-167.ec2.internal" event="NodeHasSufficientPID"
Apr 22 15:08:46.810055 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.810044 2559 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 15:08:46.810205 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.810192 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-167.ec2.internal"
Apr 22 15:08:46.810268 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.810220 2559 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 15:08:46.810727 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.810713 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-167.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 15:08:46.810799 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.810732 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-167.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 15:08:46.810799 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.810742 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-167.ec2.internal" event="NodeHasSufficientPID"
Apr 22 15:08:46.810799 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.810742 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-167.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 15:08:46.810799 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.810768 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-167.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 15:08:46.810799 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.810785 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-167.ec2.internal" event="NodeHasSufficientPID"
Apr 22 15:08:46.811941 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.811926 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-167.ec2.internal"
Apr 22 15:08:46.812006 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.811955 2559 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 15:08:46.812566 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.812552 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-167.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 15:08:46.812642 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.812582 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-167.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 15:08:46.812642 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.812596 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-167.ec2.internal" event="NodeHasSufficientPID"
Apr 22 15:08:46.841085 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:46.841066 2559 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-167.ec2.internal\" not found" node="ip-10-0-141-167.ec2.internal"
Apr 22 15:08:46.845340 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:46.845326 2559 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-167.ec2.internal\" not found" node="ip-10-0-141-167.ec2.internal"
Apr 22 15:08:46.884265 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.884243 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9c285a467f3e432eb214510e6f590b8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-167.ec2.internal\" (UID: \"e9c285a467f3e432eb214510e6f590b8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-167.ec2.internal"
Apr 22 15:08:46.884334 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.884267 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9c285a467f3e432eb214510e6f590b8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-167.ec2.internal\" (UID: \"e9c285a467f3e432eb214510e6f590b8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-167.ec2.internal"
Apr 22 15:08:46.884334 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.884292 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bc91d8338dec73ddfaae16ceba9bdb9c-config\") pod \"kube-apiserver-proxy-ip-10-0-141-167.ec2.internal\" (UID: \"bc91d8338dec73ddfaae16ceba9bdb9c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-167.ec2.internal"
Apr 22 15:08:46.895336 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:46.895317 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-167.ec2.internal\" not found"
Apr 22 15:08:46.984428 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.984394 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9c285a467f3e432eb214510e6f590b8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-167.ec2.internal\" (UID: \"e9c285a467f3e432eb214510e6f590b8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-167.ec2.internal"
Apr 22 15:08:46.984428 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.984416 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9c285a467f3e432eb214510e6f590b8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-167.ec2.internal\" (UID: \"e9c285a467f3e432eb214510e6f590b8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-167.ec2.internal"
Apr 22 15:08:46.984646 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.984445 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9c285a467f3e432eb214510e6f590b8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-167.ec2.internal\" (UID: \"e9c285a467f3e432eb214510e6f590b8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-167.ec2.internal"
Apr 22 15:08:46.984646 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.984473 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bc91d8338dec73ddfaae16ceba9bdb9c-config\") pod \"kube-apiserver-proxy-ip-10-0-141-167.ec2.internal\" (UID: \"bc91d8338dec73ddfaae16ceba9bdb9c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-167.ec2.internal"
Apr 22 15:08:46.984646 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.984532 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bc91d8338dec73ddfaae16ceba9bdb9c-config\") pod \"kube-apiserver-proxy-ip-10-0-141-167.ec2.internal\" (UID: \"bc91d8338dec73ddfaae16ceba9bdb9c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-167.ec2.internal"
Apr 22 15:08:46.984646 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:46.984538 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9c285a467f3e432eb214510e6f590b8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-167.ec2.internal\" (UID: \"e9c285a467f3e432eb214510e6f590b8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-167.ec2.internal"
Apr 22 15:08:46.995421 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:46.995403 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-167.ec2.internal\" not found"
Apr 22 15:08:47.096332 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:47.096266 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-167.ec2.internal\" not found"
Apr 22 15:08:47.143117 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.143091 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-167.ec2.internal"
Apr 22 15:08:47.147747 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.147729 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-167.ec2.internal"
Apr 22 15:08:47.196695 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:47.196671 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-167.ec2.internal\" not found"
Apr 22 15:08:47.297214 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:47.297189 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-167.ec2.internal\" not found"
Apr 22 15:08:47.397789 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:47.397735 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-167.ec2.internal\" not found"
Apr 22 15:08:47.489296 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.489270 2559 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 15:08:47.489873 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.489395 2559 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 15:08:47.498413 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:47.498395 2559 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-167.ec2.internal\" not found"
Apr 22 15:08:47.531035 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.531014 2559 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 15:08:47.543412 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.543393 2559 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 15:08:47.560696 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.560675 2559 apiserver.go:52] "Watching apiserver"
Apr 22 15:08:47.571270 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.571253 2559 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 15:08:47.571566 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.571547 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-p6k5v","openshift-multus/multus-xsl6l","openshift-multus/network-metrics-daemon-8hzhw","openshift-network-diagnostics/network-check-target-w22tj","openshift-network-operator/iptables-alerter-pppj5","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kmhfn","openshift-cluster-node-tuning-operator/tuned-85mbp","openshift-image-registry/node-ca-tqshj","openshift-multus/multus-additional-cni-plugins-2scbl","openshift-ovn-kubernetes/ovnkube-node-4bfdn","kube-system/konnectivity-agent-7lh45"]
Apr 22 15:08:47.573749 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.573734 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xsl6l"
Apr 22 15:08:47.574859 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.574813 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-pppj5"
Apr 22 15:08:47.575896 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.575877 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kmhfn"
Apr 22 15:08:47.575979 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.575951 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-85mbp"
Apr 22 15:08:47.576621 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.576603 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 15:08:47.576715 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.576602 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 15:08:47.576997 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.576983 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-tqshj"
Apr 22 15:08:47.577896 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.577880 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 15:08:47.578156 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.578133 2559 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2scbl" Apr 22 15:08:47.578561 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.578544 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 15:08:47.579285 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.579263 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-w64wz\"" Apr 22 15:08:47.579451 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.579433 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" Apr 22 15:08:47.579938 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.579659 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 15:08:47.579938 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.579799 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 15:08:47.579938 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.579805 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 15:08:47.579938 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.579895 2559 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 15:08:47.580157 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.579895 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 15:08:47.580214 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.580185 2559 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-42w7s\"" Apr 22 15:08:47.580214 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.580205 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 15:08:47.580310 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.580262 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 15:08:47.580520 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.580500 2559 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-167.ec2.internal" Apr 22 15:08:47.580709 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.580685 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 15:08:47.580811 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.580714 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 15:08:47.580811 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.580717 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 15:08:47.580811 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.580791 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-txldn\"" Apr 22 15:08:47.580969 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.580696 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 15:08:47.580969 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.580931 2559 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-7lh45" Apr 22 15:08:47.580969 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.580944 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 15:08:47.581115 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.581012 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-5g8cc\"" Apr 22 15:08:47.581115 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.581022 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 15:08:47.581115 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.581076 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-cg6jl\"" Apr 22 15:08:47.581262 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.581126 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 15:08:47.582164 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.582142 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-p6k5v" Apr 22 15:08:47.583327 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.583311 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8hzhw" Apr 22 15:08:47.583403 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:47.583386 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8hzhw" podUID="a5671169-29ab-4157-acb0-207c8e83501a" Apr 22 15:08:47.584346 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.584327 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 15:08:47.584346 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.584339 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 15:08:47.584477 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.584338 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 15:08:47.584477 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.584384 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 15:08:47.584477 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.584392 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-82vnw\"" Apr 22 15:08:47.584694 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.584672 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w22tj" Apr 22 15:08:47.584796 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:47.584732 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-w22tj" podUID="7fff4b63-00a5-4c92-8135-75133481b228" Apr 22 15:08:47.584796 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.584755 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 15:08:47.585600 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.585559 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-g4ghc\"" Apr 22 15:08:47.585600 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.585575 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 15:08:47.585600 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.585583 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-dmdzt\"" Apr 22 15:08:47.585600 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.585595 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 15:08:47.585798 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.585575 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 15:08:47.588528 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.587671 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-system-cni-dir\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l" Apr 22 15:08:47.588528 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.587722 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" 
(UniqueName: \"kubernetes.io/configmap/60e4ccd4-2561-4992-af9d-bf593b0e0685-iptables-alerter-script\") pod \"iptables-alerter-pppj5\" (UID: \"60e4ccd4-2561-4992-af9d-bf593b0e0685\") " pod="openshift-network-operator/iptables-alerter-pppj5" Apr 22 15:08:47.588528 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.587753 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-var-lib-kubelet\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp" Apr 22 15:08:47.588528 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.587778 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-etc-tuned\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp" Apr 22 15:08:47.588528 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.587901 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-tmp\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp" Apr 22 15:08:47.588528 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.587999 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-b2fpn\"" Apr 22 15:08:47.588528 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.587934 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-etc-kubernetes\") pod 
\"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l" Apr 22 15:08:47.588528 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.588070 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/60e4ccd4-2561-4992-af9d-bf593b0e0685-host-slash\") pod \"iptables-alerter-pppj5\" (UID: \"60e4ccd4-2561-4992-af9d-bf593b0e0685\") " pod="openshift-network-operator/iptables-alerter-pppj5" Apr 22 15:08:47.588528 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.588105 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d54da891-e22c-4f12-9dfb-e2ada408f1ea-hosts-file\") pod \"node-resolver-p6k5v\" (UID: \"d54da891-e22c-4f12-9dfb-e2ada408f1ea\") " pod="openshift-dns/node-resolver-p6k5v" Apr 22 15:08:47.588528 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.588209 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 15:08:47.588528 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.588306 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 15:08:47.589024 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.588594 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b77d43e4-313e-4cdc-9223-86662961daf5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kmhfn\" (UID: \"b77d43e4-313e-4cdc-9223-86662961daf5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kmhfn" Apr 22 15:08:47.589024 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.588638 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" 
(UniqueName: \"kubernetes.io/host-path/b77d43e4-313e-4cdc-9223-86662961daf5-registration-dir\") pod \"aws-ebs-csi-driver-node-kmhfn\" (UID: \"b77d43e4-313e-4cdc-9223-86662961daf5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kmhfn" Apr 22 15:08:47.589024 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.588681 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/69f41c04-4f2b-476d-aac0-a60437dfe0c5-host\") pod \"node-ca-tqshj\" (UID: \"69f41c04-4f2b-476d-aac0-a60437dfe0c5\") " pod="openshift-image-registry/node-ca-tqshj" Apr 22 15:08:47.589024 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.588713 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/69f41c04-4f2b-476d-aac0-a60437dfe0c5-serviceca\") pod \"node-ca-tqshj\" (UID: \"69f41c04-4f2b-476d-aac0-a60437dfe0c5\") " pod="openshift-image-registry/node-ca-tqshj" Apr 22 15:08:47.589024 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.588786 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-etc-systemd\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp" Apr 22 15:08:47.589024 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.588823 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-run\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp" Apr 22 15:08:47.589024 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.588851 2559 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/89d42380-0740-498a-b66f-bfbe3da83afe-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2scbl\" (UID: \"89d42380-0740-498a-b66f-bfbe3da83afe\") " pod="openshift-multus/multus-additional-cni-plugins-2scbl" Apr 22 15:08:47.589024 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.588888 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/89d42380-0740-498a-b66f-bfbe3da83afe-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2scbl\" (UID: \"89d42380-0740-498a-b66f-bfbe3da83afe\") " pod="openshift-multus/multus-additional-cni-plugins-2scbl" Apr 22 15:08:47.589024 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.588961 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-run-ovn\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" Apr 22 15:08:47.589024 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.588995 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-ovn-node-metrics-cert\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" Apr 22 15:08:47.589452 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.589063 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-host-var-lib-cni-multus\") pod \"multus-xsl6l\" (UID: 
\"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l" Apr 22 15:08:47.589452 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.589124 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-host-var-lib-kubelet\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l" Apr 22 15:08:47.589452 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.589201 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/89d42380-0740-498a-b66f-bfbe3da83afe-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2scbl\" (UID: \"89d42380-0740-498a-b66f-bfbe3da83afe\") " pod="openshift-multus/multus-additional-cni-plugins-2scbl" Apr 22 15:08:47.589452 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.589261 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d54da891-e22c-4f12-9dfb-e2ada408f1ea-tmp-dir\") pod \"node-resolver-p6k5v\" (UID: \"d54da891-e22c-4f12-9dfb-e2ada408f1ea\") " pod="openshift-dns/node-resolver-p6k5v" Apr 22 15:08:47.589452 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.589283 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-systemd-units\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" Apr 22 15:08:47.589452 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.589315 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-var-lib-openvswitch\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" Apr 22 15:08:47.589452 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.589358 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-etc-openvswitch\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" Apr 22 15:08:47.589452 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.589382 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" Apr 22 15:08:47.589452 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.589425 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-multus-daemon-config\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l" Apr 22 15:08:47.589452 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.589440 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-host\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp" Apr 22 15:08:47.589899 ip-10-0-141-167 kubenswrapper[2559]: I0422 
15:08:47.589576 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b77d43e4-313e-4cdc-9223-86662961daf5-socket-dir\") pod \"aws-ebs-csi-driver-node-kmhfn\" (UID: \"b77d43e4-313e-4cdc-9223-86662961daf5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kmhfn" Apr 22 15:08:47.589899 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.589634 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-host-run-ovn-kubernetes\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" Apr 22 15:08:47.589899 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.589665 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-sys\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp" Apr 22 15:08:47.589899 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.589693 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b77d43e4-313e-4cdc-9223-86662961daf5-etc-selinux\") pod \"aws-ebs-csi-driver-node-kmhfn\" (UID: \"b77d43e4-313e-4cdc-9223-86662961daf5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kmhfn" Apr 22 15:08:47.589899 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.589733 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6svr5\" (UniqueName: \"kubernetes.io/projected/69f41c04-4f2b-476d-aac0-a60437dfe0c5-kube-api-access-6svr5\") pod \"node-ca-tqshj\" (UID: 
\"69f41c04-4f2b-476d-aac0-a60437dfe0c5\") " pod="openshift-image-registry/node-ca-tqshj" Apr 22 15:08:47.589899 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.589770 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/bf285aa1-3778-4781-989b-35abe7a6ed8f-konnectivity-ca\") pod \"konnectivity-agent-7lh45\" (UID: \"bf285aa1-3778-4781-989b-35abe7a6ed8f\") " pod="kube-system/konnectivity-agent-7lh45" Apr 22 15:08:47.589899 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.589810 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-host-run-netns\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l" Apr 22 15:08:47.589899 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.589839 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-host-var-lib-cni-bin\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l" Apr 22 15:08:47.589899 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.589868 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smzn8\" (UniqueName: \"kubernetes.io/projected/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-kube-api-access-smzn8\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l" Apr 22 15:08:47.590286 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.589895 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-etc-sysctl-conf\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp"
Apr 22 15:08:47.590286 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.589989 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/89d42380-0740-498a-b66f-bfbe3da83afe-cnibin\") pod \"multus-additional-cni-plugins-2scbl\" (UID: \"89d42380-0740-498a-b66f-bfbe3da83afe\") " pod="openshift-multus/multus-additional-cni-plugins-2scbl"
Apr 22 15:08:47.590286 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.590011 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-host-cni-bin\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn"
Apr 22 15:08:47.590286 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.590045 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-etc-kubernetes\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp"
Apr 22 15:08:47.590286 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.590090 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdm9l\" (UniqueName: \"kubernetes.io/projected/d54da891-e22c-4f12-9dfb-e2ada408f1ea-kube-api-access-zdm9l\") pod \"node-resolver-p6k5v\" (UID: \"d54da891-e22c-4f12-9dfb-e2ada408f1ea\") " pod="openshift-dns/node-resolver-p6k5v"
Apr 22 15:08:47.590286 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.590126 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b77d43e4-313e-4cdc-9223-86662961daf5-device-dir\") pod \"aws-ebs-csi-driver-node-kmhfn\" (UID: \"b77d43e4-313e-4cdc-9223-86662961daf5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kmhfn"
Apr 22 15:08:47.590286 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.590149 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-env-overrides\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn"
Apr 22 15:08:47.590286 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.590188 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbqk5\" (UniqueName: \"kubernetes.io/projected/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-kube-api-access-jbqk5\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn"
Apr 22 15:08:47.590286 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.590228 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-multus-cni-dir\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l"
Apr 22 15:08:47.590286 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.590263 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-hostroot\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l"
Apr 22 15:08:47.590717 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.590303 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-host-run-multus-certs\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l"
Apr 22 15:08:47.590717 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.590336 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-etc-sysconfig\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp"
Apr 22 15:08:47.590717 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.590363 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-lib-modules\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp"
Apr 22 15:08:47.590717 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.590391 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-host-run-netns\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn"
Apr 22 15:08:47.590717 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.590421 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-run-systemd\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn"
Apr 22 15:08:47.590717 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.590444 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/bf285aa1-3778-4781-989b-35abe7a6ed8f-agent-certs\") pod \"konnectivity-agent-7lh45\" (UID: \"bf285aa1-3778-4781-989b-35abe7a6ed8f\") " pod="kube-system/konnectivity-agent-7lh45"
Apr 22 15:08:47.590717 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.590540 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdwxt\" (UniqueName: \"kubernetes.io/projected/60e4ccd4-2561-4992-af9d-bf593b0e0685-kube-api-access-hdwxt\") pod \"iptables-alerter-pppj5\" (UID: \"60e4ccd4-2561-4992-af9d-bf593b0e0685\") " pod="openshift-network-operator/iptables-alerter-pppj5"
Apr 22 15:08:47.590717 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.590615 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b77d43e4-313e-4cdc-9223-86662961daf5-sys-fs\") pod \"aws-ebs-csi-driver-node-kmhfn\" (UID: \"b77d43e4-313e-4cdc-9223-86662961daf5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kmhfn"
Apr 22 15:08:47.590717 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.590670 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-node-log\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn"
Apr 22 15:08:47.590717 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.590707 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-host-run-k8s-cni-cncf-io\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l"
Apr 22 15:08:47.591127 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.590734 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-multus-conf-dir\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l"
Apr 22 15:08:47.591127 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.590770 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v97b5\" (UniqueName: \"kubernetes.io/projected/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-kube-api-access-v97b5\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp"
Apr 22 15:08:47.591127 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.590799 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/89d42380-0740-498a-b66f-bfbe3da83afe-os-release\") pod \"multus-additional-cni-plugins-2scbl\" (UID: \"89d42380-0740-498a-b66f-bfbe3da83afe\") " pod="openshift-multus/multus-additional-cni-plugins-2scbl"
Apr 22 15:08:47.591127 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.590826 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-run-openvswitch\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn"
Apr 22 15:08:47.591127 ip-10-0-141-167
kubenswrapper[2559]: I0422 15:08:47.590860 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-ovnkube-config\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn"
Apr 22 15:08:47.591127 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.590889 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-multus-socket-dir-parent\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l"
Apr 22 15:08:47.591127 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.590919 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-log-socket\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn"
Apr 22 15:08:47.591127 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.590948 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-host-cni-netd\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn"
Apr 22 15:08:47.591127 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.590989 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldghw\" (UniqueName: \"kubernetes.io/projected/89d42380-0740-498a-b66f-bfbe3da83afe-kube-api-access-ldghw\") pod \"multus-additional-cni-plugins-2scbl\" (UID: \"89d42380-0740-498a-b66f-bfbe3da83afe\") " pod="openshift-multus/multus-additional-cni-plugins-2scbl"
Apr 22 15:08:47.591127 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.591031 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-cnibin\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l"
Apr 22 15:08:47.591127 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.591098 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-os-release\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l"
Apr 22 15:08:47.591662 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.591645 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-cni-binary-copy\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l"
Apr 22 15:08:47.591701 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.591676 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-etc-modprobe-d\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp"
Apr 22 15:08:47.591745 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.591695 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/89d42380-0740-498a-b66f-bfbe3da83afe-cni-binary-copy\") pod \"multus-additional-cni-plugins-2scbl\" (UID: \"89d42380-0740-498a-b66f-bfbe3da83afe\") " pod="openshift-multus/multus-additional-cni-plugins-2scbl"
Apr 22 15:08:47.591795 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.591763 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-host-kubelet\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn"
Apr 22 15:08:47.591843 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.591785 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-etc-sysctl-d\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp"
Apr 22 15:08:47.591843 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.591822 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/89d42380-0740-498a-b66f-bfbe3da83afe-system-cni-dir\") pod \"multus-additional-cni-plugins-2scbl\" (UID: \"89d42380-0740-498a-b66f-bfbe3da83afe\") " pod="openshift-multus/multus-additional-cni-plugins-2scbl"
Apr 22 15:08:47.591941 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.591844 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp57p\" (UniqueName: \"kubernetes.io/projected/b77d43e4-313e-4cdc-9223-86662961daf5-kube-api-access-gp57p\") pod \"aws-ebs-csi-driver-node-kmhfn\" (UID: \"b77d43e4-313e-4cdc-9223-86662961daf5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kmhfn"
Apr 22 15:08:47.591941 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.591887 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-host-slash\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn"
Apr 22 15:08:47.591941 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.591917 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-ovnkube-script-lib\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn"
Apr 22 15:08:47.597921 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.597899 2559 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 15:08:47.599180 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.599162 2559 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 15:08:47.599263 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.599244 2559 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-167.ec2.internal"
Apr 22 15:08:47.599618 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.599600 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-167.ec2.internal"]
Apr 22 15:08:47.614976 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.614938 2559 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 15:03:46 +0000 UTC" deadline="2027-10-08 15:03:31.987846049 +0000 UTC"
Apr 22 15:08:47.614976 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.614973 2559 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12815h54m44.372875703s"
Apr 22 15:08:47.623900 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.623881 2559 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 15:08:47.624147 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.624127 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-141-167.ec2.internal"]
Apr 22 15:08:47.645703 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.645679 2559 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-lrvpm"
Apr 22 15:08:47.663232 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.663205 2559 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-lrvpm"
Apr 22 15:08:47.666990 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.666970 2559 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 15:08:47.676044 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:47.676019 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc91d8338dec73ddfaae16ceba9bdb9c.slice/crio-3f9ff3086f8fc3a5e9234e8cc5cdfc1fb08a289c6954e26af35b7d7534450618 WatchSource:0}: Error finding container 3f9ff3086f8fc3a5e9234e8cc5cdfc1fb08a289c6954e26af35b7d7534450618: Status 404 returned error can't find the container with id
3f9ff3086f8fc3a5e9234e8cc5cdfc1fb08a289c6954e26af35b7d7534450618
Apr 22 15:08:47.676253 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:47.676235 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9c285a467f3e432eb214510e6f590b8.slice/crio-6f98f1ec4a0994201852db54776f207fadc61f9524a24c1b50bcdddbaaf3da69 WatchSource:0}: Error finding container 6f98f1ec4a0994201852db54776f207fadc61f9524a24c1b50bcdddbaaf3da69: Status 404 returned error can't find the container with id 6f98f1ec4a0994201852db54776f207fadc61f9524a24c1b50bcdddbaaf3da69
Apr 22 15:08:47.680433 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.680416 2559 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 15:08:47.682460 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.682359 2559 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 15:08:47.692885 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.692865 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/89d42380-0740-498a-b66f-bfbe3da83afe-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2scbl\" (UID: \"89d42380-0740-498a-b66f-bfbe3da83afe\") " pod="openshift-multus/multus-additional-cni-plugins-2scbl"
Apr 22 15:08:47.692976 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.692900 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/89d42380-0740-498a-b66f-bfbe3da83afe-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2scbl\" (UID: \"89d42380-0740-498a-b66f-bfbe3da83afe\") " pod="openshift-multus/multus-additional-cni-plugins-2scbl"
Apr 22 15:08:47.692976 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.692925 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-run-ovn\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn"
Apr 22 15:08:47.692976 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.692950 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-ovn-node-metrics-cert\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn"
Apr 22 15:08:47.693123 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.692974 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-host-var-lib-cni-multus\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l"
Apr 22 15:08:47.693123 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.692997 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-host-var-lib-kubelet\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l"
Apr 22 15:08:47.693123 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693022 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/89d42380-0740-498a-b66f-bfbe3da83afe-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2scbl\" (UID: \"89d42380-0740-498a-b66f-bfbe3da83afe\") " pod="openshift-multus/multus-additional-cni-plugins-2scbl"
Apr 22 15:08:47.693123 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693046 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/89d42380-0740-498a-b66f-bfbe3da83afe-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2scbl\" (UID: \"89d42380-0740-498a-b66f-bfbe3da83afe\") " pod="openshift-multus/multus-additional-cni-plugins-2scbl"
Apr 22 15:08:47.693123 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693075 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-host-var-lib-cni-multus\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l"
Apr 22 15:08:47.693123 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693074 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-run-ovn\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn"
Apr 22 15:08:47.693123 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693102 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d54da891-e22c-4f12-9dfb-e2ada408f1ea-tmp-dir\") pod \"node-resolver-p6k5v\" (UID: \"d54da891-e22c-4f12-9dfb-e2ada408f1ea\") " pod="openshift-dns/node-resolver-p6k5v"
Apr 22 15:08:47.693460 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693117 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-host-var-lib-kubelet\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l"
Apr 22 15:08:47.693460 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693138 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-systemd-units\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn"
Apr 22 15:08:47.693460 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693237 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-systemd-units\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn"
Apr 22 15:08:47.693460 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693271 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-var-lib-openvswitch\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn"
Apr 22 15:08:47.693460 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693304 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-etc-openvswitch\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn"
Apr 22 15:08:47.693460 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693335 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn"
Apr 22 15:08:47.693460 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693348 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-var-lib-openvswitch\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn"
Apr 22 15:08:47.693460 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693361 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-multus-daemon-config\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l"
Apr 22 15:08:47.693460 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693385 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-etc-openvswitch\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn"
Apr 22 15:08:47.693460 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693386 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-host\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp"
Apr 22 15:08:47.693460 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693421 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/89d42380-0740-498a-b66f-bfbe3da83afe-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2scbl\" (UID: \"89d42380-0740-498a-b66f-bfbe3da83afe\") "
pod="openshift-multus/multus-additional-cni-plugins-2scbl"
Apr 22 15:08:47.693460 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693429 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b77d43e4-313e-4cdc-9223-86662961daf5-socket-dir\") pod \"aws-ebs-csi-driver-node-kmhfn\" (UID: \"b77d43e4-313e-4cdc-9223-86662961daf5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kmhfn"
Apr 22 15:08:47.693460 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693434 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-host\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp"
Apr 22 15:08:47.693460 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693432 2559 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 22 15:08:47.693460 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693454 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-host-run-ovn-kubernetes\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn"
Apr 22 15:08:47.693460 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693421 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn"
Apr 22 15:08:47.694166 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693478 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-sys\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp"
Apr 22 15:08:47.694166 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693521 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b77d43e4-313e-4cdc-9223-86662961daf5-etc-selinux\") pod \"aws-ebs-csi-driver-node-kmhfn\" (UID: \"b77d43e4-313e-4cdc-9223-86662961daf5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kmhfn"
Apr 22 15:08:47.694166 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693548 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6svr5\" (UniqueName: \"kubernetes.io/projected/69f41c04-4f2b-476d-aac0-a60437dfe0c5-kube-api-access-6svr5\") pod \"node-ca-tqshj\" (UID: \"69f41c04-4f2b-476d-aac0-a60437dfe0c5\") " pod="openshift-image-registry/node-ca-tqshj"
Apr 22 15:08:47.694166 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693556 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-host-run-ovn-kubernetes\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn"
Apr 22 15:08:47.694166 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693574 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/bf285aa1-3778-4781-989b-35abe7a6ed8f-konnectivity-ca\") pod \"konnectivity-agent-7lh45\" (UID: \"bf285aa1-3778-4781-989b-35abe7a6ed8f\") " pod="kube-system/konnectivity-agent-7lh45"
Apr 22 15:08:47.694166 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693580 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b77d43e4-313e-4cdc-9223-86662961daf5-socket-dir\") pod \"aws-ebs-csi-driver-node-kmhfn\" (UID: \"b77d43e4-313e-4cdc-9223-86662961daf5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kmhfn"
Apr 22 15:08:47.694166 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693550 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d54da891-e22c-4f12-9dfb-e2ada408f1ea-tmp-dir\") pod \"node-resolver-p6k5v\" (UID: \"d54da891-e22c-4f12-9dfb-e2ada408f1ea\") " pod="openshift-dns/node-resolver-p6k5v"
Apr 22 15:08:47.694166 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693658 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-host-run-netns\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l"
Apr 22 15:08:47.694166 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693710 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-host-var-lib-cni-bin\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l"
Apr 22 15:08:47.694166 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693738 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-smzn8\" (UniqueName: \"kubernetes.io/projected/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-kube-api-access-smzn8\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l"
Apr 22 15:08:47.694166 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693767 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-host-run-netns\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l"
Apr 22 15:08:47.694166 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693778 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-host-var-lib-cni-bin\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l"
Apr 22 15:08:47.694166 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693782 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b77d43e4-313e-4cdc-9223-86662961daf5-etc-selinux\") pod \"aws-ebs-csi-driver-node-kmhfn\" (UID: \"b77d43e4-313e-4cdc-9223-86662961daf5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kmhfn"
Apr 22 15:08:47.694166 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693781 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-etc-sysctl-conf\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp"
Apr 22 15:08:47.694166 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693825 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/89d42380-0740-498a-b66f-bfbe3da83afe-cnibin\") pod \"multus-additional-cni-plugins-2scbl\" (UID: \"89d42380-0740-498a-b66f-bfbe3da83afe\") " pod="openshift-multus/multus-additional-cni-plugins-2scbl"
Apr 22 15:08:47.694166 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693851 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-host-cni-bin\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn"
Apr 22 15:08:47.694166 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693878 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-etc-kubernetes\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp"
Apr 22 15:08:47.694980 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693869 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for
volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/89d42380-0740-498a-b66f-bfbe3da83afe-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2scbl\" (UID: \"89d42380-0740-498a-b66f-bfbe3da83afe\") " pod="openshift-multus/multus-additional-cni-plugins-2scbl" Apr 22 15:08:47.694980 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693932 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/89d42380-0740-498a-b66f-bfbe3da83afe-cnibin\") pod \"multus-additional-cni-plugins-2scbl\" (UID: \"89d42380-0740-498a-b66f-bfbe3da83afe\") " pod="openshift-multus/multus-additional-cni-plugins-2scbl" Apr 22 15:08:47.694980 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693934 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-multus-daemon-config\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l" Apr 22 15:08:47.694980 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693957 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-etc-kubernetes\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp" Apr 22 15:08:47.694980 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693964 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-host-cni-bin\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" Apr 22 15:08:47.694980 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.693986 2559 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zdm9l\" (UniqueName: \"kubernetes.io/projected/d54da891-e22c-4f12-9dfb-e2ada408f1ea-kube-api-access-zdm9l\") pod \"node-resolver-p6k5v\" (UID: \"d54da891-e22c-4f12-9dfb-e2ada408f1ea\") " pod="openshift-dns/node-resolver-p6k5v" Apr 22 15:08:47.694980 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694022 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-sys\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp" Apr 22 15:08:47.694980 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694029 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw945\" (UniqueName: \"kubernetes.io/projected/a5671169-29ab-4157-acb0-207c8e83501a-kube-api-access-nw945\") pod \"network-metrics-daemon-8hzhw\" (UID: \"a5671169-29ab-4157-acb0-207c8e83501a\") " pod="openshift-multus/network-metrics-daemon-8hzhw" Apr 22 15:08:47.694980 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694052 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b77d43e4-313e-4cdc-9223-86662961daf5-device-dir\") pod \"aws-ebs-csi-driver-node-kmhfn\" (UID: \"b77d43e4-313e-4cdc-9223-86662961daf5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kmhfn" Apr 22 15:08:47.694980 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694073 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-env-overrides\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" Apr 22 15:08:47.694980 ip-10-0-141-167 kubenswrapper[2559]: 
I0422 15:08:47.694097 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jbqk5\" (UniqueName: \"kubernetes.io/projected/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-kube-api-access-jbqk5\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" Apr 22 15:08:47.694980 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694109 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-etc-sysctl-conf\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp" Apr 22 15:08:47.694980 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694136 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-multus-cni-dir\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l" Apr 22 15:08:47.694980 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694157 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b77d43e4-313e-4cdc-9223-86662961daf5-device-dir\") pod \"aws-ebs-csi-driver-node-kmhfn\" (UID: \"b77d43e4-313e-4cdc-9223-86662961daf5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kmhfn" Apr 22 15:08:47.694980 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694158 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-hostroot\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l" Apr 22 15:08:47.694980 ip-10-0-141-167 kubenswrapper[2559]: I0422 
15:08:47.694218 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-host-run-multus-certs\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l" Apr 22 15:08:47.694980 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694252 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-multus-cni-dir\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l" Apr 22 15:08:47.695761 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694265 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-etc-sysconfig\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp" Apr 22 15:08:47.695761 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694303 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-hostroot\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l" Apr 22 15:08:47.695761 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694319 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-etc-sysconfig\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp" Apr 22 15:08:47.695761 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694304 2559 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-host-run-multus-certs\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l" Apr 22 15:08:47.695761 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694342 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-lib-modules\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp" Apr 22 15:08:47.695761 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694370 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-host-run-netns\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" Apr 22 15:08:47.695761 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694403 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-run-systemd\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" Apr 22 15:08:47.695761 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694343 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/bf285aa1-3778-4781-989b-35abe7a6ed8f-konnectivity-ca\") pod \"konnectivity-agent-7lh45\" (UID: \"bf285aa1-3778-4781-989b-35abe7a6ed8f\") " pod="kube-system/konnectivity-agent-7lh45" Apr 22 15:08:47.695761 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694426 2559 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/bf285aa1-3778-4781-989b-35abe7a6ed8f-agent-certs\") pod \"konnectivity-agent-7lh45\" (UID: \"bf285aa1-3778-4781-989b-35abe7a6ed8f\") " pod="kube-system/konnectivity-agent-7lh45" Apr 22 15:08:47.695761 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694460 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-host-run-netns\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" Apr 22 15:08:47.695761 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694470 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hdwxt\" (UniqueName: \"kubernetes.io/projected/60e4ccd4-2561-4992-af9d-bf593b0e0685-kube-api-access-hdwxt\") pod \"iptables-alerter-pppj5\" (UID: \"60e4ccd4-2561-4992-af9d-bf593b0e0685\") " pod="openshift-network-operator/iptables-alerter-pppj5" Apr 22 15:08:47.695761 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694464 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-run-systemd\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" Apr 22 15:08:47.695761 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694514 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b77d43e4-313e-4cdc-9223-86662961daf5-sys-fs\") pod \"aws-ebs-csi-driver-node-kmhfn\" (UID: \"b77d43e4-313e-4cdc-9223-86662961daf5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kmhfn" Apr 22 15:08:47.695761 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694540 2559 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-node-log\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" Apr 22 15:08:47.695761 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694615 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-host-run-k8s-cni-cncf-io\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l" Apr 22 15:08:47.695761 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694649 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-multus-conf-dir\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l" Apr 22 15:08:47.695761 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694675 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v97b5\" (UniqueName: \"kubernetes.io/projected/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-kube-api-access-v97b5\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp" Apr 22 15:08:47.695761 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694723 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/89d42380-0740-498a-b66f-bfbe3da83afe-os-release\") pod \"multus-additional-cni-plugins-2scbl\" (UID: \"89d42380-0740-498a-b66f-bfbe3da83afe\") " pod="openshift-multus/multus-additional-cni-plugins-2scbl" Apr 22 15:08:47.696601 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694752 2559 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-run-openvswitch\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" Apr 22 15:08:47.696601 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694763 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-env-overrides\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" Apr 22 15:08:47.696601 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694778 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-ovnkube-config\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" Apr 22 15:08:47.696601 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694765 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-host-run-k8s-cni-cncf-io\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l" Apr 22 15:08:47.696601 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694832 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-node-log\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" Apr 22 15:08:47.696601 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694834 2559 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b77d43e4-313e-4cdc-9223-86662961daf5-sys-fs\") pod \"aws-ebs-csi-driver-node-kmhfn\" (UID: \"b77d43e4-313e-4cdc-9223-86662961daf5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kmhfn" Apr 22 15:08:47.696601 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694843 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-multus-socket-dir-parent\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l" Apr 22 15:08:47.696601 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694883 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-multus-conf-dir\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l" Apr 22 15:08:47.696601 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694889 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkzp9\" (UniqueName: \"kubernetes.io/projected/7fff4b63-00a5-4c92-8135-75133481b228-kube-api-access-jkzp9\") pod \"network-check-target-w22tj\" (UID: \"7fff4b63-00a5-4c92-8135-75133481b228\") " pod="openshift-network-diagnostics/network-check-target-w22tj" Apr 22 15:08:47.696601 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694940 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-log-socket\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" Apr 22 15:08:47.696601 ip-10-0-141-167 
kubenswrapper[2559]: I0422 15:08:47.694931 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-lib-modules\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp" Apr 22 15:08:47.696601 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694952 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/89d42380-0740-498a-b66f-bfbe3da83afe-os-release\") pod \"multus-additional-cni-plugins-2scbl\" (UID: \"89d42380-0740-498a-b66f-bfbe3da83afe\") " pod="openshift-multus/multus-additional-cni-plugins-2scbl" Apr 22 15:08:47.696601 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.694986 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-host-cni-netd\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" Apr 22 15:08:47.696601 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.695014 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-multus-socket-dir-parent\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l" Apr 22 15:08:47.696601 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.695015 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-run-openvswitch\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" Apr 22 15:08:47.696601 ip-10-0-141-167 
kubenswrapper[2559]: I0422 15:08:47.695013 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldghw\" (UniqueName: \"kubernetes.io/projected/89d42380-0740-498a-b66f-bfbe3da83afe-kube-api-access-ldghw\") pod \"multus-additional-cni-plugins-2scbl\" (UID: \"89d42380-0740-498a-b66f-bfbe3da83afe\") " pod="openshift-multus/multus-additional-cni-plugins-2scbl" Apr 22 15:08:47.696601 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.695071 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-log-socket\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" Apr 22 15:08:47.697261 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.695076 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-host-cni-netd\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" Apr 22 15:08:47.697261 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.695087 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-cnibin\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l" Apr 22 15:08:47.697261 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.695186 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-os-release\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l" Apr 22 15:08:47.697261 ip-10-0-141-167 kubenswrapper[2559]: I0422 
15:08:47.695209 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-cni-binary-copy\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l" Apr 22 15:08:47.697261 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.695264 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-os-release\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l" Apr 22 15:08:47.697261 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.695271 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-cnibin\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l" Apr 22 15:08:47.697261 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.695334 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-etc-modprobe-d\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp" Apr 22 15:08:47.697261 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.695401 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/89d42380-0740-498a-b66f-bfbe3da83afe-cni-binary-copy\") pod \"multus-additional-cni-plugins-2scbl\" (UID: \"89d42380-0740-498a-b66f-bfbe3da83afe\") " pod="openshift-multus/multus-additional-cni-plugins-2scbl" Apr 22 15:08:47.697261 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.695458 2559 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-host-kubelet\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" Apr 22 15:08:47.697261 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.695481 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-etc-sysctl-d\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp" Apr 22 15:08:47.697261 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.695512 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-etc-modprobe-d\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp" Apr 22 15:08:47.697261 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.695532 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/89d42380-0740-498a-b66f-bfbe3da83afe-system-cni-dir\") pod \"multus-additional-cni-plugins-2scbl\" (UID: \"89d42380-0740-498a-b66f-bfbe3da83afe\") " pod="openshift-multus/multus-additional-cni-plugins-2scbl" Apr 22 15:08:47.697261 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.695571 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-host-kubelet\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" Apr 22 15:08:47.697261 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.695619 2559 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5671169-29ab-4157-acb0-207c8e83501a-metrics-certs\") pod \"network-metrics-daemon-8hzhw\" (UID: \"a5671169-29ab-4157-acb0-207c8e83501a\") " pod="openshift-multus/network-metrics-daemon-8hzhw" Apr 22 15:08:47.697261 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.695626 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/89d42380-0740-498a-b66f-bfbe3da83afe-system-cni-dir\") pod \"multus-additional-cni-plugins-2scbl\" (UID: \"89d42380-0740-498a-b66f-bfbe3da83afe\") " pod="openshift-multus/multus-additional-cni-plugins-2scbl" Apr 22 15:08:47.697261 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.695648 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gp57p\" (UniqueName: \"kubernetes.io/projected/b77d43e4-313e-4cdc-9223-86662961daf5-kube-api-access-gp57p\") pod \"aws-ebs-csi-driver-node-kmhfn\" (UID: \"b77d43e4-313e-4cdc-9223-86662961daf5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kmhfn" Apr 22 15:08:47.697261 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.695670 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-host-slash\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" Apr 22 15:08:47.697261 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.695693 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-ovnkube-script-lib\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" Apr 22 15:08:47.697810 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.695707 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-ovnkube-config\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" Apr 22 15:08:47.697810 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.695726 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-system-cni-dir\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l" Apr 22 15:08:47.697810 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.695787 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/60e4ccd4-2561-4992-af9d-bf593b0e0685-iptables-alerter-script\") pod \"iptables-alerter-pppj5\" (UID: \"60e4ccd4-2561-4992-af9d-bf593b0e0685\") " pod="openshift-network-operator/iptables-alerter-pppj5" Apr 22 15:08:47.697810 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.695830 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-var-lib-kubelet\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp" Apr 22 15:08:47.697810 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.695830 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-cni-binary-copy\") pod \"multus-xsl6l\" (UID: 
\"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l" Apr 22 15:08:47.697810 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.695702 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-etc-sysctl-d\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp" Apr 22 15:08:47.697810 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.695874 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-host-slash\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" Apr 22 15:08:47.697810 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.695939 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-var-lib-kubelet\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp" Apr 22 15:08:47.697810 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.696009 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-system-cni-dir\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l" Apr 22 15:08:47.697810 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.696062 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-etc-tuned\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " 
pod="openshift-cluster-node-tuning-operator/tuned-85mbp" Apr 22 15:08:47.697810 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.696091 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-tmp\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp" Apr 22 15:08:47.697810 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.696131 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-etc-kubernetes\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l" Apr 22 15:08:47.697810 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.696156 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/60e4ccd4-2561-4992-af9d-bf593b0e0685-host-slash\") pod \"iptables-alerter-pppj5\" (UID: \"60e4ccd4-2561-4992-af9d-bf593b0e0685\") " pod="openshift-network-operator/iptables-alerter-pppj5" Apr 22 15:08:47.697810 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.696170 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d54da891-e22c-4f12-9dfb-e2ada408f1ea-hosts-file\") pod \"node-resolver-p6k5v\" (UID: \"d54da891-e22c-4f12-9dfb-e2ada408f1ea\") " pod="openshift-dns/node-resolver-p6k5v" Apr 22 15:08:47.697810 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.696186 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b77d43e4-313e-4cdc-9223-86662961daf5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kmhfn\" (UID: \"b77d43e4-313e-4cdc-9223-86662961daf5\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kmhfn" Apr 22 15:08:47.697810 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.696202 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b77d43e4-313e-4cdc-9223-86662961daf5-registration-dir\") pod \"aws-ebs-csi-driver-node-kmhfn\" (UID: \"b77d43e4-313e-4cdc-9223-86662961daf5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kmhfn" Apr 22 15:08:47.697810 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.696217 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/69f41c04-4f2b-476d-aac0-a60437dfe0c5-host\") pod \"node-ca-tqshj\" (UID: \"69f41c04-4f2b-476d-aac0-a60437dfe0c5\") " pod="openshift-image-registry/node-ca-tqshj" Apr 22 15:08:47.697810 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.696231 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/69f41c04-4f2b-476d-aac0-a60437dfe0c5-serviceca\") pod \"node-ca-tqshj\" (UID: \"69f41c04-4f2b-476d-aac0-a60437dfe0c5\") " pod="openshift-image-registry/node-ca-tqshj" Apr 22 15:08:47.698295 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.696245 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-etc-systemd\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp" Apr 22 15:08:47.698295 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.696258 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-run\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " 
pod="openshift-cluster-node-tuning-operator/tuned-85mbp" Apr 22 15:08:47.698295 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.696313 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-run\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp" Apr 22 15:08:47.698295 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.696392 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-ovnkube-script-lib\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" Apr 22 15:08:47.698295 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.696424 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-ovn-node-metrics-cert\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" Apr 22 15:08:47.698295 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.696466 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b77d43e4-313e-4cdc-9223-86662961daf5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kmhfn\" (UID: \"b77d43e4-313e-4cdc-9223-86662961daf5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kmhfn" Apr 22 15:08:47.698295 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.696530 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/60e4ccd4-2561-4992-af9d-bf593b0e0685-host-slash\") pod \"iptables-alerter-pppj5\" (UID: \"60e4ccd4-2561-4992-af9d-bf593b0e0685\") " 
pod="openshift-network-operator/iptables-alerter-pppj5" Apr 22 15:08:47.698295 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.696539 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d54da891-e22c-4f12-9dfb-e2ada408f1ea-hosts-file\") pod \"node-resolver-p6k5v\" (UID: \"d54da891-e22c-4f12-9dfb-e2ada408f1ea\") " pod="openshift-dns/node-resolver-p6k5v" Apr 22 15:08:47.698295 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.696589 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/69f41c04-4f2b-476d-aac0-a60437dfe0c5-host\") pod \"node-ca-tqshj\" (UID: \"69f41c04-4f2b-476d-aac0-a60437dfe0c5\") " pod="openshift-image-registry/node-ca-tqshj" Apr 22 15:08:47.698295 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.696609 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-etc-kubernetes\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l" Apr 22 15:08:47.698295 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.696617 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/89d42380-0740-498a-b66f-bfbe3da83afe-cni-binary-copy\") pod \"multus-additional-cni-plugins-2scbl\" (UID: \"89d42380-0740-498a-b66f-bfbe3da83afe\") " pod="openshift-multus/multus-additional-cni-plugins-2scbl" Apr 22 15:08:47.698295 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.696651 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b77d43e4-313e-4cdc-9223-86662961daf5-registration-dir\") pod \"aws-ebs-csi-driver-node-kmhfn\" (UID: \"b77d43e4-313e-4cdc-9223-86662961daf5\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kmhfn" Apr 22 15:08:47.698295 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.696740 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-etc-systemd\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp" Apr 22 15:08:47.698295 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.696854 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/60e4ccd4-2561-4992-af9d-bf593b0e0685-iptables-alerter-script\") pod \"iptables-alerter-pppj5\" (UID: \"60e4ccd4-2561-4992-af9d-bf593b0e0685\") " pod="openshift-network-operator/iptables-alerter-pppj5" Apr 22 15:08:47.698295 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.697025 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/bf285aa1-3778-4781-989b-35abe7a6ed8f-agent-certs\") pod \"konnectivity-agent-7lh45\" (UID: \"bf285aa1-3778-4781-989b-35abe7a6ed8f\") " pod="kube-system/konnectivity-agent-7lh45" Apr 22 15:08:47.698295 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.697053 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/69f41c04-4f2b-476d-aac0-a60437dfe0c5-serviceca\") pod \"node-ca-tqshj\" (UID: \"69f41c04-4f2b-476d-aac0-a60437dfe0c5\") " pod="openshift-image-registry/node-ca-tqshj" Apr 22 15:08:47.698723 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.698393 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-tmp\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " 
pod="openshift-cluster-node-tuning-operator/tuned-85mbp" Apr 22 15:08:47.698861 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.698844 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-etc-tuned\") pod \"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp" Apr 22 15:08:47.706391 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.706372 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdm9l\" (UniqueName: \"kubernetes.io/projected/d54da891-e22c-4f12-9dfb-e2ada408f1ea-kube-api-access-zdm9l\") pod \"node-resolver-p6k5v\" (UID: \"d54da891-e22c-4f12-9dfb-e2ada408f1ea\") " pod="openshift-dns/node-resolver-p6k5v" Apr 22 15:08:47.711933 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.711907 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldghw\" (UniqueName: \"kubernetes.io/projected/89d42380-0740-498a-b66f-bfbe3da83afe-kube-api-access-ldghw\") pod \"multus-additional-cni-plugins-2scbl\" (UID: \"89d42380-0740-498a-b66f-bfbe3da83afe\") " pod="openshift-multus/multus-additional-cni-plugins-2scbl" Apr 22 15:08:47.712114 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.712096 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp57p\" (UniqueName: \"kubernetes.io/projected/b77d43e4-313e-4cdc-9223-86662961daf5-kube-api-access-gp57p\") pod \"aws-ebs-csi-driver-node-kmhfn\" (UID: \"b77d43e4-313e-4cdc-9223-86662961daf5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kmhfn" Apr 22 15:08:47.712405 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.712372 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v97b5\" (UniqueName: \"kubernetes.io/projected/e378e9f4-6a56-4fe5-81f1-3a9d2996d62d-kube-api-access-v97b5\") pod 
\"tuned-85mbp\" (UID: \"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d\") " pod="openshift-cluster-node-tuning-operator/tuned-85mbp" Apr 22 15:08:47.712582 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.712557 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdwxt\" (UniqueName: \"kubernetes.io/projected/60e4ccd4-2561-4992-af9d-bf593b0e0685-kube-api-access-hdwxt\") pod \"iptables-alerter-pppj5\" (UID: \"60e4ccd4-2561-4992-af9d-bf593b0e0685\") " pod="openshift-network-operator/iptables-alerter-pppj5" Apr 22 15:08:47.712991 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.712813 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-167.ec2.internal" event={"ID":"bc91d8338dec73ddfaae16ceba9bdb9c","Type":"ContainerStarted","Data":"3f9ff3086f8fc3a5e9234e8cc5cdfc1fb08a289c6954e26af35b7d7534450618"} Apr 22 15:08:47.713292 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.713226 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbqk5\" (UniqueName: \"kubernetes.io/projected/34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee-kube-api-access-jbqk5\") pod \"ovnkube-node-4bfdn\" (UID: \"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee\") " pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" Apr 22 15:08:47.715348 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.715324 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-167.ec2.internal" event={"ID":"e9c285a467f3e432eb214510e6f590b8","Type":"ContainerStarted","Data":"6f98f1ec4a0994201852db54776f207fadc61f9524a24c1b50bcdddbaaf3da69"} Apr 22 15:08:47.715683 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.715664 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6svr5\" (UniqueName: \"kubernetes.io/projected/69f41c04-4f2b-476d-aac0-a60437dfe0c5-kube-api-access-6svr5\") pod \"node-ca-tqshj\" (UID: 
\"69f41c04-4f2b-476d-aac0-a60437dfe0c5\") " pod="openshift-image-registry/node-ca-tqshj" Apr 22 15:08:47.715890 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.715873 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-smzn8\" (UniqueName: \"kubernetes.io/projected/1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde-kube-api-access-smzn8\") pod \"multus-xsl6l\" (UID: \"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde\") " pod="openshift-multus/multus-xsl6l" Apr 22 15:08:47.796795 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.796774 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nw945\" (UniqueName: \"kubernetes.io/projected/a5671169-29ab-4157-acb0-207c8e83501a-kube-api-access-nw945\") pod \"network-metrics-daemon-8hzhw\" (UID: \"a5671169-29ab-4157-acb0-207c8e83501a\") " pod="openshift-multus/network-metrics-daemon-8hzhw" Apr 22 15:08:47.796919 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.796814 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkzp9\" (UniqueName: \"kubernetes.io/projected/7fff4b63-00a5-4c92-8135-75133481b228-kube-api-access-jkzp9\") pod \"network-check-target-w22tj\" (UID: \"7fff4b63-00a5-4c92-8135-75133481b228\") " pod="openshift-network-diagnostics/network-check-target-w22tj" Apr 22 15:08:47.797000 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.796978 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5671169-29ab-4157-acb0-207c8e83501a-metrics-certs\") pod \"network-metrics-daemon-8hzhw\" (UID: \"a5671169-29ab-4157-acb0-207c8e83501a\") " pod="openshift-multus/network-metrics-daemon-8hzhw" Apr 22 15:08:47.797105 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:47.797092 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 
15:08:47.797169 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:47.797161 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5671169-29ab-4157-acb0-207c8e83501a-metrics-certs podName:a5671169-29ab-4157-acb0-207c8e83501a nodeName:}" failed. No retries permitted until 2026-04-22 15:08:48.297132339 +0000 UTC m=+2.157013637 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a5671169-29ab-4157-acb0-207c8e83501a-metrics-certs") pod "network-metrics-daemon-8hzhw" (UID: "a5671169-29ab-4157-acb0-207c8e83501a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:47.805805 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:47.805786 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:08:47.805805 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:47.805806 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:08:47.805927 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:47.805815 2559 projected.go:194] Error preparing data for projected volume kube-api-access-jkzp9 for pod openshift-network-diagnostics/network-check-target-w22tj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:47.805927 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:47.805859 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7fff4b63-00a5-4c92-8135-75133481b228-kube-api-access-jkzp9 podName:7fff4b63-00a5-4c92-8135-75133481b228 nodeName:}" failed. No retries permitted until 2026-04-22 15:08:48.305846233 +0000 UTC m=+2.165727527 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-jkzp9" (UniqueName: "kubernetes.io/projected/7fff4b63-00a5-4c92-8135-75133481b228-kube-api-access-jkzp9") pod "network-check-target-w22tj" (UID: "7fff4b63-00a5-4c92-8135-75133481b228") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:47.810327 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.810311 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw945\" (UniqueName: \"kubernetes.io/projected/a5671169-29ab-4157-acb0-207c8e83501a-kube-api-access-nw945\") pod \"network-metrics-daemon-8hzhw\" (UID: \"a5671169-29ab-4157-acb0-207c8e83501a\") " pod="openshift-multus/network-metrics-daemon-8hzhw" Apr 22 15:08:47.903648 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.903582 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xsl6l" Apr 22 15:08:47.909462 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:47.909440 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ba654ca_0d27_4e4d_b6e5_6f5d49a54cde.slice/crio-533ea3e2e62864f9a51babe31c754662fa6235f62d73fd038f159a72a82f1d98 WatchSource:0}: Error finding container 533ea3e2e62864f9a51babe31c754662fa6235f62d73fd038f159a72a82f1d98: Status 404 returned error can't find the container with id 533ea3e2e62864f9a51babe31c754662fa6235f62d73fd038f159a72a82f1d98 Apr 22 15:08:47.915176 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.915161 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-pppj5" Apr 22 15:08:47.916413 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.916390 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kmhfn" Apr 22 15:08:47.921594 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:47.921571 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60e4ccd4_2561_4992_af9d_bf593b0e0685.slice/crio-2e969ad1e6d897ae21463239c31ff3144b3bb4b71d10748e931876ca6f16d954 WatchSource:0}: Error finding container 2e969ad1e6d897ae21463239c31ff3144b3bb4b71d10748e931876ca6f16d954: Status 404 returned error can't find the container with id 2e969ad1e6d897ae21463239c31ff3144b3bb4b71d10748e931876ca6f16d954 Apr 22 15:08:47.924082 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:47.924066 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb77d43e4_313e_4cdc_9223_86662961daf5.slice/crio-f9c18a7a1c7e8b53e1b242f054f8a960bcd52c263a90b716137093909a7d1cad WatchSource:0}: Error finding container f9c18a7a1c7e8b53e1b242f054f8a960bcd52c263a90b716137093909a7d1cad: Status 404 returned error can't find the container with id f9c18a7a1c7e8b53e1b242f054f8a960bcd52c263a90b716137093909a7d1cad Apr 22 15:08:47.945408 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.945387 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-85mbp" Apr 22 15:08:47.950156 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:47.950137 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode378e9f4_6a56_4fe5_81f1_3a9d2996d62d.slice/crio-14150e6a132f6ff14dafc2e7e007711e0cabe371b3e583402b1ce72fc95d7492 WatchSource:0}: Error finding container 14150e6a132f6ff14dafc2e7e007711e0cabe371b3e583402b1ce72fc95d7492: Status 404 returned error can't find the container with id 14150e6a132f6ff14dafc2e7e007711e0cabe371b3e583402b1ce72fc95d7492 Apr 22 15:08:47.958684 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.958669 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-tqshj" Apr 22 15:08:47.963499 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:47.963463 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69f41c04_4f2b_476d_aac0_a60437dfe0c5.slice/crio-a99e6015bc277cc0a06662c934bff5387b2a00a57c7581da8b0df4f90cd55ffd WatchSource:0}: Error finding container a99e6015bc277cc0a06662c934bff5387b2a00a57c7581da8b0df4f90cd55ffd: Status 404 returned error can't find the container with id a99e6015bc277cc0a06662c934bff5387b2a00a57c7581da8b0df4f90cd55ffd Apr 22 15:08:47.976566 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.976549 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2scbl" Apr 22 15:08:47.981839 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.981822 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" Apr 22 15:08:47.981913 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:47.981850 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89d42380_0740_498a_b66f_bfbe3da83afe.slice/crio-fd298fe362e6cb0ef41ddbd31531178ab1b9b6de14738be574470c68d68c24ae WatchSource:0}: Error finding container fd298fe362e6cb0ef41ddbd31531178ab1b9b6de14738be574470c68d68c24ae: Status 404 returned error can't find the container with id fd298fe362e6cb0ef41ddbd31531178ab1b9b6de14738be574470c68d68c24ae Apr 22 15:08:47.987192 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:47.987175 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34b4944f_8ab6_4c2c_9fe3_0a1b6099a7ee.slice/crio-13b13e9a91010de368ce87288d246adff5edd4a23e2bc3d6fd2cba79b1b48a1f WatchSource:0}: Error finding container 13b13e9a91010de368ce87288d246adff5edd4a23e2bc3d6fd2cba79b1b48a1f: Status 404 returned error can't find the container with id 13b13e9a91010de368ce87288d246adff5edd4a23e2bc3d6fd2cba79b1b48a1f Apr 22 15:08:47.987930 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.987914 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-7lh45" Apr 22 15:08:47.992239 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:47.992224 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-p6k5v" Apr 22 15:08:47.993247 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:47.993225 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf285aa1_3778_4781_989b_35abe7a6ed8f.slice/crio-3563310326b0b2a34b04b35d9a377211b3ed7f6a80bff45f3a6ee8b4952427e5 WatchSource:0}: Error finding container 3563310326b0b2a34b04b35d9a377211b3ed7f6a80bff45f3a6ee8b4952427e5: Status 404 returned error can't find the container with id 3563310326b0b2a34b04b35d9a377211b3ed7f6a80bff45f3a6ee8b4952427e5 Apr 22 15:08:47.999456 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:08:47.999436 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd54da891_e22c_4f12_9dfb_e2ada408f1ea.slice/crio-a1d97e0753d2a9be5b58e1352d1efc57a864e5e9ed0a45021f2b063aaee60047 WatchSource:0}: Error finding container a1d97e0753d2a9be5b58e1352d1efc57a864e5e9ed0a45021f2b063aaee60047: Status 404 returned error can't find the container with id a1d97e0753d2a9be5b58e1352d1efc57a864e5e9ed0a45021f2b063aaee60047 Apr 22 15:08:48.300467 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:48.300337 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5671169-29ab-4157-acb0-207c8e83501a-metrics-certs\") pod \"network-metrics-daemon-8hzhw\" (UID: \"a5671169-29ab-4157-acb0-207c8e83501a\") " pod="openshift-multus/network-metrics-daemon-8hzhw" Apr 22 15:08:48.300675 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:48.300641 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:48.300754 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:48.300716 2559 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/a5671169-29ab-4157-acb0-207c8e83501a-metrics-certs podName:a5671169-29ab-4157-acb0-207c8e83501a nodeName:}" failed. No retries permitted until 2026-04-22 15:08:49.300692099 +0000 UTC m=+3.160573396 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a5671169-29ab-4157-acb0-207c8e83501a-metrics-certs") pod "network-metrics-daemon-8hzhw" (UID: "a5671169-29ab-4157-acb0-207c8e83501a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:48.348501 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:48.348393 2559 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:08:48.401154 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:48.401116 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkzp9\" (UniqueName: \"kubernetes.io/projected/7fff4b63-00a5-4c92-8135-75133481b228-kube-api-access-jkzp9\") pod \"network-check-target-w22tj\" (UID: \"7fff4b63-00a5-4c92-8135-75133481b228\") " pod="openshift-network-diagnostics/network-check-target-w22tj" Apr 22 15:08:48.401360 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:48.401317 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:08:48.401360 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:48.401341 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:08:48.401360 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:48.401353 2559 projected.go:194] Error preparing data for projected volume kube-api-access-jkzp9 for pod openshift-network-diagnostics/network-check-target-w22tj: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:48.401554 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:48.401411 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7fff4b63-00a5-4c92-8135-75133481b228-kube-api-access-jkzp9 podName:7fff4b63-00a5-4c92-8135-75133481b228 nodeName:}" failed. No retries permitted until 2026-04-22 15:08:49.401392618 +0000 UTC m=+3.261273915 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-jkzp9" (UniqueName: "kubernetes.io/projected/7fff4b63-00a5-4c92-8135-75133481b228-kube-api-access-jkzp9") pod "network-check-target-w22tj" (UID: "7fff4b63-00a5-4c92-8135-75133481b228") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:48.664272 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:48.664181 2559 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 15:03:47 +0000 UTC" deadline="2027-12-09 00:51:12.978312721 +0000 UTC" Apr 22 15:08:48.664272 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:48.664227 2559 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14289h42m24.314090653s" Apr 22 15:08:48.736315 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:48.736275 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2scbl" event={"ID":"89d42380-0740-498a-b66f-bfbe3da83afe","Type":"ContainerStarted","Data":"fd298fe362e6cb0ef41ddbd31531178ab1b9b6de14738be574470c68d68c24ae"} Apr 22 15:08:48.749658 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:48.749627 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-node-tuning-operator/tuned-85mbp" event={"ID":"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d","Type":"ContainerStarted","Data":"14150e6a132f6ff14dafc2e7e007711e0cabe371b3e583402b1ce72fc95d7492"} Apr 22 15:08:48.760496 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:48.760450 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kmhfn" event={"ID":"b77d43e4-313e-4cdc-9223-86662961daf5","Type":"ContainerStarted","Data":"f9c18a7a1c7e8b53e1b242f054f8a960bcd52c263a90b716137093909a7d1cad"} Apr 22 15:08:48.771567 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:48.769389 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-pppj5" event={"ID":"60e4ccd4-2561-4992-af9d-bf593b0e0685","Type":"ContainerStarted","Data":"2e969ad1e6d897ae21463239c31ff3144b3bb4b71d10748e931876ca6f16d954"} Apr 22 15:08:48.781859 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:48.781810 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" event={"ID":"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee","Type":"ContainerStarted","Data":"13b13e9a91010de368ce87288d246adff5edd4a23e2bc3d6fd2cba79b1b48a1f"} Apr 22 15:08:48.793893 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:48.793864 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tqshj" event={"ID":"69f41c04-4f2b-476d-aac0-a60437dfe0c5","Type":"ContainerStarted","Data":"a99e6015bc277cc0a06662c934bff5387b2a00a57c7581da8b0df4f90cd55ffd"} Apr 22 15:08:48.796039 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:48.796015 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xsl6l" event={"ID":"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde","Type":"ContainerStarted","Data":"533ea3e2e62864f9a51babe31c754662fa6235f62d73fd038f159a72a82f1d98"} Apr 22 15:08:48.801108 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:48.801053 2559 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-p6k5v" event={"ID":"d54da891-e22c-4f12-9dfb-e2ada408f1ea","Type":"ContainerStarted","Data":"a1d97e0753d2a9be5b58e1352d1efc57a864e5e9ed0a45021f2b063aaee60047"} Apr 22 15:08:48.814532 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:48.814481 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7lh45" event={"ID":"bf285aa1-3778-4781-989b-35abe7a6ed8f","Type":"ContainerStarted","Data":"3563310326b0b2a34b04b35d9a377211b3ed7f6a80bff45f3a6ee8b4952427e5"} Apr 22 15:08:49.310652 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:49.309848 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5671169-29ab-4157-acb0-207c8e83501a-metrics-certs\") pod \"network-metrics-daemon-8hzhw\" (UID: \"a5671169-29ab-4157-acb0-207c8e83501a\") " pod="openshift-multus/network-metrics-daemon-8hzhw" Apr 22 15:08:49.310652 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:49.310069 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:49.310652 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:49.310154 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5671169-29ab-4157-acb0-207c8e83501a-metrics-certs podName:a5671169-29ab-4157-acb0-207c8e83501a nodeName:}" failed. No retries permitted until 2026-04-22 15:08:51.310132389 +0000 UTC m=+5.170013706 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a5671169-29ab-4157-acb0-207c8e83501a-metrics-certs") pod "network-metrics-daemon-8hzhw" (UID: "a5671169-29ab-4157-acb0-207c8e83501a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:49.412123 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:49.411538 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkzp9\" (UniqueName: \"kubernetes.io/projected/7fff4b63-00a5-4c92-8135-75133481b228-kube-api-access-jkzp9\") pod \"network-check-target-w22tj\" (UID: \"7fff4b63-00a5-4c92-8135-75133481b228\") " pod="openshift-network-diagnostics/network-check-target-w22tj" Apr 22 15:08:49.412123 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:49.411688 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:08:49.412123 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:49.411709 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:08:49.412123 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:49.411721 2559 projected.go:194] Error preparing data for projected volume kube-api-access-jkzp9 for pod openshift-network-diagnostics/network-check-target-w22tj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:49.412123 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:49.411784 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7fff4b63-00a5-4c92-8135-75133481b228-kube-api-access-jkzp9 podName:7fff4b63-00a5-4c92-8135-75133481b228 nodeName:}" failed. 
No retries permitted until 2026-04-22 15:08:51.411757525 +0000 UTC m=+5.271638824 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-jkzp9" (UniqueName: "kubernetes.io/projected/7fff4b63-00a5-4c92-8135-75133481b228-kube-api-access-jkzp9") pod "network-check-target-w22tj" (UID: "7fff4b63-00a5-4c92-8135-75133481b228") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:49.664785 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:49.664672 2559 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 15:03:47 +0000 UTC" deadline="2028-01-10 11:03:57.790527968 +0000 UTC" Apr 22 15:08:49.664785 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:49.664709 2559 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15067h55m8.125822151s" Apr 22 15:08:49.708985 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:49.708961 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w22tj" Apr 22 15:08:49.709148 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:49.709066 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w22tj" podUID="7fff4b63-00a5-4c92-8135-75133481b228" Apr 22 15:08:49.709211 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:49.709170 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8hzhw" Apr 22 15:08:49.709267 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:49.709255 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8hzhw" podUID="a5671169-29ab-4157-acb0-207c8e83501a" Apr 22 15:08:51.325919 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:51.325842 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5671169-29ab-4157-acb0-207c8e83501a-metrics-certs\") pod \"network-metrics-daemon-8hzhw\" (UID: \"a5671169-29ab-4157-acb0-207c8e83501a\") " pod="openshift-multus/network-metrics-daemon-8hzhw" Apr 22 15:08:51.326355 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:51.325972 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:51.326355 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:51.326048 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5671169-29ab-4157-acb0-207c8e83501a-metrics-certs podName:a5671169-29ab-4157-acb0-207c8e83501a nodeName:}" failed. No retries permitted until 2026-04-22 15:08:55.326028301 +0000 UTC m=+9.185909605 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a5671169-29ab-4157-acb0-207c8e83501a-metrics-certs") pod "network-metrics-daemon-8hzhw" (UID: "a5671169-29ab-4157-acb0-207c8e83501a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:51.426961 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:51.426922 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkzp9\" (UniqueName: \"kubernetes.io/projected/7fff4b63-00a5-4c92-8135-75133481b228-kube-api-access-jkzp9\") pod \"network-check-target-w22tj\" (UID: \"7fff4b63-00a5-4c92-8135-75133481b228\") " pod="openshift-network-diagnostics/network-check-target-w22tj" Apr 22 15:08:51.427151 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:51.427109 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:08:51.427151 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:51.427135 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:08:51.427151 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:51.427149 2559 projected.go:194] Error preparing data for projected volume kube-api-access-jkzp9 for pod openshift-network-diagnostics/network-check-target-w22tj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:51.427351 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:51.427205 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7fff4b63-00a5-4c92-8135-75133481b228-kube-api-access-jkzp9 podName:7fff4b63-00a5-4c92-8135-75133481b228 nodeName:}" failed. 
No retries permitted until 2026-04-22 15:08:55.427187089 +0000 UTC m=+9.287068398 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-jkzp9" (UniqueName: "kubernetes.io/projected/7fff4b63-00a5-4c92-8135-75133481b228-kube-api-access-jkzp9") pod "network-check-target-w22tj" (UID: "7fff4b63-00a5-4c92-8135-75133481b228") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:51.708532 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:51.708415 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w22tj" Apr 22 15:08:51.708532 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:51.708506 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8hzhw" Apr 22 15:08:51.708738 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:51.708564 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w22tj" podUID="7fff4b63-00a5-4c92-8135-75133481b228" Apr 22 15:08:51.708738 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:51.708675 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8hzhw" podUID="a5671169-29ab-4157-acb0-207c8e83501a" Apr 22 15:08:53.709006 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:53.708975 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w22tj" Apr 22 15:08:53.709392 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:53.708975 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8hzhw" Apr 22 15:08:53.709392 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:53.709095 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w22tj" podUID="7fff4b63-00a5-4c92-8135-75133481b228" Apr 22 15:08:53.709392 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:53.709148 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8hzhw" podUID="a5671169-29ab-4157-acb0-207c8e83501a" Apr 22 15:08:55.355179 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:55.355140 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5671169-29ab-4157-acb0-207c8e83501a-metrics-certs\") pod \"network-metrics-daemon-8hzhw\" (UID: \"a5671169-29ab-4157-acb0-207c8e83501a\") " pod="openshift-multus/network-metrics-daemon-8hzhw" Apr 22 15:08:55.355649 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:55.355296 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:55.355649 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:55.355363 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5671169-29ab-4157-acb0-207c8e83501a-metrics-certs podName:a5671169-29ab-4157-acb0-207c8e83501a nodeName:}" failed. No retries permitted until 2026-04-22 15:09:03.355343354 +0000 UTC m=+17.215224654 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a5671169-29ab-4157-acb0-207c8e83501a-metrics-certs") pod "network-metrics-daemon-8hzhw" (UID: "a5671169-29ab-4157-acb0-207c8e83501a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:55.455845 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:55.455808 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkzp9\" (UniqueName: \"kubernetes.io/projected/7fff4b63-00a5-4c92-8135-75133481b228-kube-api-access-jkzp9\") pod \"network-check-target-w22tj\" (UID: \"7fff4b63-00a5-4c92-8135-75133481b228\") " pod="openshift-network-diagnostics/network-check-target-w22tj" Apr 22 15:08:55.456012 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:55.455960 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:08:55.456012 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:55.455978 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:08:55.456012 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:55.455991 2559 projected.go:194] Error preparing data for projected volume kube-api-access-jkzp9 for pod openshift-network-diagnostics/network-check-target-w22tj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:55.456178 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:55.456046 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7fff4b63-00a5-4c92-8135-75133481b228-kube-api-access-jkzp9 podName:7fff4b63-00a5-4c92-8135-75133481b228 nodeName:}" failed. 
No retries permitted until 2026-04-22 15:09:03.456027911 +0000 UTC m=+17.315909212 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-jkzp9" (UniqueName: "kubernetes.io/projected/7fff4b63-00a5-4c92-8135-75133481b228-kube-api-access-jkzp9") pod "network-check-target-w22tj" (UID: "7fff4b63-00a5-4c92-8135-75133481b228") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:55.708787 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:55.708750 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8hzhw" Apr 22 15:08:55.708953 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:55.708896 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8hzhw" podUID="a5671169-29ab-4157-acb0-207c8e83501a" Apr 22 15:08:55.709347 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:55.709327 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w22tj" Apr 22 15:08:55.709448 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:55.709428 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-w22tj" podUID="7fff4b63-00a5-4c92-8135-75133481b228" Apr 22 15:08:57.708770 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:57.708732 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w22tj" Apr 22 15:08:57.709196 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:57.708870 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w22tj" podUID="7fff4b63-00a5-4c92-8135-75133481b228" Apr 22 15:08:57.709196 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:57.708924 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8hzhw" Apr 22 15:08:57.709196 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:57.709049 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8hzhw" podUID="a5671169-29ab-4157-acb0-207c8e83501a" Apr 22 15:08:59.708463 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:59.708425 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w22tj" Apr 22 15:08:59.708948 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:08:59.708439 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8hzhw" Apr 22 15:08:59.708948 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:59.708563 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w22tj" podUID="7fff4b63-00a5-4c92-8135-75133481b228" Apr 22 15:08:59.708948 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:08:59.708670 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8hzhw" podUID="a5671169-29ab-4157-acb0-207c8e83501a" Apr 22 15:09:01.708633 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:01.708600 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8hzhw" Apr 22 15:09:01.709054 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:01.708600 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w22tj" Apr 22 15:09:01.709054 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:01.708730 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8hzhw" podUID="a5671169-29ab-4157-acb0-207c8e83501a" Apr 22 15:09:01.709054 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:01.708764 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w22tj" podUID="7fff4b63-00a5-4c92-8135-75133481b228" Apr 22 15:09:03.417136 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:03.417095 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5671169-29ab-4157-acb0-207c8e83501a-metrics-certs\") pod \"network-metrics-daemon-8hzhw\" (UID: \"a5671169-29ab-4157-acb0-207c8e83501a\") " pod="openshift-multus/network-metrics-daemon-8hzhw" Apr 22 15:09:03.417681 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:03.417238 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:09:03.417681 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:03.417325 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5671169-29ab-4157-acb0-207c8e83501a-metrics-certs podName:a5671169-29ab-4157-acb0-207c8e83501a nodeName:}" failed. No retries permitted until 2026-04-22 15:09:19.417305779 +0000 UTC m=+33.277187075 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a5671169-29ab-4157-acb0-207c8e83501a-metrics-certs") pod "network-metrics-daemon-8hzhw" (UID: "a5671169-29ab-4157-acb0-207c8e83501a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:09:03.517706 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:03.517670 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkzp9\" (UniqueName: \"kubernetes.io/projected/7fff4b63-00a5-4c92-8135-75133481b228-kube-api-access-jkzp9\") pod \"network-check-target-w22tj\" (UID: \"7fff4b63-00a5-4c92-8135-75133481b228\") " pod="openshift-network-diagnostics/network-check-target-w22tj" Apr 22 15:09:03.517897 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:03.517868 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:09:03.517897 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:03.517893 2559 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:09:03.517997 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:03.517906 2559 projected.go:194] Error preparing data for projected volume kube-api-access-jkzp9 for pod openshift-network-diagnostics/network-check-target-w22tj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:09:03.517997 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:03.517969 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7fff4b63-00a5-4c92-8135-75133481b228-kube-api-access-jkzp9 podName:7fff4b63-00a5-4c92-8135-75133481b228 nodeName:}" failed. 
No retries permitted until 2026-04-22 15:09:19.517953412 +0000 UTC m=+33.377834723 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-jkzp9" (UniqueName: "kubernetes.io/projected/7fff4b63-00a5-4c92-8135-75133481b228-kube-api-access-jkzp9") pod "network-check-target-w22tj" (UID: "7fff4b63-00a5-4c92-8135-75133481b228") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:09:03.708977 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:03.708949 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w22tj" Apr 22 15:09:03.709129 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:03.708951 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8hzhw" Apr 22 15:09:03.709129 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:03.709050 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w22tj" podUID="7fff4b63-00a5-4c92-8135-75133481b228" Apr 22 15:09:03.709231 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:03.709130 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8hzhw" podUID="a5671169-29ab-4157-acb0-207c8e83501a" Apr 22 15:09:05.708000 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:05.707975 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8hzhw" Apr 22 15:09:05.708275 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:05.707976 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w22tj" Apr 22 15:09:05.708275 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:05.708099 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8hzhw" podUID="a5671169-29ab-4157-acb0-207c8e83501a" Apr 22 15:09:05.708275 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:05.708130 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-w22tj" podUID="7fff4b63-00a5-4c92-8135-75133481b228" Apr 22 15:09:05.871070 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:05.870860 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-167.ec2.internal" event={"ID":"bc91d8338dec73ddfaae16ceba9bdb9c","Type":"ContainerStarted","Data":"330f4fbe230f1c756bd7d3d02b1bc9f8b99dc476cbd8adb7686b7355d76d2e20"} Apr 22 15:09:05.874501 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:05.874414 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" event={"ID":"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee","Type":"ContainerStarted","Data":"916015fb10e5c883f61c022eaf3230d93b188e6055c8a1e8beeca1f312fd80e2"} Apr 22 15:09:05.874501 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:05.874444 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" event={"ID":"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee","Type":"ContainerStarted","Data":"791837e699c9c9f4fab1a4025a266ced1cf4549c547bb6b728777662d2682d0a"} Apr 22 15:09:05.879995 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:05.879939 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xsl6l" event={"ID":"1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde","Type":"ContainerStarted","Data":"c725d9b6deba7740e3e461a75339dd2613c283202c2e3e2ec59bcdef3424451d"} Apr 22 15:09:05.881561 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:05.881537 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-85mbp" event={"ID":"e378e9f4-6a56-4fe5-81f1-3a9d2996d62d","Type":"ContainerStarted","Data":"14493591631ff123014e6c4f80b82e7bd8a21fa6da216654e73ad9aa8b220aa5"} Apr 22 15:09:05.887999 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:05.887931 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-141-167.ec2.internal" podStartSLOduration=18.887918646 podStartE2EDuration="18.887918646s" podCreationTimestamp="2026-04-22 15:08:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:09:05.887627082 +0000 UTC m=+19.747508397" watchObservedRunningTime="2026-04-22 15:09:05.887918646 +0000 UTC m=+19.747799961" Apr 22 15:09:05.906908 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:05.906864 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xsl6l" podStartSLOduration=2.119012362 podStartE2EDuration="19.906850521s" podCreationTimestamp="2026-04-22 15:08:46 +0000 UTC" firstStartedPulling="2026-04-22 15:08:47.910821 +0000 UTC m=+1.770702297" lastFinishedPulling="2026-04-22 15:09:05.698659164 +0000 UTC m=+19.558540456" observedRunningTime="2026-04-22 15:09:05.906770126 +0000 UTC m=+19.766651453" watchObservedRunningTime="2026-04-22 15:09:05.906850521 +0000 UTC m=+19.766731837" Apr 22 15:09:06.885029 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:06.884659 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-p6k5v" event={"ID":"d54da891-e22c-4f12-9dfb-e2ada408f1ea","Type":"ContainerStarted","Data":"be499015c097d452b27c7635a29ddab41f42f14bd15ee2b35045d742004a8aa6"} Apr 22 15:09:06.885975 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:06.885951 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7lh45" event={"ID":"bf285aa1-3778-4781-989b-35abe7a6ed8f","Type":"ContainerStarted","Data":"d0651b2a65a3079e539fa032cb8686435e7da5aa2fa6f827e521da9436a28c73"} Apr 22 15:09:06.887106 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:06.887084 2559 generic.go:358] "Generic (PLEG): container finished" podID="89d42380-0740-498a-b66f-bfbe3da83afe" 
containerID="bfec4e3dcc73234cc81b1621d18ce7a67254f4c4b007144ed4a4ea25a053f21d" exitCode=0
Apr 22 15:09:06.887196 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:06.887151 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2scbl" event={"ID":"89d42380-0740-498a-b66f-bfbe3da83afe","Type":"ContainerDied","Data":"bfec4e3dcc73234cc81b1621d18ce7a67254f4c4b007144ed4a4ea25a053f21d"}
Apr 22 15:09:06.888438 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:06.888364 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kmhfn" event={"ID":"b77d43e4-313e-4cdc-9223-86662961daf5","Type":"ContainerStarted","Data":"81dce09d821080496aa192253bc3a2990af360a09e5428dafc68912ca2917c52"}
Apr 22 15:09:06.889595 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:06.889572 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-pppj5" event={"ID":"60e4ccd4-2561-4992-af9d-bf593b0e0685","Type":"ContainerStarted","Data":"4bde69b7e0f6b98b4ec60a7250bb10f20d01f65a33a042e3df14119747f859c4"}
Apr 22 15:09:06.891623 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:06.891609 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4bfdn_34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee/ovn-acl-logging/0.log"
Apr 22 15:09:06.891901 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:06.891880 2559 generic.go:358] "Generic (PLEG): container finished" podID="34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee" containerID="916015fb10e5c883f61c022eaf3230d93b188e6055c8a1e8beeca1f312fd80e2" exitCode=1
Apr 22 15:09:06.891978 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:06.891943 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" event={"ID":"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee","Type":"ContainerDied","Data":"916015fb10e5c883f61c022eaf3230d93b188e6055c8a1e8beeca1f312fd80e2"}
Apr 22 15:09:06.892033 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:06.891976 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" event={"ID":"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee","Type":"ContainerStarted","Data":"e33ec159cfa66cd9539e30cde73d93211e5a2da73a2f5d50d0070fcbcd9844e0"}
Apr 22 15:09:06.892033 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:06.891990 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" event={"ID":"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee","Type":"ContainerStarted","Data":"89a96093cf9c559f18e65db460ba6e00e378c58fff843d022e7b20f889eab631"}
Apr 22 15:09:06.892033 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:06.892005 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" event={"ID":"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee","Type":"ContainerStarted","Data":"1c983bed32e7f54bb345bbdc187ac51aa492efadaedab128f8400c8d974f8605"}
Apr 22 15:09:06.892033 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:06.892018 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" event={"ID":"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee","Type":"ContainerStarted","Data":"64c320d91636b58760120df28d077a4f31549bfee0edafc2b7b81d9384b72681"}
Apr 22 15:09:06.893093 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:06.893074 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tqshj" event={"ID":"69f41c04-4f2b-476d-aac0-a60437dfe0c5","Type":"ContainerStarted","Data":"43636bb01d7a91ada6478cbdaa23441450486dd8651b3222a0536a95196047c2"}
Apr 22 15:09:06.894224 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:06.894206 2559 generic.go:358] "Generic (PLEG): container finished" podID="e9c285a467f3e432eb214510e6f590b8" containerID="df4e94486ac105d2d17e1ca9f6c7fbeda6c67f135891f259edfcf49e60924145" exitCode=0
Apr 22 15:09:06.894296 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:06.894229 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-167.ec2.internal" event={"ID":"e9c285a467f3e432eb214510e6f590b8","Type":"ContainerDied","Data":"df4e94486ac105d2d17e1ca9f6c7fbeda6c67f135891f259edfcf49e60924145"}
Apr 22 15:09:06.908373 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:06.908341 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-85mbp" podStartSLOduration=3.316586058 podStartE2EDuration="20.908330896s" podCreationTimestamp="2026-04-22 15:08:46 +0000 UTC" firstStartedPulling="2026-04-22 15:08:47.951606877 +0000 UTC m=+1.811488173" lastFinishedPulling="2026-04-22 15:09:05.543351705 +0000 UTC m=+19.403233011" observedRunningTime="2026-04-22 15:09:05.933869385 +0000 UTC m=+19.793750694" watchObservedRunningTime="2026-04-22 15:09:06.908330896 +0000 UTC m=+20.768212218"
Apr 22 15:09:06.908698 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:06.908676 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-p6k5v" podStartSLOduration=2.367283011 podStartE2EDuration="19.908671618s" podCreationTimestamp="2026-04-22 15:08:47 +0000 UTC" firstStartedPulling="2026-04-22 15:08:48.001016985 +0000 UTC m=+1.860898289" lastFinishedPulling="2026-04-22 15:09:05.542405584 +0000 UTC m=+19.402286896" observedRunningTime="2026-04-22 15:09:06.907931563 +0000 UTC m=+20.767812877" watchObservedRunningTime="2026-04-22 15:09:06.908671618 +0000 UTC m=+20.768552933"
Apr 22 15:09:06.971717 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:06.971674 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-pppj5" podStartSLOduration=3.3519730770000002 podStartE2EDuration="20.971661529s" podCreationTimestamp="2026-04-22 15:08:46 +0000 UTC" firstStartedPulling="2026-04-22 15:08:47.923407136 +0000 UTC m=+1.783288428" lastFinishedPulling="2026-04-22 15:09:05.543095572 +0000 UTC m=+19.402976880" observedRunningTime="2026-04-22 15:09:06.971600659 +0000 UTC m=+20.831481974" watchObservedRunningTime="2026-04-22 15:09:06.971661529 +0000 UTC m=+20.831542845"
Apr 22 15:09:07.006835 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:07.006785 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-7lh45" podStartSLOduration=3.4594315 podStartE2EDuration="21.006768155s" podCreationTimestamp="2026-04-22 15:08:46 +0000 UTC" firstStartedPulling="2026-04-22 15:08:47.995085993 +0000 UTC m=+1.854967285" lastFinishedPulling="2026-04-22 15:09:05.542422643 +0000 UTC m=+19.402303940" observedRunningTime="2026-04-22 15:09:07.005842107 +0000 UTC m=+20.865723423" watchObservedRunningTime="2026-04-22 15:09:07.006768155 +0000 UTC m=+20.866649473"
Apr 22 15:09:07.066013 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:07.065916 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-tqshj" podStartSLOduration=3.532455403 podStartE2EDuration="21.065903561s" podCreationTimestamp="2026-04-22 15:08:46 +0000 UTC" firstStartedPulling="2026-04-22 15:08:47.964831467 +0000 UTC m=+1.824712760" lastFinishedPulling="2026-04-22 15:09:05.498279613 +0000 UTC m=+19.358160918" observedRunningTime="2026-04-22 15:09:07.034823079 +0000 UTC m=+20.894704391" watchObservedRunningTime="2026-04-22 15:09:07.065903561 +0000 UTC m=+20.925784875"
Apr 22 15:09:07.244787 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:07.244758 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-7lh45"
Apr 22 15:09:07.245314 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:07.245292 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-7lh45"
Apr 22 15:09:07.486587 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:07.486568 2559 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 22 15:09:07.658319 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:07.658166 2559 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T15:09:07.48658237Z","UUID":"558428c5-c1d6-4851-9c6e-185f1ae4cc23","Handler":null,"Name":"","Endpoint":""}
Apr 22 15:09:07.659793 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:07.659744 2559 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 22 15:09:07.659793 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:07.659771 2559 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 22 15:09:07.708366 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:07.708338 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w22tj"
Apr 22 15:09:07.708474 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:07.708430 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w22tj" podUID="7fff4b63-00a5-4c92-8135-75133481b228"
Apr 22 15:09:07.708474 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:07.708345 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8hzhw"
Apr 22 15:09:07.708620 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:07.708536 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8hzhw" podUID="a5671169-29ab-4157-acb0-207c8e83501a"
Apr 22 15:09:07.898049 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:07.898000 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-167.ec2.internal" event={"ID":"e9c285a467f3e432eb214510e6f590b8","Type":"ContainerStarted","Data":"ca90e658f24c80c7859f780a7e4f1c3873cecfbeddd7142c6f51a0bb605ab204"}
Apr 22 15:09:07.899730 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:07.899707 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kmhfn" event={"ID":"b77d43e4-313e-4cdc-9223-86662961daf5","Type":"ContainerStarted","Data":"75ba7c4603bb49a97673522a5ab00793466c058ef09e4e8f736f7cd6eb711d01"}
Apr 22 15:09:07.900373 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:07.900354 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-7lh45"
Apr 22 15:09:07.900881 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:07.900863 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-7lh45"
Apr 22 15:09:07.920334 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:07.920256 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-167.ec2.internal" podStartSLOduration=20.920245425 podStartE2EDuration="20.920245425s" podCreationTimestamp="2026-04-22 15:08:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:09:07.918289338 +0000 UTC m=+21.778170653" watchObservedRunningTime="2026-04-22 15:09:07.920245425 +0000 UTC m=+21.780126741"
Apr 22 15:09:08.904611 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:08.904536 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4bfdn_34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee/ovn-acl-logging/0.log"
Apr 22 15:09:08.905055 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:08.904992 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" event={"ID":"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee","Type":"ContainerStarted","Data":"f5a7d73e0edd483a6349e0f616553b7e6f81421c91f6b610b1feb9c8fcf11c19"}
Apr 22 15:09:08.906943 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:08.906902 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kmhfn" event={"ID":"b77d43e4-313e-4cdc-9223-86662961daf5","Type":"ContainerStarted","Data":"5377b13b9a2064db97ecbe2d88faf7dd3a83773b1a3ff3c493c5fe5dc6f57c57"}
Apr 22 15:09:08.926963 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:08.926910 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kmhfn" podStartSLOduration=2.356158746 podStartE2EDuration="22.926893326s" podCreationTimestamp="2026-04-22 15:08:46 +0000 UTC" firstStartedPulling="2026-04-22 15:08:47.926061092 +0000 UTC m=+1.785942388" lastFinishedPulling="2026-04-22 15:09:08.49679567 +0000 UTC m=+22.356676968" observedRunningTime="2026-04-22 15:09:08.926468217 +0000 UTC m=+22.786349533" watchObservedRunningTime="2026-04-22 15:09:08.926893326 +0000 UTC m=+22.786774643"
Apr 22 15:09:09.708937 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:09.708903 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w22tj"
Apr 22 15:09:09.708937 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:09.708941 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8hzhw"
Apr 22 15:09:09.709150 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:09.709020 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w22tj" podUID="7fff4b63-00a5-4c92-8135-75133481b228"
Apr 22 15:09:09.709212 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:09.709161 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8hzhw" podUID="a5671169-29ab-4157-acb0-207c8e83501a"
Apr 22 15:09:11.708416 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:11.708237 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w22tj"
Apr 22 15:09:11.709073 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:11.708239 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8hzhw"
Apr 22 15:09:11.709073 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:11.708482 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w22tj" podUID="7fff4b63-00a5-4c92-8135-75133481b228"
Apr 22 15:09:11.709073 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:11.708578 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8hzhw" podUID="a5671169-29ab-4157-acb0-207c8e83501a"
Apr 22 15:09:11.915004 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:11.914935 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4bfdn_34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee/ovn-acl-logging/0.log"
Apr 22 15:09:11.915329 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:11.915310 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" event={"ID":"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee","Type":"ContainerStarted","Data":"911746c6bdf192b71ad813e187398f029b19745ddc715e710a4c4b47ef70c458"}
Apr 22 15:09:11.915535 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:11.915518 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn"
Apr 22 15:09:11.915726 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:11.915708 2559 scope.go:117] "RemoveContainer" containerID="916015fb10e5c883f61c022eaf3230d93b188e6055c8a1e8beeca1f312fd80e2"
Apr 22 15:09:11.917032 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:11.917007 2559 generic.go:358] "Generic (PLEG): container finished" podID="89d42380-0740-498a-b66f-bfbe3da83afe" containerID="1c0435bb95a7e2d9c836e09a41a2126a2133795659c5fdb9f4a5025a70ca3e56" exitCode=0
Apr 22 15:09:11.917122 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:11.917043 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2scbl" event={"ID":"89d42380-0740-498a-b66f-bfbe3da83afe","Type":"ContainerDied","Data":"1c0435bb95a7e2d9c836e09a41a2126a2133795659c5fdb9f4a5025a70ca3e56"}
Apr 22 15:09:11.930470 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:11.930451 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn"
Apr 22 15:09:12.923382 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:12.923330 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4bfdn_34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee/ovn-acl-logging/0.log"
Apr 22 15:09:12.923855 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:12.923794 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" event={"ID":"34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee","Type":"ContainerStarted","Data":"6d7dc61ee8b0d4ddd9f47e0106df13443c85b8007e4220022f2755d218cc1272"}
Apr 22 15:09:12.924207 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:12.924184 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn"
Apr 22 15:09:12.924321 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:12.924225 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn"
Apr 22 15:09:12.926205 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:12.926179 2559 generic.go:358] "Generic (PLEG): container finished" podID="89d42380-0740-498a-b66f-bfbe3da83afe" containerID="a4ef971c96d06c7298da3961e7653fed94c9f102a37ef7912c17e7d848ec11bf" exitCode=0
Apr 22 15:09:12.926303 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:12.926224 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2scbl" event={"ID":"89d42380-0740-498a-b66f-bfbe3da83afe","Type":"ContainerDied","Data":"a4ef971c96d06c7298da3961e7653fed94c9f102a37ef7912c17e7d848ec11bf"}
Apr 22 15:09:12.939709 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:12.939688 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn"
Apr 22 15:09:12.999250 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:12.999048 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn" podStartSLOduration=9.433629672 podStartE2EDuration="26.999036389s" podCreationTimestamp="2026-04-22 15:08:46 +0000 UTC" firstStartedPulling="2026-04-22 15:08:47.988874912 +0000 UTC m=+1.848756205" lastFinishedPulling="2026-04-22 15:09:05.554281624 +0000 UTC m=+19.414162922" observedRunningTime="2026-04-22 15:09:12.969633568 +0000 UTC m=+26.829514883" watchObservedRunningTime="2026-04-22 15:09:12.999036389 +0000 UTC m=+26.858917703"
Apr 22 15:09:13.171862 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:13.171832 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8hzhw"]
Apr 22 15:09:13.172010 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:13.171942 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8hzhw"
Apr 22 15:09:13.172050 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:13.172024 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8hzhw" podUID="a5671169-29ab-4157-acb0-207c8e83501a"
Apr 22 15:09:13.175127 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:13.175095 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-w22tj"]
Apr 22 15:09:13.175241 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:13.175195 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w22tj"
Apr 22 15:09:13.175282 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:13.175264 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w22tj" podUID="7fff4b63-00a5-4c92-8135-75133481b228"
Apr 22 15:09:13.930030 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:13.929946 2559 generic.go:358] "Generic (PLEG): container finished" podID="89d42380-0740-498a-b66f-bfbe3da83afe" containerID="a39f4a5ea1c04eaeeb45ba40da60f23135497b252807c2108d1755b86c190049" exitCode=0
Apr 22 15:09:13.930357 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:13.930029 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2scbl" event={"ID":"89d42380-0740-498a-b66f-bfbe3da83afe","Type":"ContainerDied","Data":"a39f4a5ea1c04eaeeb45ba40da60f23135497b252807c2108d1755b86c190049"}
Apr 22 15:09:14.711674 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:14.710273 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8hzhw"
Apr 22 15:09:14.711674 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:14.710411 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8hzhw" podUID="a5671169-29ab-4157-acb0-207c8e83501a"
Apr 22 15:09:14.711674 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:14.710902 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w22tj"
Apr 22 15:09:14.711674 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:14.710996 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w22tj" podUID="7fff4b63-00a5-4c92-8135-75133481b228"
Apr 22 15:09:16.710045 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:16.710016 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w22tj"
Apr 22 15:09:16.710708 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:16.710123 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w22tj" podUID="7fff4b63-00a5-4c92-8135-75133481b228"
Apr 22 15:09:16.710708 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:16.710216 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8hzhw"
Apr 22 15:09:16.710708 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:16.710340 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8hzhw" podUID="a5671169-29ab-4157-acb0-207c8e83501a"
Apr 22 15:09:18.422014 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.421964 2559 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-167.ec2.internal" event="NodeReady"
Apr 22 15:09:18.422520 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.422139 2559 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 22 15:09:18.477248 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.477217 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-cmsst"]
Apr 22 15:09:18.497143 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.497121 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-ffzqh"]
Apr 22 15:09:18.497295 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.497277 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cmsst"
Apr 22 15:09:18.500078 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.500056 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 22 15:09:18.500175 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.500076 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 22 15:09:18.500376 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.500358 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fkblf\""
Apr 22 15:09:18.515342 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.515321 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ffzqh"]
Apr 22 15:09:18.515342 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.515344 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cmsst"]
Apr 22 15:09:18.515519 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.515431 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ffzqh"
Apr 22 15:09:18.518369 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.518349 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 22 15:09:18.518838 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.518819 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-cbqpc\""
Apr 22 15:09:18.518926 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.518845 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 22 15:09:18.519323 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.519302 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 22 15:09:18.630576 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.630497 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/57d3cf97-71a9-44b9-80ac-439b558c8604-config-volume\") pod \"dns-default-cmsst\" (UID: \"57d3cf97-71a9-44b9-80ac-439b558c8604\") " pod="openshift-dns/dns-default-cmsst"
Apr 22 15:09:18.630576 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.630556 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk5wh\" (UniqueName: \"kubernetes.io/projected/b464f475-8761-403b-bc74-03f05e7d52c5-kube-api-access-bk5wh\") pod \"ingress-canary-ffzqh\" (UID: \"b464f475-8761-403b-bc74-03f05e7d52c5\") " pod="openshift-ingress-canary/ingress-canary-ffzqh"
Apr 22 15:09:18.630785 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.630581 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/57d3cf97-71a9-44b9-80ac-439b558c8604-tmp-dir\") pod \"dns-default-cmsst\" (UID: \"57d3cf97-71a9-44b9-80ac-439b558c8604\") " pod="openshift-dns/dns-default-cmsst"
Apr 22 15:09:18.630785 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.630622 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2htr\" (UniqueName: \"kubernetes.io/projected/57d3cf97-71a9-44b9-80ac-439b558c8604-kube-api-access-b2htr\") pod \"dns-default-cmsst\" (UID: \"57d3cf97-71a9-44b9-80ac-439b558c8604\") " pod="openshift-dns/dns-default-cmsst"
Apr 22 15:09:18.630785 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.630645 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b464f475-8761-403b-bc74-03f05e7d52c5-cert\") pod \"ingress-canary-ffzqh\" (UID: \"b464f475-8761-403b-bc74-03f05e7d52c5\") " pod="openshift-ingress-canary/ingress-canary-ffzqh"
Apr 22 15:09:18.630785 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.630695 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/57d3cf97-71a9-44b9-80ac-439b558c8604-metrics-tls\") pod \"dns-default-cmsst\" (UID: \"57d3cf97-71a9-44b9-80ac-439b558c8604\") " pod="openshift-dns/dns-default-cmsst"
Apr 22 15:09:18.708527 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.708474 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w22tj"
Apr 22 15:09:18.708700 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.708481 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8hzhw"
Apr 22 15:09:18.711303 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.711273 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 15:09:18.711443 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.711414 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 15:09:18.711565 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.711548 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-4nbn2\""
Apr 22 15:09:18.711631 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.711566 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-62z2n\""
Apr 22 15:09:18.711631 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.711615 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 15:09:18.731756 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.731738 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bk5wh\" (UniqueName: \"kubernetes.io/projected/b464f475-8761-403b-bc74-03f05e7d52c5-kube-api-access-bk5wh\") pod \"ingress-canary-ffzqh\" (UID: \"b464f475-8761-403b-bc74-03f05e7d52c5\") " pod="openshift-ingress-canary/ingress-canary-ffzqh"
Apr 22 15:09:18.731859 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.731765 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/57d3cf97-71a9-44b9-80ac-439b558c8604-tmp-dir\") pod \"dns-default-cmsst\" (UID: \"57d3cf97-71a9-44b9-80ac-439b558c8604\") " pod="openshift-dns/dns-default-cmsst"
Apr 22 15:09:18.731859 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.731803 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2htr\" (UniqueName: \"kubernetes.io/projected/57d3cf97-71a9-44b9-80ac-439b558c8604-kube-api-access-b2htr\") pod \"dns-default-cmsst\" (UID: \"57d3cf97-71a9-44b9-80ac-439b558c8604\") " pod="openshift-dns/dns-default-cmsst"
Apr 22 15:09:18.731859 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.731826 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b464f475-8761-403b-bc74-03f05e7d52c5-cert\") pod \"ingress-canary-ffzqh\" (UID: \"b464f475-8761-403b-bc74-03f05e7d52c5\") " pod="openshift-ingress-canary/ingress-canary-ffzqh"
Apr 22 15:09:18.731859 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.731851 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/57d3cf97-71a9-44b9-80ac-439b558c8604-metrics-tls\") pod \"dns-default-cmsst\" (UID: \"57d3cf97-71a9-44b9-80ac-439b558c8604\") " pod="openshift-dns/dns-default-cmsst"
Apr 22 15:09:18.732048 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.731931 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/57d3cf97-71a9-44b9-80ac-439b558c8604-config-volume\") pod \"dns-default-cmsst\" (UID: \"57d3cf97-71a9-44b9-80ac-439b558c8604\") " pod="openshift-dns/dns-default-cmsst"
Apr 22 15:09:18.732048 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:18.732013 2559 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 15:09:18.732048 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:18.732024 2559 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 15:09:18.732183 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:18.732086 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b464f475-8761-403b-bc74-03f05e7d52c5-cert podName:b464f475-8761-403b-bc74-03f05e7d52c5 nodeName:}" failed. No retries permitted until 2026-04-22 15:09:19.232062325 +0000 UTC m=+33.091943630 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b464f475-8761-403b-bc74-03f05e7d52c5-cert") pod "ingress-canary-ffzqh" (UID: "b464f475-8761-403b-bc74-03f05e7d52c5") : secret "canary-serving-cert" not found
Apr 22 15:09:18.732183 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:18.732106 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57d3cf97-71a9-44b9-80ac-439b558c8604-metrics-tls podName:57d3cf97-71a9-44b9-80ac-439b558c8604 nodeName:}" failed. No retries permitted until 2026-04-22 15:09:19.232096755 +0000 UTC m=+33.091978052 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/57d3cf97-71a9-44b9-80ac-439b558c8604-metrics-tls") pod "dns-default-cmsst" (UID: "57d3cf97-71a9-44b9-80ac-439b558c8604") : secret "dns-default-metrics-tls" not found
Apr 22 15:09:18.739754 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.739728 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/57d3cf97-71a9-44b9-80ac-439b558c8604-tmp-dir\") pod \"dns-default-cmsst\" (UID: \"57d3cf97-71a9-44b9-80ac-439b558c8604\") " pod="openshift-dns/dns-default-cmsst"
Apr 22 15:09:18.739934 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.739918 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/57d3cf97-71a9-44b9-80ac-439b558c8604-config-volume\") pod \"dns-default-cmsst\" (UID: \"57d3cf97-71a9-44b9-80ac-439b558c8604\") " pod="openshift-dns/dns-default-cmsst"
Apr 22 15:09:18.747048 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.747029 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2htr\" (UniqueName: \"kubernetes.io/projected/57d3cf97-71a9-44b9-80ac-439b558c8604-kube-api-access-b2htr\") pod \"dns-default-cmsst\" (UID: \"57d3cf97-71a9-44b9-80ac-439b558c8604\") " pod="openshift-dns/dns-default-cmsst"
Apr 22 15:09:18.747202 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:18.747180 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk5wh\" (UniqueName: \"kubernetes.io/projected/b464f475-8761-403b-bc74-03f05e7d52c5-kube-api-access-bk5wh\") pod \"ingress-canary-ffzqh\" (UID: \"b464f475-8761-403b-bc74-03f05e7d52c5\") " pod="openshift-ingress-canary/ingress-canary-ffzqh"
Apr 22 15:09:19.236517 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:19.236466 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b464f475-8761-403b-bc74-03f05e7d52c5-cert\") pod \"ingress-canary-ffzqh\" (UID: \"b464f475-8761-403b-bc74-03f05e7d52c5\") " pod="openshift-ingress-canary/ingress-canary-ffzqh"
Apr 22 15:09:19.236701 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:19.236530 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/57d3cf97-71a9-44b9-80ac-439b558c8604-metrics-tls\") pod \"dns-default-cmsst\" (UID: \"57d3cf97-71a9-44b9-80ac-439b558c8604\") " pod="openshift-dns/dns-default-cmsst"
Apr 22 15:09:19.236701 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:19.236626 2559 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 15:09:19.236701 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:19.236657 2559 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22
15:09:19.236701 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:19.236702 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b464f475-8761-403b-bc74-03f05e7d52c5-cert podName:b464f475-8761-403b-bc74-03f05e7d52c5 nodeName:}" failed. No retries permitted until 2026-04-22 15:09:20.236684131 +0000 UTC m=+34.096565425 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b464f475-8761-403b-bc74-03f05e7d52c5-cert") pod "ingress-canary-ffzqh" (UID: "b464f475-8761-403b-bc74-03f05e7d52c5") : secret "canary-serving-cert" not found Apr 22 15:09:19.236934 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:19.236717 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57d3cf97-71a9-44b9-80ac-439b558c8604-metrics-tls podName:57d3cf97-71a9-44b9-80ac-439b558c8604 nodeName:}" failed. No retries permitted until 2026-04-22 15:09:20.23671095 +0000 UTC m=+34.096592243 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/57d3cf97-71a9-44b9-80ac-439b558c8604-metrics-tls") pod "dns-default-cmsst" (UID: "57d3cf97-71a9-44b9-80ac-439b558c8604") : secret "dns-default-metrics-tls" not found
Apr 22 15:09:19.438545 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:19.438504 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5671169-29ab-4157-acb0-207c8e83501a-metrics-certs\") pod \"network-metrics-daemon-8hzhw\" (UID: \"a5671169-29ab-4157-acb0-207c8e83501a\") " pod="openshift-multus/network-metrics-daemon-8hzhw"
Apr 22 15:09:19.439013 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:19.438686 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 15:09:19.439013 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:19.438776 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5671169-29ab-4157-acb0-207c8e83501a-metrics-certs podName:a5671169-29ab-4157-acb0-207c8e83501a nodeName:}" failed. No retries permitted until 2026-04-22 15:09:51.438750961 +0000 UTC m=+65.298632257 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a5671169-29ab-4157-acb0-207c8e83501a-metrics-certs") pod "network-metrics-daemon-8hzhw" (UID: "a5671169-29ab-4157-acb0-207c8e83501a") : secret "metrics-daemon-secret" not found
Apr 22 15:09:19.539180 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:19.539093 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkzp9\" (UniqueName: \"kubernetes.io/projected/7fff4b63-00a5-4c92-8135-75133481b228-kube-api-access-jkzp9\") pod \"network-check-target-w22tj\" (UID: \"7fff4b63-00a5-4c92-8135-75133481b228\") " pod="openshift-network-diagnostics/network-check-target-w22tj"
Apr 22 15:09:19.541824 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:19.541801 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkzp9\" (UniqueName: \"kubernetes.io/projected/7fff4b63-00a5-4c92-8135-75133481b228-kube-api-access-jkzp9\") pod \"network-check-target-w22tj\" (UID: \"7fff4b63-00a5-4c92-8135-75133481b228\") " pod="openshift-network-diagnostics/network-check-target-w22tj"
Apr 22 15:09:19.619377 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:19.619344 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w22tj"
Apr 22 15:09:19.902995 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:19.902774 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-w22tj"]
Apr 22 15:09:19.990051 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:09:19.990020 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fff4b63_00a5_4c92_8135_75133481b228.slice/crio-7357201eb525f7a7f273ecc30cdb94cf0cf493667848c4ce97370ee14a6235aa WatchSource:0}: Error finding container 7357201eb525f7a7f273ecc30cdb94cf0cf493667848c4ce97370ee14a6235aa: Status 404 returned error can't find the container with id 7357201eb525f7a7f273ecc30cdb94cf0cf493667848c4ce97370ee14a6235aa
Apr 22 15:09:20.243827 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:20.243789 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b464f475-8761-403b-bc74-03f05e7d52c5-cert\") pod \"ingress-canary-ffzqh\" (UID: \"b464f475-8761-403b-bc74-03f05e7d52c5\") " pod="openshift-ingress-canary/ingress-canary-ffzqh"
Apr 22 15:09:20.243827 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:20.243824 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/57d3cf97-71a9-44b9-80ac-439b558c8604-metrics-tls\") pod \"dns-default-cmsst\" (UID: \"57d3cf97-71a9-44b9-80ac-439b558c8604\") " pod="openshift-dns/dns-default-cmsst"
Apr 22 15:09:20.244069 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:20.243936 2559 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 15:09:20.244069 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:20.243988 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57d3cf97-71a9-44b9-80ac-439b558c8604-metrics-tls podName:57d3cf97-71a9-44b9-80ac-439b558c8604 nodeName:}" failed. No retries permitted until 2026-04-22 15:09:22.243975022 +0000 UTC m=+36.103856315 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/57d3cf97-71a9-44b9-80ac-439b558c8604-metrics-tls") pod "dns-default-cmsst" (UID: "57d3cf97-71a9-44b9-80ac-439b558c8604") : secret "dns-default-metrics-tls" not found
Apr 22 15:09:20.244069 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:20.243936 2559 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 15:09:20.244069 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:20.244049 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b464f475-8761-403b-bc74-03f05e7d52c5-cert podName:b464f475-8761-403b-bc74-03f05e7d52c5 nodeName:}" failed. No retries permitted until 2026-04-22 15:09:22.244037045 +0000 UTC m=+36.103918343 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b464f475-8761-403b-bc74-03f05e7d52c5-cert") pod "ingress-canary-ffzqh" (UID: "b464f475-8761-403b-bc74-03f05e7d52c5") : secret "canary-serving-cert" not found
Apr 22 15:09:20.945417 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:20.945377 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-w22tj" event={"ID":"7fff4b63-00a5-4c92-8135-75133481b228","Type":"ContainerStarted","Data":"7357201eb525f7a7f273ecc30cdb94cf0cf493667848c4ce97370ee14a6235aa"}
Apr 22 15:09:20.948248 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:20.948214 2559 generic.go:358] "Generic (PLEG): container finished" podID="89d42380-0740-498a-b66f-bfbe3da83afe" containerID="2a4d02531a26cfd9d11cb95e8c86331ae3ac1ce295d21197f9945225ac12440f" exitCode=0
Apr 22 15:09:20.948372 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:20.948262 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2scbl" event={"ID":"89d42380-0740-498a-b66f-bfbe3da83afe","Type":"ContainerDied","Data":"2a4d02531a26cfd9d11cb95e8c86331ae3ac1ce295d21197f9945225ac12440f"}
Apr 22 15:09:21.954296 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:21.954116 2559 generic.go:358] "Generic (PLEG): container finished" podID="89d42380-0740-498a-b66f-bfbe3da83afe" containerID="d43aa1a75ab3a24593d1796847ca5f7a0173f8f5f2efea3e9e3944ddbccfe0f3" exitCode=0
Apr 22 15:09:21.954746 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:21.954195 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2scbl" event={"ID":"89d42380-0740-498a-b66f-bfbe3da83afe","Type":"ContainerDied","Data":"d43aa1a75ab3a24593d1796847ca5f7a0173f8f5f2efea3e9e3944ddbccfe0f3"}
Apr 22 15:09:22.259980 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:22.259903 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b464f475-8761-403b-bc74-03f05e7d52c5-cert\") pod \"ingress-canary-ffzqh\" (UID: \"b464f475-8761-403b-bc74-03f05e7d52c5\") " pod="openshift-ingress-canary/ingress-canary-ffzqh"
Apr 22 15:09:22.259980 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:22.259941 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/57d3cf97-71a9-44b9-80ac-439b558c8604-metrics-tls\") pod \"dns-default-cmsst\" (UID: \"57d3cf97-71a9-44b9-80ac-439b558c8604\") " pod="openshift-dns/dns-default-cmsst"
Apr 22 15:09:22.260203 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:22.260045 2559 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 15:09:22.260203 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:22.260086 2559 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 15:09:22.260203 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:22.260143 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57d3cf97-71a9-44b9-80ac-439b558c8604-metrics-tls podName:57d3cf97-71a9-44b9-80ac-439b558c8604 nodeName:}" failed. No retries permitted until 2026-04-22 15:09:26.260127912 +0000 UTC m=+40.120009205 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/57d3cf97-71a9-44b9-80ac-439b558c8604-metrics-tls") pod "dns-default-cmsst" (UID: "57d3cf97-71a9-44b9-80ac-439b558c8604") : secret "dns-default-metrics-tls" not found
Apr 22 15:09:22.260203 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:22.260158 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b464f475-8761-403b-bc74-03f05e7d52c5-cert podName:b464f475-8761-403b-bc74-03f05e7d52c5 nodeName:}" failed.
No retries permitted until 2026-04-22 15:09:26.26015075 +0000 UTC m=+40.120032042 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b464f475-8761-403b-bc74-03f05e7d52c5-cert") pod "ingress-canary-ffzqh" (UID: "b464f475-8761-403b-bc74-03f05e7d52c5") : secret "canary-serving-cert" not found
Apr 22 15:09:22.958914 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:22.958838 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2scbl" event={"ID":"89d42380-0740-498a-b66f-bfbe3da83afe","Type":"ContainerStarted","Data":"580bcd808f12f95317eea171aefa2df3179e147c2f0778cc0f489c2462c108b3"}
Apr 22 15:09:22.985677 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:22.985624 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-2scbl" podStartSLOduration=4.948717398 podStartE2EDuration="36.985611163s" podCreationTimestamp="2026-04-22 15:08:46 +0000 UTC" firstStartedPulling="2026-04-22 15:08:47.983243666 +0000 UTC m=+1.843124958" lastFinishedPulling="2026-04-22 15:09:20.020137428 +0000 UTC m=+33.880018723" observedRunningTime="2026-04-22 15:09:22.983955944 +0000 UTC m=+36.843837259" watchObservedRunningTime="2026-04-22 15:09:22.985611163 +0000 UTC m=+36.845492478"
Apr 22 15:09:23.961880 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:23.961845 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-w22tj" event={"ID":"7fff4b63-00a5-4c92-8135-75133481b228","Type":"ContainerStarted","Data":"1334ceaec6e8b0ddadbc0cf406aa668aa4af0183ad1c127ee62b5da22429824b"}
Apr 22 15:09:23.962325 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:23.961993 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-w22tj"
Apr 22 15:09:23.977691 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:23.977645 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-w22tj" podStartSLOduration=33.958730118 podStartE2EDuration="36.977634767s" podCreationTimestamp="2026-04-22 15:08:47 +0000 UTC" firstStartedPulling="2026-04-22 15:09:19.996523674 +0000 UTC m=+33.856404967" lastFinishedPulling="2026-04-22 15:09:23.01542832 +0000 UTC m=+36.875309616" observedRunningTime="2026-04-22 15:09:23.976868769 +0000 UTC m=+37.836750084" watchObservedRunningTime="2026-04-22 15:09:23.977634767 +0000 UTC m=+37.837516074"
Apr 22 15:09:26.287658 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:26.287612 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b464f475-8761-403b-bc74-03f05e7d52c5-cert\") pod \"ingress-canary-ffzqh\" (UID: \"b464f475-8761-403b-bc74-03f05e7d52c5\") " pod="openshift-ingress-canary/ingress-canary-ffzqh"
Apr 22 15:09:26.287658 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:26.287657 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/57d3cf97-71a9-44b9-80ac-439b558c8604-metrics-tls\") pod \"dns-default-cmsst\" (UID: \"57d3cf97-71a9-44b9-80ac-439b558c8604\") " pod="openshift-dns/dns-default-cmsst"
Apr 22 15:09:26.288150 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:26.287760 2559 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 15:09:26.288150 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:26.287764 2559 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 15:09:26.288150 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:26.287814 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57d3cf97-71a9-44b9-80ac-439b558c8604-metrics-tls podName:57d3cf97-71a9-44b9-80ac-439b558c8604 nodeName:}" failed. No retries permitted until 2026-04-22 15:09:34.287799262 +0000 UTC m=+48.147680555 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/57d3cf97-71a9-44b9-80ac-439b558c8604-metrics-tls") pod "dns-default-cmsst" (UID: "57d3cf97-71a9-44b9-80ac-439b558c8604") : secret "dns-default-metrics-tls" not found
Apr 22 15:09:26.288150 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:26.287828 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b464f475-8761-403b-bc74-03f05e7d52c5-cert podName:b464f475-8761-403b-bc74-03f05e7d52c5 nodeName:}" failed. No retries permitted until 2026-04-22 15:09:34.287822201 +0000 UTC m=+48.147703494 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b464f475-8761-403b-bc74-03f05e7d52c5-cert") pod "ingress-canary-ffzqh" (UID: "b464f475-8761-403b-bc74-03f05e7d52c5") : secret "canary-serving-cert" not found
Apr 22 15:09:34.342996 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:34.342954 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b464f475-8761-403b-bc74-03f05e7d52c5-cert\") pod \"ingress-canary-ffzqh\" (UID: \"b464f475-8761-403b-bc74-03f05e7d52c5\") " pod="openshift-ingress-canary/ingress-canary-ffzqh"
Apr 22 15:09:34.342996 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:34.342996 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/57d3cf97-71a9-44b9-80ac-439b558c8604-metrics-tls\") pod \"dns-default-cmsst\" (UID: \"57d3cf97-71a9-44b9-80ac-439b558c8604\") " pod="openshift-dns/dns-default-cmsst"
Apr 22 15:09:34.343425 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:34.343101 2559 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 15:09:34.343425 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:34.343102 2559 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 15:09:34.343425 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:34.343164 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57d3cf97-71a9-44b9-80ac-439b558c8604-metrics-tls podName:57d3cf97-71a9-44b9-80ac-439b558c8604 nodeName:}" failed. No retries permitted until 2026-04-22 15:09:50.343149387 +0000 UTC m=+64.203030679 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/57d3cf97-71a9-44b9-80ac-439b558c8604-metrics-tls") pod "dns-default-cmsst" (UID: "57d3cf97-71a9-44b9-80ac-439b558c8604") : secret "dns-default-metrics-tls" not found
Apr 22 15:09:34.343425 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:34.343177 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b464f475-8761-403b-bc74-03f05e7d52c5-cert podName:b464f475-8761-403b-bc74-03f05e7d52c5 nodeName:}" failed. No retries permitted until 2026-04-22 15:09:50.343171306 +0000 UTC m=+64.203052598 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b464f475-8761-403b-bc74-03f05e7d52c5-cert") pod "ingress-canary-ffzqh" (UID: "b464f475-8761-403b-bc74-03f05e7d52c5") : secret "canary-serving-cert" not found
Apr 22 15:09:44.943319 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:44.943283 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4bfdn"
Apr 22 15:09:50.354508 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:50.354446 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b464f475-8761-403b-bc74-03f05e7d52c5-cert\") pod \"ingress-canary-ffzqh\" (UID: \"b464f475-8761-403b-bc74-03f05e7d52c5\") " pod="openshift-ingress-canary/ingress-canary-ffzqh"
Apr 22 15:09:50.354508 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:50.354513 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/57d3cf97-71a9-44b9-80ac-439b558c8604-metrics-tls\") pod \"dns-default-cmsst\" (UID: \"57d3cf97-71a9-44b9-80ac-439b558c8604\") " pod="openshift-dns/dns-default-cmsst"
Apr 22 15:09:50.354965 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:50.354600 2559 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 15:09:50.354965 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:50.354655 2559 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 15:09:50.354965 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:50.354663 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b464f475-8761-403b-bc74-03f05e7d52c5-cert podName:b464f475-8761-403b-bc74-03f05e7d52c5 nodeName:}" failed. No retries permitted until 2026-04-22 15:10:22.354647737 +0000 UTC m=+96.214529031 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b464f475-8761-403b-bc74-03f05e7d52c5-cert") pod "ingress-canary-ffzqh" (UID: "b464f475-8761-403b-bc74-03f05e7d52c5") : secret "canary-serving-cert" not found
Apr 22 15:09:50.354965 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:50.354717 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57d3cf97-71a9-44b9-80ac-439b558c8604-metrics-tls podName:57d3cf97-71a9-44b9-80ac-439b558c8604 nodeName:}" failed. No retries permitted until 2026-04-22 15:10:22.354701567 +0000 UTC m=+96.214582869 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/57d3cf97-71a9-44b9-80ac-439b558c8604-metrics-tls") pod "dns-default-cmsst" (UID: "57d3cf97-71a9-44b9-80ac-439b558c8604") : secret "dns-default-metrics-tls" not found
Apr 22 15:09:51.462129 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:51.462082 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5671169-29ab-4157-acb0-207c8e83501a-metrics-certs\") pod \"network-metrics-daemon-8hzhw\" (UID: \"a5671169-29ab-4157-acb0-207c8e83501a\") " pod="openshift-multus/network-metrics-daemon-8hzhw"
Apr 22 15:09:51.462594 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:51.462229 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 15:09:51.462594 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:09:51.462309 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5671169-29ab-4157-acb0-207c8e83501a-metrics-certs podName:a5671169-29ab-4157-acb0-207c8e83501a nodeName:}" failed.
No retries permitted until 2026-04-22 15:10:55.462291458 +0000 UTC m=+129.322172756 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a5671169-29ab-4157-acb0-207c8e83501a-metrics-certs") pod "network-metrics-daemon-8hzhw" (UID: "a5671169-29ab-4157-acb0-207c8e83501a") : secret "metrics-daemon-secret" not found
Apr 22 15:09:54.966860 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:09:54.966823 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-w22tj"
Apr 22 15:10:22.372884 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:22.372768 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b464f475-8761-403b-bc74-03f05e7d52c5-cert\") pod \"ingress-canary-ffzqh\" (UID: \"b464f475-8761-403b-bc74-03f05e7d52c5\") " pod="openshift-ingress-canary/ingress-canary-ffzqh"
Apr 22 15:10:22.372884 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:22.372806 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/57d3cf97-71a9-44b9-80ac-439b558c8604-metrics-tls\") pod \"dns-default-cmsst\" (UID: \"57d3cf97-71a9-44b9-80ac-439b558c8604\") " pod="openshift-dns/dns-default-cmsst"
Apr 22 15:10:22.373398 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:10:22.372910 2559 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 15:10:22.373398 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:10:22.372912 2559 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 15:10:22.373398 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:10:22.372961 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57d3cf97-71a9-44b9-80ac-439b558c8604-metrics-tls podName:57d3cf97-71a9-44b9-80ac-439b558c8604 nodeName:}" failed. No retries permitted until 2026-04-22 15:11:26.372949065 +0000 UTC m=+160.232830357 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/57d3cf97-71a9-44b9-80ac-439b558c8604-metrics-tls") pod "dns-default-cmsst" (UID: "57d3cf97-71a9-44b9-80ac-439b558c8604") : secret "dns-default-metrics-tls" not found
Apr 22 15:10:22.373398 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:10:22.372975 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b464f475-8761-403b-bc74-03f05e7d52c5-cert podName:b464f475-8761-403b-bc74-03f05e7d52c5 nodeName:}" failed. No retries permitted until 2026-04-22 15:11:26.372969248 +0000 UTC m=+160.232850541 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b464f475-8761-403b-bc74-03f05e7d52c5-cert") pod "ingress-canary-ffzqh" (UID: "b464f475-8761-403b-bc74-03f05e7d52c5") : secret "canary-serving-cert" not found
Apr 22 15:10:39.667571 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.667539 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qm2z5"]
Apr 22 15:10:39.670243 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.670226 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-589c889464-99f7x"]
Apr 22 15:10:39.670396 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.670376 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qm2z5"
Apr 22 15:10:39.672693 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.672674 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-589c889464-99f7x"
Apr 22 15:10:39.672803 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.672726 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 22 15:10:39.673144 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.673128 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 22 15:10:39.673280 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.673258 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 22 15:10:39.673366 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.673312 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-nhp45\""
Apr 22 15:10:39.675478 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.675458 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 22 15:10:39.675693 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.675677 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 22 15:10:39.675749 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.675688 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 22 15:10:39.675749 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.675677 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 22 15:10:39.676004 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.675989 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-w28zp\""
Apr 22 15:10:39.676044 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.676032 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 22 15:10:39.680356 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.680340 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 22 15:10:39.683422 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.683404 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qm2z5"]
Apr 22 15:10:39.684437 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.684416 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-589c889464-99f7x"]
Apr 22 15:10:39.785753 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.785720 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b6892fe2-bf35-4568-aa5b-cb281e67c570-stats-auth\") pod \"router-default-589c889464-99f7x\" (UID: \"b6892fe2-bf35-4568-aa5b-cb281e67c570\") " pod="openshift-ingress/router-default-589c889464-99f7x"
Apr 22 15:10:39.785753 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.785754 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9xhv\" (UniqueName: \"kubernetes.io/projected/b6892fe2-bf35-4568-aa5b-cb281e67c570-kube-api-access-h9xhv\") pod \"router-default-589c889464-99f7x\" (UID: \"b6892fe2-bf35-4568-aa5b-cb281e67c570\") " pod="openshift-ingress/router-default-589c889464-99f7x"
Apr 22 15:10:39.785947 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.785781 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86nvs\" (UniqueName: \"kubernetes.io/projected/f963f6da-e36a-4b61-ade2-ceb2efdb3eb2-kube-api-access-86nvs\") pod \"cluster-samples-operator-6dc5bdb6b4-qm2z5\" (UID: \"f963f6da-e36a-4b61-ade2-ceb2efdb3eb2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qm2z5"
Apr 22 15:10:39.785947 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.785797 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6892fe2-bf35-4568-aa5b-cb281e67c570-metrics-certs\") pod \"router-default-589c889464-99f7x\" (UID: \"b6892fe2-bf35-4568-aa5b-cb281e67c570\") " pod="openshift-ingress/router-default-589c889464-99f7x"
Apr 22 15:10:39.785947 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.785883 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f963f6da-e36a-4b61-ade2-ceb2efdb3eb2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qm2z5\" (UID: \"f963f6da-e36a-4b61-ade2-ceb2efdb3eb2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qm2z5"
Apr 22 15:10:39.785947 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.785922 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b6892fe2-bf35-4568-aa5b-cb281e67c570-default-certificate\") pod \"router-default-589c889464-99f7x\" (UID: \"b6892fe2-bf35-4568-aa5b-cb281e67c570\") " pod="openshift-ingress/router-default-589c889464-99f7x"
Apr 22 15:10:39.785947 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.785937 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6892fe2-bf35-4568-aa5b-cb281e67c570-service-ca-bundle\") pod \"router-default-589c889464-99f7x\" (UID: \"b6892fe2-bf35-4568-aa5b-cb281e67c570\") " pod="openshift-ingress/router-default-589c889464-99f7x"
Apr 22 15:10:39.886335 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.886290 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f963f6da-e36a-4b61-ade2-ceb2efdb3eb2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qm2z5\" (UID: \"f963f6da-e36a-4b61-ade2-ceb2efdb3eb2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qm2z5"
Apr 22 15:10:39.886543 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.886377 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b6892fe2-bf35-4568-aa5b-cb281e67c570-default-certificate\") pod \"router-default-589c889464-99f7x\" (UID: \"b6892fe2-bf35-4568-aa5b-cb281e67c570\") " pod="openshift-ingress/router-default-589c889464-99f7x"
Apr 22 15:10:39.886543 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:10:39.886384 2559 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 15:10:39.886543 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.886395 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6892fe2-bf35-4568-aa5b-cb281e67c570-service-ca-bundle\") pod \"router-default-589c889464-99f7x\" (UID: \"b6892fe2-bf35-4568-aa5b-cb281e67c570\") " pod="openshift-ingress/router-default-589c889464-99f7x"
Apr 22 15:10:39.886543 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:10:39.886472 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f963f6da-e36a-4b61-ade2-ceb2efdb3eb2-samples-operator-tls podName:f963f6da-e36a-4b61-ade2-ceb2efdb3eb2 nodeName:}" failed.
No retries permitted until 2026-04-22 15:10:40.386450387 +0000 UTC m=+114.246331681 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f963f6da-e36a-4b61-ade2-ceb2efdb3eb2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-qm2z5" (UID: "f963f6da-e36a-4b61-ade2-ceb2efdb3eb2") : secret "samples-operator-tls" not found
Apr 22 15:10:39.886543 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:10:39.886543 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b6892fe2-bf35-4568-aa5b-cb281e67c570-service-ca-bundle podName:b6892fe2-bf35-4568-aa5b-cb281e67c570 nodeName:}" failed. No retries permitted until 2026-04-22 15:10:40.386528817 +0000 UTC m=+114.246410136 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b6892fe2-bf35-4568-aa5b-cb281e67c570-service-ca-bundle") pod "router-default-589c889464-99f7x" (UID: "b6892fe2-bf35-4568-aa5b-cb281e67c570") : configmap references non-existent config key: service-ca.crt
Apr 22 15:10:39.886763 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.886572 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b6892fe2-bf35-4568-aa5b-cb281e67c570-stats-auth\") pod \"router-default-589c889464-99f7x\" (UID: \"b6892fe2-bf35-4568-aa5b-cb281e67c570\") " pod="openshift-ingress/router-default-589c889464-99f7x"
Apr 22 15:10:39.886763 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.886590 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h9xhv\" (UniqueName: \"kubernetes.io/projected/b6892fe2-bf35-4568-aa5b-cb281e67c570-kube-api-access-h9xhv\") pod \"router-default-589c889464-99f7x\" (UID: \"b6892fe2-bf35-4568-aa5b-cb281e67c570\") " pod="openshift-ingress/router-default-589c889464-99f7x"
Apr 22 15:10:39.886763 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.886615 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-86nvs\" (UniqueName: \"kubernetes.io/projected/f963f6da-e36a-4b61-ade2-ceb2efdb3eb2-kube-api-access-86nvs\") pod \"cluster-samples-operator-6dc5bdb6b4-qm2z5\" (UID: \"f963f6da-e36a-4b61-ade2-ceb2efdb3eb2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qm2z5"
Apr 22 15:10:39.886763 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.886708 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6892fe2-bf35-4568-aa5b-cb281e67c570-metrics-certs\") pod \"router-default-589c889464-99f7x\" (UID: \"b6892fe2-bf35-4568-aa5b-cb281e67c570\") " pod="openshift-ingress/router-default-589c889464-99f7x"
Apr 22 15:10:39.886909 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:10:39.886838 2559 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 15:10:39.886909 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:10:39.886900 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6892fe2-bf35-4568-aa5b-cb281e67c570-metrics-certs podName:b6892fe2-bf35-4568-aa5b-cb281e67c570 nodeName:}" failed. No retries permitted until 2026-04-22 15:10:40.386883988 +0000 UTC m=+114.246765281 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b6892fe2-bf35-4568-aa5b-cb281e67c570-metrics-certs") pod "router-default-589c889464-99f7x" (UID: "b6892fe2-bf35-4568-aa5b-cb281e67c570") : secret "router-metrics-certs-default" not found
Apr 22 15:10:39.890248 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.890228 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b6892fe2-bf35-4568-aa5b-cb281e67c570-default-certificate\") pod \"router-default-589c889464-99f7x\" (UID: \"b6892fe2-bf35-4568-aa5b-cb281e67c570\") " pod="openshift-ingress/router-default-589c889464-99f7x"
Apr 22 15:10:39.890324 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.890305 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b6892fe2-bf35-4568-aa5b-cb281e67c570-stats-auth\") pod \"router-default-589c889464-99f7x\" (UID: \"b6892fe2-bf35-4568-aa5b-cb281e67c570\") " pod="openshift-ingress/router-default-589c889464-99f7x"
Apr 22 15:10:39.894729 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.894705 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f2jrk"]
Apr 22 15:10:39.897691 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.897677 2559 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f2jrk"
Apr 22 15:10:39.901071 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.901048 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 22 15:10:39.901168 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.901059 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 22 15:10:39.901226 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.901167 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 22 15:10:39.901373 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.901358 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 22 15:10:39.901373 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.901368 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-9xkx8\""
Apr 22 15:10:39.905337 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.905311 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-86nvs\" (UniqueName: \"kubernetes.io/projected/f963f6da-e36a-4b61-ade2-ceb2efdb3eb2-kube-api-access-86nvs\") pod \"cluster-samples-operator-6dc5bdb6b4-qm2z5\" (UID: \"f963f6da-e36a-4b61-ade2-ceb2efdb3eb2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qm2z5"
Apr 22 15:10:39.905572 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.905553 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9xhv\" (UniqueName: \"kubernetes.io/projected/b6892fe2-bf35-4568-aa5b-cb281e67c570-kube-api-access-h9xhv\") pod \"router-default-589c889464-99f7x\" (UID: \"b6892fe2-bf35-4568-aa5b-cb281e67c570\") " pod="openshift-ingress/router-default-589c889464-99f7x"
Apr 22 15:10:39.912780 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.912761 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f2jrk"]
Apr 22 15:10:39.987037 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.987012 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmwbv\" (UniqueName: \"kubernetes.io/projected/ef01d307-3481-4c5f-97e9-b2c215eab28d-kube-api-access-vmwbv\") pod \"service-ca-operator-d6fc45fc5-f2jrk\" (UID: \"ef01d307-3481-4c5f-97e9-b2c215eab28d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f2jrk"
Apr 22 15:10:39.987150 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.987090 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef01d307-3481-4c5f-97e9-b2c215eab28d-serving-cert\") pod \"service-ca-operator-d6fc45fc5-f2jrk\" (UID: \"ef01d307-3481-4c5f-97e9-b2c215eab28d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f2jrk"
Apr 22 15:10:39.987150 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:39.987122 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef01d307-3481-4c5f-97e9-b2c215eab28d-config\") pod \"service-ca-operator-d6fc45fc5-f2jrk\" (UID: \"ef01d307-3481-4c5f-97e9-b2c215eab28d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f2jrk"
Apr 22 15:10:40.088041 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:40.088010 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef01d307-3481-4c5f-97e9-b2c215eab28d-config\") pod \"service-ca-operator-d6fc45fc5-f2jrk\" (UID: \"ef01d307-3481-4c5f-97e9-b2c215eab28d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f2jrk"
Apr 22 15:10:40.088187 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:40.088050 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vmwbv\" (UniqueName: \"kubernetes.io/projected/ef01d307-3481-4c5f-97e9-b2c215eab28d-kube-api-access-vmwbv\") pod \"service-ca-operator-d6fc45fc5-f2jrk\" (UID: \"ef01d307-3481-4c5f-97e9-b2c215eab28d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f2jrk"
Apr 22 15:10:40.088261 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:40.088237 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef01d307-3481-4c5f-97e9-b2c215eab28d-serving-cert\") pod \"service-ca-operator-d6fc45fc5-f2jrk\" (UID: \"ef01d307-3481-4c5f-97e9-b2c215eab28d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f2jrk"
Apr 22 15:10:40.088568 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:40.088552 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef01d307-3481-4c5f-97e9-b2c215eab28d-config\") pod \"service-ca-operator-d6fc45fc5-f2jrk\" (UID: \"ef01d307-3481-4c5f-97e9-b2c215eab28d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f2jrk"
Apr 22 15:10:40.090253 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:40.090228 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef01d307-3481-4c5f-97e9-b2c215eab28d-serving-cert\") pod \"service-ca-operator-d6fc45fc5-f2jrk\" (UID: \"ef01d307-3481-4c5f-97e9-b2c215eab28d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f2jrk"
Apr 22 15:10:40.095647 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:40.095628 2559
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmwbv\" (UniqueName: \"kubernetes.io/projected/ef01d307-3481-4c5f-97e9-b2c215eab28d-kube-api-access-vmwbv\") pod \"service-ca-operator-d6fc45fc5-f2jrk\" (UID: \"ef01d307-3481-4c5f-97e9-b2c215eab28d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f2jrk"
Apr 22 15:10:40.206330 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:40.206300 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f2jrk"
Apr 22 15:10:40.316649 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:40.316615 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f2jrk"]
Apr 22 15:10:40.320556 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:10:40.320520 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef01d307_3481_4c5f_97e9_b2c215eab28d.slice/crio-666baeda161a14e0b0eaeb727b6567b245001259d879868e38f90c4a952a0564 WatchSource:0}: Error finding container 666baeda161a14e0b0eaeb727b6567b245001259d879868e38f90c4a952a0564: Status 404 returned error can't find the container with id 666baeda161a14e0b0eaeb727b6567b245001259d879868e38f90c4a952a0564
Apr 22 15:10:40.390136 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:40.390111 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f963f6da-e36a-4b61-ade2-ceb2efdb3eb2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qm2z5\" (UID: \"f963f6da-e36a-4b61-ade2-ceb2efdb3eb2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qm2z5"
Apr 22 15:10:40.390247 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:40.390150 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6892fe2-bf35-4568-aa5b-cb281e67c570-service-ca-bundle\") pod \"router-default-589c889464-99f7x\" (UID: \"b6892fe2-bf35-4568-aa5b-cb281e67c570\") " pod="openshift-ingress/router-default-589c889464-99f7x"
Apr 22 15:10:40.390247 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:40.390187 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6892fe2-bf35-4568-aa5b-cb281e67c570-metrics-certs\") pod \"router-default-589c889464-99f7x\" (UID: \"b6892fe2-bf35-4568-aa5b-cb281e67c570\") " pod="openshift-ingress/router-default-589c889464-99f7x"
Apr 22 15:10:40.390351 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:10:40.390269 2559 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 15:10:40.390351 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:10:40.390268 2559 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 15:10:40.390351 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:10:40.390315 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6892fe2-bf35-4568-aa5b-cb281e67c570-metrics-certs podName:b6892fe2-bf35-4568-aa5b-cb281e67c570 nodeName:}" failed. No retries permitted until 2026-04-22 15:10:41.39030229 +0000 UTC m=+115.250183583 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b6892fe2-bf35-4568-aa5b-cb281e67c570-metrics-certs") pod "router-default-589c889464-99f7x" (UID: "b6892fe2-bf35-4568-aa5b-cb281e67c570") : secret "router-metrics-certs-default" not found
Apr 22 15:10:40.390351 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:10:40.390327 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f963f6da-e36a-4b61-ade2-ceb2efdb3eb2-samples-operator-tls podName:f963f6da-e36a-4b61-ade2-ceb2efdb3eb2 nodeName:}" failed. No retries permitted until 2026-04-22 15:10:41.390321434 +0000 UTC m=+115.250202726 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f963f6da-e36a-4b61-ade2-ceb2efdb3eb2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-qm2z5" (UID: "f963f6da-e36a-4b61-ade2-ceb2efdb3eb2") : secret "samples-operator-tls" not found
Apr 22 15:10:40.390351 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:10:40.390339 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b6892fe2-bf35-4568-aa5b-cb281e67c570-service-ca-bundle podName:b6892fe2-bf35-4568-aa5b-cb281e67c570 nodeName:}" failed. No retries permitted until 2026-04-22 15:10:41.390333647 +0000 UTC m=+115.250214940 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b6892fe2-bf35-4568-aa5b-cb281e67c570-service-ca-bundle") pod "router-default-589c889464-99f7x" (UID: "b6892fe2-bf35-4568-aa5b-cb281e67c570") : configmap references non-existent config key: service-ca.crt
Apr 22 15:10:41.112219 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:41.112185 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f2jrk" event={"ID":"ef01d307-3481-4c5f-97e9-b2c215eab28d","Type":"ContainerStarted","Data":"666baeda161a14e0b0eaeb727b6567b245001259d879868e38f90c4a952a0564"}
Apr 22 15:10:41.398711 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:41.398624 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f963f6da-e36a-4b61-ade2-ceb2efdb3eb2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qm2z5\" (UID: \"f963f6da-e36a-4b61-ade2-ceb2efdb3eb2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qm2z5"
Apr 22 15:10:41.398866 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:41.398691 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6892fe2-bf35-4568-aa5b-cb281e67c570-service-ca-bundle\") pod \"router-default-589c889464-99f7x\" (UID: \"b6892fe2-bf35-4568-aa5b-cb281e67c570\") " pod="openshift-ingress/router-default-589c889464-99f7x"
Apr 22 15:10:41.398866 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:41.398756 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6892fe2-bf35-4568-aa5b-cb281e67c570-metrics-certs\") pod \"router-default-589c889464-99f7x\" (UID: \"b6892fe2-bf35-4568-aa5b-cb281e67c570\") " pod="openshift-ingress/router-default-589c889464-99f7x"
Apr 22 15:10:41.398866 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:10:41.398790 2559 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 15:10:41.398866 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:10:41.398853 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f963f6da-e36a-4b61-ade2-ceb2efdb3eb2-samples-operator-tls podName:f963f6da-e36a-4b61-ade2-ceb2efdb3eb2 nodeName:}" failed. No retries permitted until 2026-04-22 15:10:43.398837245 +0000 UTC m=+117.258718548 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f963f6da-e36a-4b61-ade2-ceb2efdb3eb2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-qm2z5" (UID: "f963f6da-e36a-4b61-ade2-ceb2efdb3eb2") : secret "samples-operator-tls" not found
Apr 22 15:10:41.398866 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:10:41.398867 2559 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 15:10:41.399125 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:10:41.398870 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b6892fe2-bf35-4568-aa5b-cb281e67c570-service-ca-bundle podName:b6892fe2-bf35-4568-aa5b-cb281e67c570 nodeName:}" failed. No retries permitted until 2026-04-22 15:10:43.398861419 +0000 UTC m=+117.258742711 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b6892fe2-bf35-4568-aa5b-cb281e67c570-service-ca-bundle") pod "router-default-589c889464-99f7x" (UID: "b6892fe2-bf35-4568-aa5b-cb281e67c570") : configmap references non-existent config key: service-ca.crt
Apr 22 15:10:41.399125 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:10:41.398920 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6892fe2-bf35-4568-aa5b-cb281e67c570-metrics-certs podName:b6892fe2-bf35-4568-aa5b-cb281e67c570 nodeName:}" failed. No retries permitted until 2026-04-22 15:10:43.398906119 +0000 UTC m=+117.258787422 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b6892fe2-bf35-4568-aa5b-cb281e67c570-metrics-certs") pod "router-default-589c889464-99f7x" (UID: "b6892fe2-bf35-4568-aa5b-cb281e67c570") : secret "router-metrics-certs-default" not found
Apr 22 15:10:43.117136 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:43.117100 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f2jrk" event={"ID":"ef01d307-3481-4c5f-97e9-b2c215eab28d","Type":"ContainerStarted","Data":"7eb99fead64cc12377e8f3a027f136278f4735a5d71f5fb6f455079b3db68d4d"}
Apr 22 15:10:43.132608 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:43.132565 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f2jrk" podStartSLOduration=1.947940847 podStartE2EDuration="4.132551083s" podCreationTimestamp="2026-04-22 15:10:39 +0000 UTC" firstStartedPulling="2026-04-22 15:10:40.322194856 +0000 UTC m=+114.182076150" lastFinishedPulling="2026-04-22 15:10:42.506805088 +0000 UTC m=+116.366686386" observedRunningTime="2026-04-22 15:10:43.131685954 +0000 UTC m=+116.991567279" watchObservedRunningTime="2026-04-22 15:10:43.132551083 +0000 UTC
m=+116.992432398"
Apr 22 15:10:43.412662 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:43.412586 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6892fe2-bf35-4568-aa5b-cb281e67c570-metrics-certs\") pod \"router-default-589c889464-99f7x\" (UID: \"b6892fe2-bf35-4568-aa5b-cb281e67c570\") " pod="openshift-ingress/router-default-589c889464-99f7x"
Apr 22 15:10:43.412662 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:43.412642 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f963f6da-e36a-4b61-ade2-ceb2efdb3eb2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qm2z5\" (UID: \"f963f6da-e36a-4b61-ade2-ceb2efdb3eb2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qm2z5"
Apr 22 15:10:43.412873 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:43.412667 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6892fe2-bf35-4568-aa5b-cb281e67c570-service-ca-bundle\") pod \"router-default-589c889464-99f7x\" (UID: \"b6892fe2-bf35-4568-aa5b-cb281e67c570\") " pod="openshift-ingress/router-default-589c889464-99f7x"
Apr 22 15:10:43.412873 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:10:43.412719 2559 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 15:10:43.412873 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:10:43.412766 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b6892fe2-bf35-4568-aa5b-cb281e67c570-service-ca-bundle podName:b6892fe2-bf35-4568-aa5b-cb281e67c570 nodeName:}" failed. No retries permitted until 2026-04-22 15:10:47.412754191 +0000 UTC m=+121.272635483 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b6892fe2-bf35-4568-aa5b-cb281e67c570-service-ca-bundle") pod "router-default-589c889464-99f7x" (UID: "b6892fe2-bf35-4568-aa5b-cb281e67c570") : configmap references non-existent config key: service-ca.crt
Apr 22 15:10:43.412873 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:10:43.412782 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6892fe2-bf35-4568-aa5b-cb281e67c570-metrics-certs podName:b6892fe2-bf35-4568-aa5b-cb281e67c570 nodeName:}" failed. No retries permitted until 2026-04-22 15:10:47.412775608 +0000 UTC m=+121.272656900 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b6892fe2-bf35-4568-aa5b-cb281e67c570-metrics-certs") pod "router-default-589c889464-99f7x" (UID: "b6892fe2-bf35-4568-aa5b-cb281e67c570") : secret "router-metrics-certs-default" not found
Apr 22 15:10:43.412873 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:10:43.412804 2559 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 15:10:43.412873 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:10:43.412863 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f963f6da-e36a-4b61-ade2-ceb2efdb3eb2-samples-operator-tls podName:f963f6da-e36a-4b61-ade2-ceb2efdb3eb2 nodeName:}" failed. No retries permitted until 2026-04-22 15:10:47.412845086 +0000 UTC m=+121.272726393 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f963f6da-e36a-4b61-ade2-ceb2efdb3eb2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-qm2z5" (UID: "f963f6da-e36a-4b61-ade2-ceb2efdb3eb2") : secret "samples-operator-tls" not found
Apr 22 15:10:45.893212 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:45.893181 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-fj94h"]
Apr 22 15:10:45.896075 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:45.896060 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-fj94h"
Apr 22 15:10:45.898604 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:45.898581 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 22 15:10:45.898698 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:45.898582 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 22 15:10:45.899678 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:45.899661 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-2xprs\""
Apr 22 15:10:45.899757 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:45.899661 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 22 15:10:45.899757 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:45.899721 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 22 15:10:45.904374 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:45.904352 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-fj94h"]
Apr 22 15:10:46.032252 ip-10-0-141-167
kubenswrapper[2559]: I0422 15:10:46.032204 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsnwz\" (UniqueName: \"kubernetes.io/projected/9d6755d5-69da-41e1-bd9e-6404d0dd4d8f-kube-api-access-vsnwz\") pod \"service-ca-865cb79987-fj94h\" (UID: \"9d6755d5-69da-41e1-bd9e-6404d0dd4d8f\") " pod="openshift-service-ca/service-ca-865cb79987-fj94h"
Apr 22 15:10:46.032252 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:46.032257 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9d6755d5-69da-41e1-bd9e-6404d0dd4d8f-signing-key\") pod \"service-ca-865cb79987-fj94h\" (UID: \"9d6755d5-69da-41e1-bd9e-6404d0dd4d8f\") " pod="openshift-service-ca/service-ca-865cb79987-fj94h"
Apr 22 15:10:46.032434 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:46.032274 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9d6755d5-69da-41e1-bd9e-6404d0dd4d8f-signing-cabundle\") pod \"service-ca-865cb79987-fj94h\" (UID: \"9d6755d5-69da-41e1-bd9e-6404d0dd4d8f\") " pod="openshift-service-ca/service-ca-865cb79987-fj94h"
Apr 22 15:10:46.119655 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:46.119624 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-ngnjp"]
Apr 22 15:10:46.122412 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:46.122394 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ngnjp"
Apr 22 15:10:46.125196 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:46.125180 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 22 15:10:46.125314 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:46.125297 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 22 15:10:46.125547 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:46.125533 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-7kr6j\""
Apr 22 15:10:46.133113 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:46.133096 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vsnwz\" (UniqueName: \"kubernetes.io/projected/9d6755d5-69da-41e1-bd9e-6404d0dd4d8f-kube-api-access-vsnwz\") pod \"service-ca-865cb79987-fj94h\" (UID: \"9d6755d5-69da-41e1-bd9e-6404d0dd4d8f\") " pod="openshift-service-ca/service-ca-865cb79987-fj94h"
Apr 22 15:10:46.133199 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:46.133124 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9d6755d5-69da-41e1-bd9e-6404d0dd4d8f-signing-key\") pod \"service-ca-865cb79987-fj94h\" (UID: \"9d6755d5-69da-41e1-bd9e-6404d0dd4d8f\") " pod="openshift-service-ca/service-ca-865cb79987-fj94h"
Apr 22 15:10:46.133199 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:46.133141 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9d6755d5-69da-41e1-bd9e-6404d0dd4d8f-signing-cabundle\") pod \"service-ca-865cb79987-fj94h\" (UID: \"9d6755d5-69da-41e1-bd9e-6404d0dd4d8f\") " pod="openshift-service-ca/service-ca-865cb79987-fj94h"
Apr 22 15:10:46.133673 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:46.133657 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9d6755d5-69da-41e1-bd9e-6404d0dd4d8f-signing-cabundle\") pod \"service-ca-865cb79987-fj94h\" (UID: \"9d6755d5-69da-41e1-bd9e-6404d0dd4d8f\") " pod="openshift-service-ca/service-ca-865cb79987-fj94h"
Apr 22 15:10:46.135758 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:46.135741 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9d6755d5-69da-41e1-bd9e-6404d0dd4d8f-signing-key\") pod \"service-ca-865cb79987-fj94h\" (UID: \"9d6755d5-69da-41e1-bd9e-6404d0dd4d8f\") " pod="openshift-service-ca/service-ca-865cb79987-fj94h"
Apr 22 15:10:46.135924 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:46.135904 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-ngnjp"]
Apr 22 15:10:46.145910 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:46.145863 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsnwz\" (UniqueName: \"kubernetes.io/projected/9d6755d5-69da-41e1-bd9e-6404d0dd4d8f-kube-api-access-vsnwz\") pod \"service-ca-865cb79987-fj94h\" (UID: \"9d6755d5-69da-41e1-bd9e-6404d0dd4d8f\") " pod="openshift-service-ca/service-ca-865cb79987-fj94h"
Apr 22 15:10:46.205574 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:46.205541 2559 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-fj94h" Apr 22 15:10:46.234137 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:46.234112 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77dnp\" (UniqueName: \"kubernetes.io/projected/e9b83bb0-5336-43e1-8a72-f561808f0834-kube-api-access-77dnp\") pod \"migrator-74bb7799d9-ngnjp\" (UID: \"e9b83bb0-5336-43e1-8a72-f561808f0834\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ngnjp" Apr 22 15:10:46.315411 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:46.315378 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-fj94h"] Apr 22 15:10:46.319404 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:10:46.319375 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d6755d5_69da_41e1_bd9e_6404d0dd4d8f.slice/crio-3269abe0bea3eb0ef749e48778d8b56797cc4432f6e62a2646144ac6db346b20 WatchSource:0}: Error finding container 3269abe0bea3eb0ef749e48778d8b56797cc4432f6e62a2646144ac6db346b20: Status 404 returned error can't find the container with id 3269abe0bea3eb0ef749e48778d8b56797cc4432f6e62a2646144ac6db346b20 Apr 22 15:10:46.335036 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:46.335016 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-77dnp\" (UniqueName: \"kubernetes.io/projected/e9b83bb0-5336-43e1-8a72-f561808f0834-kube-api-access-77dnp\") pod \"migrator-74bb7799d9-ngnjp\" (UID: \"e9b83bb0-5336-43e1-8a72-f561808f0834\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ngnjp" Apr 22 15:10:46.342013 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:46.341992 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-77dnp\" (UniqueName: 
\"kubernetes.io/projected/e9b83bb0-5336-43e1-8a72-f561808f0834-kube-api-access-77dnp\") pod \"migrator-74bb7799d9-ngnjp\" (UID: \"e9b83bb0-5336-43e1-8a72-f561808f0834\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ngnjp" Apr 22 15:10:46.431625 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:46.431545 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ngnjp" Apr 22 15:10:46.542119 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:46.542086 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-ngnjp"] Apr 22 15:10:46.544661 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:10:46.544633 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9b83bb0_5336_43e1_8a72_f561808f0834.slice/crio-2c9a88589c25fecf573ef3fb37d5ef9c429f627066cf87944f4be52295413960 WatchSource:0}: Error finding container 2c9a88589c25fecf573ef3fb37d5ef9c429f627066cf87944f4be52295413960: Status 404 returned error can't find the container with id 2c9a88589c25fecf573ef3fb37d5ef9c429f627066cf87944f4be52295413960 Apr 22 15:10:46.980714 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:46.980682 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-p6k5v_d54da891-e22c-4f12-9dfb-e2ada408f1ea/dns-node-resolver/0.log" Apr 22 15:10:47.125960 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:47.125909 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ngnjp" event={"ID":"e9b83bb0-5336-43e1-8a72-f561808f0834","Type":"ContainerStarted","Data":"2c9a88589c25fecf573ef3fb37d5ef9c429f627066cf87944f4be52295413960"} Apr 22 15:10:47.128967 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:47.128933 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca/service-ca-865cb79987-fj94h" event={"ID":"9d6755d5-69da-41e1-bd9e-6404d0dd4d8f","Type":"ContainerStarted","Data":"f8424bd10685a56f8bb25ed0d1e99d0c2e7debacd5636ad7940304f0e9844f70"} Apr 22 15:10:47.129107 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:47.128973 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-fj94h" event={"ID":"9d6755d5-69da-41e1-bd9e-6404d0dd4d8f","Type":"ContainerStarted","Data":"3269abe0bea3eb0ef749e48778d8b56797cc4432f6e62a2646144ac6db346b20"} Apr 22 15:10:47.167275 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:47.167230 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-fj94h" podStartSLOduration=2.167215852 podStartE2EDuration="2.167215852s" podCreationTimestamp="2026-04-22 15:10:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:10:47.167014573 +0000 UTC m=+121.026895887" watchObservedRunningTime="2026-04-22 15:10:47.167215852 +0000 UTC m=+121.027097144" Apr 22 15:10:47.447003 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:47.446069 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6892fe2-bf35-4568-aa5b-cb281e67c570-metrics-certs\") pod \"router-default-589c889464-99f7x\" (UID: \"b6892fe2-bf35-4568-aa5b-cb281e67c570\") " pod="openshift-ingress/router-default-589c889464-99f7x" Apr 22 15:10:47.447003 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:47.446161 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f963f6da-e36a-4b61-ade2-ceb2efdb3eb2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qm2z5\" (UID: \"f963f6da-e36a-4b61-ade2-ceb2efdb3eb2\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qm2z5" Apr 22 15:10:47.447003 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:47.446210 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6892fe2-bf35-4568-aa5b-cb281e67c570-service-ca-bundle\") pod \"router-default-589c889464-99f7x\" (UID: \"b6892fe2-bf35-4568-aa5b-cb281e67c570\") " pod="openshift-ingress/router-default-589c889464-99f7x" Apr 22 15:10:47.447003 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:10:47.446377 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b6892fe2-bf35-4568-aa5b-cb281e67c570-service-ca-bundle podName:b6892fe2-bf35-4568-aa5b-cb281e67c570 nodeName:}" failed. No retries permitted until 2026-04-22 15:10:55.446357917 +0000 UTC m=+129.306239235 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b6892fe2-bf35-4568-aa5b-cb281e67c570-service-ca-bundle") pod "router-default-589c889464-99f7x" (UID: "b6892fe2-bf35-4568-aa5b-cb281e67c570") : configmap references non-existent config key: service-ca.crt Apr 22 15:10:47.447003 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:10:47.446830 2559 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 15:10:47.447003 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:10:47.446882 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6892fe2-bf35-4568-aa5b-cb281e67c570-metrics-certs podName:b6892fe2-bf35-4568-aa5b-cb281e67c570 nodeName:}" failed. No retries permitted until 2026-04-22 15:10:55.446865236 +0000 UTC m=+129.306746537 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b6892fe2-bf35-4568-aa5b-cb281e67c570-metrics-certs") pod "router-default-589c889464-99f7x" (UID: "b6892fe2-bf35-4568-aa5b-cb281e67c570") : secret "router-metrics-certs-default" not found Apr 22 15:10:47.447003 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:10:47.446936 2559 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 15:10:47.447003 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:10:47.446972 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f963f6da-e36a-4b61-ade2-ceb2efdb3eb2-samples-operator-tls podName:f963f6da-e36a-4b61-ade2-ceb2efdb3eb2 nodeName:}" failed. No retries permitted until 2026-04-22 15:10:55.446959695 +0000 UTC m=+129.306840993 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f963f6da-e36a-4b61-ade2-ceb2efdb3eb2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-qm2z5" (UID: "f963f6da-e36a-4b61-ade2-ceb2efdb3eb2") : secret "samples-operator-tls" not found Apr 22 15:10:47.789951 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:47.789804 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-tqshj_69f41c04-4f2b-476d-aac0-a60437dfe0c5/node-ca/0.log" Apr 22 15:10:48.132292 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:48.132252 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ngnjp" event={"ID":"e9b83bb0-5336-43e1-8a72-f561808f0834","Type":"ContainerStarted","Data":"c8f8f98fbdefe05a86fd8d4a545d20ab48a4cd17c657e9dc521ca350982b1ef3"} Apr 22 15:10:48.132292 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:48.132295 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ngnjp" event={"ID":"e9b83bb0-5336-43e1-8a72-f561808f0834","Type":"ContainerStarted","Data":"cd63fdc03062bf6a3366a46289d29e022af623551d2eb0a6f3118eed26c26d55"} Apr 22 15:10:48.151173 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:48.151130 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ngnjp" podStartSLOduration=0.772659242 podStartE2EDuration="2.151115724s" podCreationTimestamp="2026-04-22 15:10:46 +0000 UTC" firstStartedPulling="2026-04-22 15:10:46.546336111 +0000 UTC m=+120.406217407" lastFinishedPulling="2026-04-22 15:10:47.92479259 +0000 UTC m=+121.784673889" observedRunningTime="2026-04-22 15:10:48.150126483 +0000 UTC m=+122.010007811" watchObservedRunningTime="2026-04-22 15:10:48.151115724 +0000 UTC m=+122.010997039" Apr 22 15:10:55.506865 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:55.506825 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5671169-29ab-4157-acb0-207c8e83501a-metrics-certs\") pod \"network-metrics-daemon-8hzhw\" (UID: \"a5671169-29ab-4157-acb0-207c8e83501a\") " pod="openshift-multus/network-metrics-daemon-8hzhw" Apr 22 15:10:55.507397 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:55.506875 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f963f6da-e36a-4b61-ade2-ceb2efdb3eb2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qm2z5\" (UID: \"f963f6da-e36a-4b61-ade2-ceb2efdb3eb2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qm2z5" Apr 22 15:10:55.507397 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:55.506920 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/b6892fe2-bf35-4568-aa5b-cb281e67c570-service-ca-bundle\") pod \"router-default-589c889464-99f7x\" (UID: \"b6892fe2-bf35-4568-aa5b-cb281e67c570\") " pod="openshift-ingress/router-default-589c889464-99f7x" Apr 22 15:10:55.507397 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:55.506981 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6892fe2-bf35-4568-aa5b-cb281e67c570-metrics-certs\") pod \"router-default-589c889464-99f7x\" (UID: \"b6892fe2-bf35-4568-aa5b-cb281e67c570\") " pod="openshift-ingress/router-default-589c889464-99f7x" Apr 22 15:10:55.507397 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:10:55.506985 2559 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 15:10:55.507397 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:10:55.507053 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5671169-29ab-4157-acb0-207c8e83501a-metrics-certs podName:a5671169-29ab-4157-acb0-207c8e83501a nodeName:}" failed. No retries permitted until 2026-04-22 15:12:57.507034085 +0000 UTC m=+251.366915392 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a5671169-29ab-4157-acb0-207c8e83501a-metrics-certs") pod "network-metrics-daemon-8hzhw" (UID: "a5671169-29ab-4157-acb0-207c8e83501a") : secret "metrics-daemon-secret" not found Apr 22 15:10:55.507696 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:55.507672 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6892fe2-bf35-4568-aa5b-cb281e67c570-service-ca-bundle\") pod \"router-default-589c889464-99f7x\" (UID: \"b6892fe2-bf35-4568-aa5b-cb281e67c570\") " pod="openshift-ingress/router-default-589c889464-99f7x" Apr 22 15:10:55.509262 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:55.509240 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f963f6da-e36a-4b61-ade2-ceb2efdb3eb2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qm2z5\" (UID: \"f963f6da-e36a-4b61-ade2-ceb2efdb3eb2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qm2z5" Apr 22 15:10:55.509362 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:55.509269 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b6892fe2-bf35-4568-aa5b-cb281e67c570-metrics-certs\") pod \"router-default-589c889464-99f7x\" (UID: \"b6892fe2-bf35-4568-aa5b-cb281e67c570\") " pod="openshift-ingress/router-default-589c889464-99f7x" Apr 22 15:10:55.582827 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:55.582795 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-nhp45\"" Apr 22 15:10:55.588066 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:55.588044 2559 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ingress\"/\"router-dockercfg-w28zp\"" Apr 22 15:10:55.591030 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:55.591013 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qm2z5" Apr 22 15:10:55.596751 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:55.596729 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-589c889464-99f7x" Apr 22 15:10:55.739854 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:55.739824 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qm2z5"] Apr 22 15:10:55.768605 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:55.768579 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-589c889464-99f7x"] Apr 22 15:10:55.771587 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:10:55.771557 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6892fe2_bf35_4568_aa5b_cb281e67c570.slice/crio-8a73bee6ca6900fb5f04d7917bd123ea49500bc363e7fcd9d12c3f44295799e7 WatchSource:0}: Error finding container 8a73bee6ca6900fb5f04d7917bd123ea49500bc363e7fcd9d12c3f44295799e7: Status 404 returned error can't find the container with id 8a73bee6ca6900fb5f04d7917bd123ea49500bc363e7fcd9d12c3f44295799e7 Apr 22 15:10:56.148774 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:56.148677 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qm2z5" event={"ID":"f963f6da-e36a-4b61-ade2-ceb2efdb3eb2","Type":"ContainerStarted","Data":"c2422eef0c64f9a7c4d3b0beea72e3b5089f387eb5b997b376eaf3c0148bb12d"} Apr 22 15:10:56.149822 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:56.149801 2559 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-ingress/router-default-589c889464-99f7x" event={"ID":"b6892fe2-bf35-4568-aa5b-cb281e67c570","Type":"ContainerStarted","Data":"7c6cbb32269e75a894155e715acaf6fda23ea43a14d84264eb08b6bab1791be3"} Apr 22 15:10:56.149934 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:56.149826 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-589c889464-99f7x" event={"ID":"b6892fe2-bf35-4568-aa5b-cb281e67c570","Type":"ContainerStarted","Data":"8a73bee6ca6900fb5f04d7917bd123ea49500bc363e7fcd9d12c3f44295799e7"} Apr 22 15:10:56.170315 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:56.170269 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-589c889464-99f7x" podStartSLOduration=17.170251637 podStartE2EDuration="17.170251637s" podCreationTimestamp="2026-04-22 15:10:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:10:56.169286117 +0000 UTC m=+130.029167443" watchObservedRunningTime="2026-04-22 15:10:56.170251637 +0000 UTC m=+130.030132952" Apr 22 15:10:56.596963 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:56.596920 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-589c889464-99f7x" Apr 22 15:10:56.599929 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:56.599900 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-589c889464-99f7x" Apr 22 15:10:57.153655 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:57.153597 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-589c889464-99f7x" Apr 22 15:10:57.154856 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:57.154830 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ingress/router-default-589c889464-99f7x" Apr 22 15:10:58.156874 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:58.156835 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qm2z5" event={"ID":"f963f6da-e36a-4b61-ade2-ceb2efdb3eb2","Type":"ContainerStarted","Data":"f48fc854dd0e727ce7d675b68ec849ddecf47ff0231edd663c87a88d0cc2fff2"} Apr 22 15:10:58.157288 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:58.156879 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qm2z5" event={"ID":"f963f6da-e36a-4b61-ade2-ceb2efdb3eb2","Type":"ContainerStarted","Data":"2ac4ae25a655a7cfb87800d027ebb4f5b379a26b29b797457e6e0bb1b004d5a1"} Apr 22 15:10:58.173779 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:10:58.173736 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qm2z5" podStartSLOduration=17.355755649 podStartE2EDuration="19.173723485s" podCreationTimestamp="2026-04-22 15:10:39 +0000 UTC" firstStartedPulling="2026-04-22 15:10:55.781057027 +0000 UTC m=+129.640938320" lastFinishedPulling="2026-04-22 15:10:57.599024861 +0000 UTC m=+131.458906156" observedRunningTime="2026-04-22 15:10:58.173151199 +0000 UTC m=+132.033032514" watchObservedRunningTime="2026-04-22 15:10:58.173723485 +0000 UTC m=+132.033604799" Apr 22 15:11:21.515504 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:11:21.515455 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-cmsst" podUID="57d3cf97-71a9-44b9-80ac-439b558c8604" Apr 22 15:11:21.525613 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:11:21.525591 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], 
unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-ffzqh" podUID="b464f475-8761-403b-bc74-03f05e7d52c5" Apr 22 15:11:21.724387 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:11:21.724340 2559 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-8hzhw" podUID="a5671169-29ab-4157-acb0-207c8e83501a" Apr 22 15:11:22.211927 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:11:22.211899 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cmsst" Apr 22 15:11:26.426969 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:11:26.426935 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b464f475-8761-403b-bc74-03f05e7d52c5-cert\") pod \"ingress-canary-ffzqh\" (UID: \"b464f475-8761-403b-bc74-03f05e7d52c5\") " pod="openshift-ingress-canary/ingress-canary-ffzqh" Apr 22 15:11:26.426969 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:11:26.426972 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/57d3cf97-71a9-44b9-80ac-439b558c8604-metrics-tls\") pod \"dns-default-cmsst\" (UID: \"57d3cf97-71a9-44b9-80ac-439b558c8604\") " pod="openshift-dns/dns-default-cmsst" Apr 22 15:11:26.429318 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:11:26.429299 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/57d3cf97-71a9-44b9-80ac-439b558c8604-metrics-tls\") pod \"dns-default-cmsst\" (UID: \"57d3cf97-71a9-44b9-80ac-439b558c8604\") " pod="openshift-dns/dns-default-cmsst" Apr 22 15:11:26.429453 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:11:26.429433 2559 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b464f475-8761-403b-bc74-03f05e7d52c5-cert\") pod \"ingress-canary-ffzqh\" (UID: \"b464f475-8761-403b-bc74-03f05e7d52c5\") " pod="openshift-ingress-canary/ingress-canary-ffzqh" Apr 22 15:11:26.715832 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:11:26.715806 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fkblf\"" Apr 22 15:11:26.723913 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:11:26.723895 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cmsst" Apr 22 15:11:26.835479 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:11:26.835447 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cmsst"] Apr 22 15:11:26.838577 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:11:26.838550 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57d3cf97_71a9_44b9_80ac_439b558c8604.slice/crio-3205c8787b356af90f3b590695ce2033c077216a622070f925684a88b8d4a3a8 WatchSource:0}: Error finding container 3205c8787b356af90f3b590695ce2033c077216a622070f925684a88b8d4a3a8: Status 404 returned error can't find the container with id 3205c8787b356af90f3b590695ce2033c077216a622070f925684a88b8d4a3a8 Apr 22 15:11:27.224123 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:11:27.224089 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cmsst" event={"ID":"57d3cf97-71a9-44b9-80ac-439b558c8604","Type":"ContainerStarted","Data":"3205c8787b356af90f3b590695ce2033c077216a622070f925684a88b8d4a3a8"} Apr 22 15:11:29.229670 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:11:29.229630 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cmsst" 
event={"ID":"57d3cf97-71a9-44b9-80ac-439b558c8604","Type":"ContainerStarted","Data":"b876a351d8e64584004b5c8de424cbd39f075579206d1633220e2cbfc624323b"} Apr 22 15:11:29.229670 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:11:29.229667 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cmsst" event={"ID":"57d3cf97-71a9-44b9-80ac-439b558c8604","Type":"ContainerStarted","Data":"440f9cd868c94276fe89ed064def11f717b3d036d0b971ea127e34de182370cd"} Apr 22 15:11:29.230068 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:11:29.229758 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-cmsst" Apr 22 15:11:29.247568 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:11:29.247525 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-cmsst" podStartSLOduration=129.926308145 podStartE2EDuration="2m11.247512789s" podCreationTimestamp="2026-04-22 15:09:18 +0000 UTC" firstStartedPulling="2026-04-22 15:11:26.840198178 +0000 UTC m=+160.700079471" lastFinishedPulling="2026-04-22 15:11:28.161402819 +0000 UTC m=+162.021284115" observedRunningTime="2026-04-22 15:11:29.246340552 +0000 UTC m=+163.106221878" watchObservedRunningTime="2026-04-22 15:11:29.247512789 +0000 UTC m=+163.107394103" Apr 22 15:11:33.708633 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:11:33.708549 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8hzhw" Apr 22 15:11:34.708408 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:11:34.708368 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ffzqh" Apr 22 15:11:34.711058 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:11:34.711037 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-cbqpc\"" Apr 22 15:11:34.719309 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:11:34.719291 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ffzqh" Apr 22 15:11:34.829160 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:11:34.829129 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ffzqh"] Apr 22 15:11:34.831853 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:11:34.831824 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb464f475_8761_403b_bc74_03f05e7d52c5.slice/crio-b30af71db56025caa98edf5becdaa28ec8ee2901dc5c3386fea8465db99dde41 WatchSource:0}: Error finding container b30af71db56025caa98edf5becdaa28ec8ee2901dc5c3386fea8465db99dde41: Status 404 returned error can't find the container with id b30af71db56025caa98edf5becdaa28ec8ee2901dc5c3386fea8465db99dde41 Apr 22 15:11:35.247637 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:11:35.247575 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ffzqh" event={"ID":"b464f475-8761-403b-bc74-03f05e7d52c5","Type":"ContainerStarted","Data":"b30af71db56025caa98edf5becdaa28ec8ee2901dc5c3386fea8465db99dde41"} Apr 22 15:11:37.253927 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:11:37.253891 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ffzqh" event={"ID":"b464f475-8761-403b-bc74-03f05e7d52c5","Type":"ContainerStarted","Data":"350222a29940122cf7e9b56bb1e82ed3437ade7b52f54f36acbd09f4d52928c9"} Apr 22 15:11:39.234336 ip-10-0-141-167 kubenswrapper[2559]: 
I0422 15:11:39.234303 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-cmsst" Apr 22 15:11:39.256257 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:11:39.256211 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-ffzqh" podStartSLOduration=139.826454626 podStartE2EDuration="2m21.256197646s" podCreationTimestamp="2026-04-22 15:09:18 +0000 UTC" firstStartedPulling="2026-04-22 15:11:34.834099819 +0000 UTC m=+168.693981111" lastFinishedPulling="2026-04-22 15:11:36.263842836 +0000 UTC m=+170.123724131" observedRunningTime="2026-04-22 15:11:37.280183175 +0000 UTC m=+171.140064492" watchObservedRunningTime="2026-04-22 15:11:39.256197646 +0000 UTC m=+173.116078961" Apr 22 15:11:58.305656 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:11:58.305616 2559 generic.go:358] "Generic (PLEG): container finished" podID="ef01d307-3481-4c5f-97e9-b2c215eab28d" containerID="7eb99fead64cc12377e8f3a027f136278f4735a5d71f5fb6f455079b3db68d4d" exitCode=0 Apr 22 15:11:58.306057 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:11:58.305684 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f2jrk" event={"ID":"ef01d307-3481-4c5f-97e9-b2c215eab28d","Type":"ContainerDied","Data":"7eb99fead64cc12377e8f3a027f136278f4735a5d71f5fb6f455079b3db68d4d"} Apr 22 15:11:58.306057 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:11:58.305958 2559 scope.go:117] "RemoveContainer" containerID="7eb99fead64cc12377e8f3a027f136278f4735a5d71f5fb6f455079b3db68d4d" Apr 22 15:11:59.309717 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:11:59.309682 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-f2jrk" event={"ID":"ef01d307-3481-4c5f-97e9-b2c215eab28d","Type":"ContainerStarted","Data":"f22f705abaddaaa51338380d86bdee44f4db91e17223e382aa4d6d20970c1f0d"} Apr 22 
15:12:13.351253 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:12:13.351217 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-cmsst_57d3cf97-71a9-44b9-80ac-439b558c8604/dns/0.log" Apr 22 15:12:13.551466 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:12:13.551440 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-cmsst_57d3cf97-71a9-44b9-80ac-439b558c8604/kube-rbac-proxy/0.log" Apr 22 15:12:14.550974 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:12:14.550949 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-p6k5v_d54da891-e22c-4f12-9dfb-e2ada408f1ea/dns-node-resolver/0.log" Apr 22 15:12:15.751091 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:12:15.751061 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-tqshj_69f41c04-4f2b-476d-aac0-a60437dfe0c5/node-ca/0.log" Apr 22 15:12:16.151885 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:12:16.151764 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-589c889464-99f7x_b6892fe2-bf35-4568-aa5b-cb281e67c570/router/0.log" Apr 22 15:12:16.551672 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:12:16.551646 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-ffzqh_b464f475-8761-403b-bc74-03f05e7d52c5/serve-healthcheck-canary/0.log" Apr 22 15:12:16.951500 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:12:16.951457 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-ngnjp_e9b83bb0-5336-43e1-8a72-f561808f0834/migrator/0.log" Apr 22 15:12:17.151554 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:12:17.151523 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-ngnjp_e9b83bb0-5336-43e1-8a72-f561808f0834/graceful-termination/0.log" Apr 22 
15:12:57.538743 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:12:57.538706 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5671169-29ab-4157-acb0-207c8e83501a-metrics-certs\") pod \"network-metrics-daemon-8hzhw\" (UID: \"a5671169-29ab-4157-acb0-207c8e83501a\") " pod="openshift-multus/network-metrics-daemon-8hzhw" Apr 22 15:12:57.541020 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:12:57.540996 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5671169-29ab-4157-acb0-207c8e83501a-metrics-certs\") pod \"network-metrics-daemon-8hzhw\" (UID: \"a5671169-29ab-4157-acb0-207c8e83501a\") " pod="openshift-multus/network-metrics-daemon-8hzhw" Apr 22 15:12:57.711617 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:12:57.711582 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-62z2n\"" Apr 22 15:12:57.719474 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:12:57.719453 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8hzhw" Apr 22 15:12:57.826711 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:12:57.826636 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8hzhw"] Apr 22 15:12:57.829437 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:12:57.829411 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5671169_29ab_4157_acb0_207c8e83501a.slice/crio-5deca0ce06bc77df910ed0e6b8820d7413f7480c439d93202a6ae0aff135bd0c WatchSource:0}: Error finding container 5deca0ce06bc77df910ed0e6b8820d7413f7480c439d93202a6ae0aff135bd0c: Status 404 returned error can't find the container with id 5deca0ce06bc77df910ed0e6b8820d7413f7480c439d93202a6ae0aff135bd0c Apr 22 15:12:58.453619 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:12:58.453577 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8hzhw" event={"ID":"a5671169-29ab-4157-acb0-207c8e83501a","Type":"ContainerStarted","Data":"5deca0ce06bc77df910ed0e6b8820d7413f7480c439d93202a6ae0aff135bd0c"} Apr 22 15:12:59.457285 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:12:59.457247 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8hzhw" event={"ID":"a5671169-29ab-4157-acb0-207c8e83501a","Type":"ContainerStarted","Data":"bd0d73256d5b74b312c5cc63328d9ef22f52a22d3f1ed896774bf6ae82712de1"} Apr 22 15:12:59.457285 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:12:59.457286 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8hzhw" event={"ID":"a5671169-29ab-4157-acb0-207c8e83501a","Type":"ContainerStarted","Data":"cb0607220a9400536ebacba06fe86e92b970c1576fd71f03fca3f59789dfd9ff"} Apr 22 15:12:59.473673 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:12:59.473630 2559 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-8hzhw" podStartSLOduration=251.502414505 podStartE2EDuration="4m12.473617852s" podCreationTimestamp="2026-04-22 15:08:47 +0000 UTC" firstStartedPulling="2026-04-22 15:12:57.831255805 +0000 UTC m=+251.691137106" lastFinishedPulling="2026-04-22 15:12:58.802459161 +0000 UTC m=+252.662340453" observedRunningTime="2026-04-22 15:12:59.471953875 +0000 UTC m=+253.331835190" watchObservedRunningTime="2026-04-22 15:12:59.473617852 +0000 UTC m=+253.333499167" Apr 22 15:13:46.598998 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:13:46.598966 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4bfdn_34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee/ovn-acl-logging/0.log" Apr 22 15:13:46.599584 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:13:46.599363 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4bfdn_34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee/ovn-acl-logging/0.log" Apr 22 15:16:58.763439 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:16:58.763402 2559 projected.go:289] Couldn't get configMap openshift-cluster-samples-operator/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Apr 22 15:16:58.763439 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:16:58.763436 2559 projected.go:289] Couldn't get configMap openshift-cluster-samples-operator/openshift-service-ca.crt: configmap "openshift-service-ca.crt" not found Apr 22 15:16:58.763439 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:16:58.763447 2559 projected.go:194] Error preparing data for projected volume kube-api-access-86nvs for pod openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qm2z5: [configmap "kube-root-ca.crt" not found, configmap "openshift-service-ca.crt" not found] Apr 22 15:16:58.763992 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:16:58.763520 2559 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/f963f6da-e36a-4b61-ade2-ceb2efdb3eb2-kube-api-access-86nvs podName:f963f6da-e36a-4b61-ade2-ceb2efdb3eb2 nodeName:}" failed. No retries permitted until 2026-04-22 15:16:59.263500841 +0000 UTC m=+493.123382149 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-86nvs" (UniqueName: "kubernetes.io/projected/f963f6da-e36a-4b61-ade2-ceb2efdb3eb2-kube-api-access-86nvs") pod "cluster-samples-operator-6dc5bdb6b4-qm2z5" (UID: "f963f6da-e36a-4b61-ade2-ceb2efdb3eb2") : [configmap "kube-root-ca.crt" not found, configmap "openshift-service-ca.crt" not found] Apr 22 15:16:59.267319 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:16:59.267280 2559 projected.go:289] Couldn't get configMap openshift-cluster-samples-operator/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Apr 22 15:16:59.267319 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:16:59.267315 2559 projected.go:289] Couldn't get configMap openshift-cluster-samples-operator/openshift-service-ca.crt: configmap "openshift-service-ca.crt" not found Apr 22 15:16:59.267319 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:16:59.267325 2559 projected.go:194] Error preparing data for projected volume kube-api-access-86nvs for pod openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qm2z5: [configmap "kube-root-ca.crt" not found, configmap "openshift-service-ca.crt" not found] Apr 22 15:16:59.267572 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:16:59.267374 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f963f6da-e36a-4b61-ade2-ceb2efdb3eb2-kube-api-access-86nvs podName:f963f6da-e36a-4b61-ade2-ceb2efdb3eb2 nodeName:}" failed. No retries permitted until 2026-04-22 15:17:00.267359922 +0000 UTC m=+494.127241215 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-86nvs" (UniqueName: "kubernetes.io/projected/f963f6da-e36a-4b61-ade2-ceb2efdb3eb2-kube-api-access-86nvs") pod "cluster-samples-operator-6dc5bdb6b4-qm2z5" (UID: "f963f6da-e36a-4b61-ade2-ceb2efdb3eb2") : [configmap "kube-root-ca.crt" not found, configmap "openshift-service-ca.crt" not found] Apr 22 15:17:00.273726 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:17:00.273691 2559 projected.go:289] Couldn't get configMap openshift-cluster-samples-operator/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Apr 22 15:17:00.273726 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:17:00.273719 2559 projected.go:289] Couldn't get configMap openshift-cluster-samples-operator/openshift-service-ca.crt: configmap "openshift-service-ca.crt" not found Apr 22 15:17:00.273726 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:17:00.273729 2559 projected.go:194] Error preparing data for projected volume kube-api-access-86nvs for pod openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qm2z5: [configmap "kube-root-ca.crt" not found, configmap "openshift-service-ca.crt" not found] Apr 22 15:17:00.274152 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:17:00.273776 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f963f6da-e36a-4b61-ade2-ceb2efdb3eb2-kube-api-access-86nvs podName:f963f6da-e36a-4b61-ade2-ceb2efdb3eb2 nodeName:}" failed. No retries permitted until 2026-04-22 15:17:02.273761949 +0000 UTC m=+496.133643242 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-86nvs" (UniqueName: "kubernetes.io/projected/f963f6da-e36a-4b61-ade2-ceb2efdb3eb2-kube-api-access-86nvs") pod "cluster-samples-operator-6dc5bdb6b4-qm2z5" (UID: "f963f6da-e36a-4b61-ade2-ceb2efdb3eb2") : [configmap "kube-root-ca.crt" not found, configmap "openshift-service-ca.crt" not found] Apr 22 15:17:02.287677 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:17:02.287628 2559 projected.go:289] Couldn't get configMap openshift-cluster-samples-operator/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Apr 22 15:17:02.287677 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:17:02.287672 2559 projected.go:289] Couldn't get configMap openshift-cluster-samples-operator/openshift-service-ca.crt: configmap "openshift-service-ca.crt" not found Apr 22 15:17:02.287677 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:17:02.287683 2559 projected.go:194] Error preparing data for projected volume kube-api-access-86nvs for pod openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qm2z5: [configmap "kube-root-ca.crt" not found, configmap "openshift-service-ca.crt" not found] Apr 22 15:17:02.288142 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:17:02.287732 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f963f6da-e36a-4b61-ade2-ceb2efdb3eb2-kube-api-access-86nvs podName:f963f6da-e36a-4b61-ade2-ceb2efdb3eb2 nodeName:}" failed. No retries permitted until 2026-04-22 15:17:06.287719093 +0000 UTC m=+500.147600386 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-86nvs" (UniqueName: "kubernetes.io/projected/f963f6da-e36a-4b61-ade2-ceb2efdb3eb2-kube-api-access-86nvs") pod "cluster-samples-operator-6dc5bdb6b4-qm2z5" (UID: "f963f6da-e36a-4b61-ade2-ceb2efdb3eb2") : [configmap "kube-root-ca.crt" not found, configmap "openshift-service-ca.crt" not found] Apr 22 15:17:06.313033 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:17:06.312996 2559 projected.go:289] Couldn't get configMap openshift-cluster-samples-operator/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Apr 22 15:17:06.313033 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:17:06.313024 2559 projected.go:289] Couldn't get configMap openshift-cluster-samples-operator/openshift-service-ca.crt: configmap "openshift-service-ca.crt" not found Apr 22 15:17:06.313033 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:17:06.313038 2559 projected.go:194] Error preparing data for projected volume kube-api-access-86nvs for pod openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qm2z5: [configmap "kube-root-ca.crt" not found, configmap "openshift-service-ca.crt" not found] Apr 22 15:17:06.313600 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:17:06.313099 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f963f6da-e36a-4b61-ade2-ceb2efdb3eb2-kube-api-access-86nvs podName:f963f6da-e36a-4b61-ade2-ceb2efdb3eb2 nodeName:}" failed. No retries permitted until 2026-04-22 15:17:14.313079047 +0000 UTC m=+508.172960342 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-86nvs" (UniqueName: "kubernetes.io/projected/f963f6da-e36a-4b61-ade2-ceb2efdb3eb2-kube-api-access-86nvs") pod "cluster-samples-operator-6dc5bdb6b4-qm2z5" (UID: "f963f6da-e36a-4b61-ade2-ceb2efdb3eb2") : [configmap "kube-root-ca.crt" not found, configmap "openshift-service-ca.crt" not found] Apr 22 15:17:14.365350 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:17:14.365316 2559 projected.go:289] Couldn't get configMap openshift-cluster-samples-operator/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Apr 22 15:17:14.365350 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:17:14.365346 2559 projected.go:289] Couldn't get configMap openshift-cluster-samples-operator/openshift-service-ca.crt: configmap "openshift-service-ca.crt" not found Apr 22 15:17:14.365350 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:17:14.365358 2559 projected.go:194] Error preparing data for projected volume kube-api-access-86nvs for pod openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qm2z5: [configmap "kube-root-ca.crt" not found, configmap "openshift-service-ca.crt" not found] Apr 22 15:17:14.365809 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:17:14.365409 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f963f6da-e36a-4b61-ade2-ceb2efdb3eb2-kube-api-access-86nvs podName:f963f6da-e36a-4b61-ade2-ceb2efdb3eb2 nodeName:}" failed. No retries permitted until 2026-04-22 15:17:30.365391724 +0000 UTC m=+524.225273025 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-86nvs" (UniqueName: "kubernetes.io/projected/f963f6da-e36a-4b61-ade2-ceb2efdb3eb2-kube-api-access-86nvs") pod "cluster-samples-operator-6dc5bdb6b4-qm2z5" (UID: "f963f6da-e36a-4b61-ade2-ceb2efdb3eb2") : [configmap "kube-root-ca.crt" not found, configmap "openshift-service-ca.crt" not found] Apr 22 15:17:30.365862 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:17:30.365819 2559 projected.go:289] Couldn't get configMap openshift-cluster-samples-operator/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Apr 22 15:17:30.365862 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:17:30.365858 2559 projected.go:289] Couldn't get configMap openshift-cluster-samples-operator/openshift-service-ca.crt: configmap "openshift-service-ca.crt" not found Apr 22 15:17:30.365862 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:17:30.365868 2559 projected.go:194] Error preparing data for projected volume kube-api-access-86nvs for pod openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qm2z5: [configmap "kube-root-ca.crt" not found, configmap "openshift-service-ca.crt" not found] Apr 22 15:17:30.366408 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:17:30.365924 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f963f6da-e36a-4b61-ade2-ceb2efdb3eb2-kube-api-access-86nvs podName:f963f6da-e36a-4b61-ade2-ceb2efdb3eb2 nodeName:}" failed. No retries permitted until 2026-04-22 15:18:02.365910902 +0000 UTC m=+556.225792195 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-86nvs" (UniqueName: "kubernetes.io/projected/f963f6da-e36a-4b61-ade2-ceb2efdb3eb2-kube-api-access-86nvs") pod "cluster-samples-operator-6dc5bdb6b4-qm2z5" (UID: "f963f6da-e36a-4b61-ade2-ceb2efdb3eb2") : [configmap "kube-root-ca.crt" not found, configmap "openshift-service-ca.crt" not found] Apr 22 15:18:02.382383 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:18:02.382340 2559 projected.go:289] Couldn't get configMap openshift-cluster-samples-operator/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Apr 22 15:18:02.382383 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:18:02.382375 2559 projected.go:289] Couldn't get configMap openshift-cluster-samples-operator/openshift-service-ca.crt: configmap "openshift-service-ca.crt" not found Apr 22 15:18:02.382383 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:18:02.382385 2559 projected.go:194] Error preparing data for projected volume kube-api-access-86nvs for pod openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qm2z5: [configmap "kube-root-ca.crt" not found, configmap "openshift-service-ca.crt" not found] Apr 22 15:18:02.382939 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:18:02.382443 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f963f6da-e36a-4b61-ade2-ceb2efdb3eb2-kube-api-access-86nvs podName:f963f6da-e36a-4b61-ade2-ceb2efdb3eb2 nodeName:}" failed. No retries permitted until 2026-04-22 15:19:06.382429392 +0000 UTC m=+620.242310684 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-86nvs" (UniqueName: "kubernetes.io/projected/f963f6da-e36a-4b61-ade2-ceb2efdb3eb2-kube-api-access-86nvs") pod "cluster-samples-operator-6dc5bdb6b4-qm2z5" (UID: "f963f6da-e36a-4b61-ade2-ceb2efdb3eb2") : [configmap "kube-root-ca.crt" not found, configmap "openshift-service-ca.crt" not found] Apr 22 15:18:05.228469 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.228432 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-g58gj"] Apr 22 15:18:05.231295 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.231274 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-g58gj" Apr 22 15:18:05.233855 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.233831 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 15:18:05.233971 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.233953 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 15:18:05.235011 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.234987 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 15:18:05.235011 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.235007 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-c6286\"" Apr 22 15:18:05.235181 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.235031 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 15:18:05.251255 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.251212 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-insights/insights-runtime-extractor-g58gj"] Apr 22 15:18:05.300078 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.300049 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c5a2dfa4-9885-4f86-9d28-6d2f547d5123-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-g58gj\" (UID: \"c5a2dfa4-9885-4f86-9d28-6d2f547d5123\") " pod="openshift-insights/insights-runtime-extractor-g58gj" Apr 22 15:18:05.300078 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.300084 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c5a2dfa4-9885-4f86-9d28-6d2f547d5123-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-g58gj\" (UID: \"c5a2dfa4-9885-4f86-9d28-6d2f547d5123\") " pod="openshift-insights/insights-runtime-extractor-g58gj" Apr 22 15:18:05.300301 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.300106 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpq6s\" (UniqueName: \"kubernetes.io/projected/c5a2dfa4-9885-4f86-9d28-6d2f547d5123-kube-api-access-mpq6s\") pod \"insights-runtime-extractor-g58gj\" (UID: \"c5a2dfa4-9885-4f86-9d28-6d2f547d5123\") " pod="openshift-insights/insights-runtime-extractor-g58gj" Apr 22 15:18:05.300301 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.300192 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c5a2dfa4-9885-4f86-9d28-6d2f547d5123-crio-socket\") pod \"insights-runtime-extractor-g58gj\" (UID: \"c5a2dfa4-9885-4f86-9d28-6d2f547d5123\") " pod="openshift-insights/insights-runtime-extractor-g58gj" Apr 22 15:18:05.300301 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.300210 2559 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c5a2dfa4-9885-4f86-9d28-6d2f547d5123-data-volume\") pod \"insights-runtime-extractor-g58gj\" (UID: \"c5a2dfa4-9885-4f86-9d28-6d2f547d5123\") " pod="openshift-insights/insights-runtime-extractor-g58gj" Apr 22 15:18:05.322844 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.322816 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mdwnn"] Apr 22 15:18:05.324764 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.324750 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mdwnn" Apr 22 15:18:05.330368 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.330350 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-sm8r6\"" Apr 22 15:18:05.330662 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.330645 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 22 15:18:05.343865 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.343845 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mdwnn"] Apr 22 15:18:05.400576 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.400548 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c5a2dfa4-9885-4f86-9d28-6d2f547d5123-crio-socket\") pod \"insights-runtime-extractor-g58gj\" (UID: \"c5a2dfa4-9885-4f86-9d28-6d2f547d5123\") " pod="openshift-insights/insights-runtime-extractor-g58gj" Apr 22 15:18:05.400736 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.400583 2559 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c5a2dfa4-9885-4f86-9d28-6d2f547d5123-data-volume\") pod \"insights-runtime-extractor-g58gj\" (UID: \"c5a2dfa4-9885-4f86-9d28-6d2f547d5123\") " pod="openshift-insights/insights-runtime-extractor-g58gj" Apr 22 15:18:05.400736 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.400615 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/dd601ac1-c12d-43f5-88ff-09b6c5652c26-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-mdwnn\" (UID: \"dd601ac1-c12d-43f5-88ff-09b6c5652c26\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mdwnn" Apr 22 15:18:05.400736 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.400648 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c5a2dfa4-9885-4f86-9d28-6d2f547d5123-crio-socket\") pod \"insights-runtime-extractor-g58gj\" (UID: \"c5a2dfa4-9885-4f86-9d28-6d2f547d5123\") " pod="openshift-insights/insights-runtime-extractor-g58gj" Apr 22 15:18:05.400736 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.400669 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c5a2dfa4-9885-4f86-9d28-6d2f547d5123-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-g58gj\" (UID: \"c5a2dfa4-9885-4f86-9d28-6d2f547d5123\") " pod="openshift-insights/insights-runtime-extractor-g58gj" Apr 22 15:18:05.400736 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.400696 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c5a2dfa4-9885-4f86-9d28-6d2f547d5123-insights-runtime-extractor-tls\") pod 
\"insights-runtime-extractor-g58gj\" (UID: \"c5a2dfa4-9885-4f86-9d28-6d2f547d5123\") " pod="openshift-insights/insights-runtime-extractor-g58gj" Apr 22 15:18:05.400736 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.400719 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mpq6s\" (UniqueName: \"kubernetes.io/projected/c5a2dfa4-9885-4f86-9d28-6d2f547d5123-kube-api-access-mpq6s\") pod \"insights-runtime-extractor-g58gj\" (UID: \"c5a2dfa4-9885-4f86-9d28-6d2f547d5123\") " pod="openshift-insights/insights-runtime-extractor-g58gj" Apr 22 15:18:05.401036 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.401015 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c5a2dfa4-9885-4f86-9d28-6d2f547d5123-data-volume\") pod \"insights-runtime-extractor-g58gj\" (UID: \"c5a2dfa4-9885-4f86-9d28-6d2f547d5123\") " pod="openshift-insights/insights-runtime-extractor-g58gj" Apr 22 15:18:05.401240 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.401218 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c5a2dfa4-9885-4f86-9d28-6d2f547d5123-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-g58gj\" (UID: \"c5a2dfa4-9885-4f86-9d28-6d2f547d5123\") " pod="openshift-insights/insights-runtime-extractor-g58gj" Apr 22 15:18:05.402913 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.402896 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c5a2dfa4-9885-4f86-9d28-6d2f547d5123-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-g58gj\" (UID: \"c5a2dfa4-9885-4f86-9d28-6d2f547d5123\") " pod="openshift-insights/insights-runtime-extractor-g58gj" Apr 22 15:18:05.416998 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.416969 2559 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-image-registry/image-registry-f6dccbfd7-k4p6p"] Apr 22 15:18:05.419009 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.418995 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-f6dccbfd7-k4p6p" Apr 22 15:18:05.422988 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.422959 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 15:18:05.423094 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.422994 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-dblpk\"" Apr 22 15:18:05.423804 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.423785 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 15:18:05.423908 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.423818 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 15:18:05.429863 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.429845 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 15:18:05.433867 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.433849 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpq6s\" (UniqueName: \"kubernetes.io/projected/c5a2dfa4-9885-4f86-9d28-6d2f547d5123-kube-api-access-mpq6s\") pod \"insights-runtime-extractor-g58gj\" (UID: \"c5a2dfa4-9885-4f86-9d28-6d2f547d5123\") " pod="openshift-insights/insights-runtime-extractor-g58gj" Apr 22 15:18:05.434760 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.434738 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-f6dccbfd7-k4p6p"] Apr 22 15:18:05.501418 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.501339 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/79ee7cdf-2a19-48bb-ba79-f272ff00571f-trusted-ca\") pod \"image-registry-f6dccbfd7-k4p6p\" (UID: \"79ee7cdf-2a19-48bb-ba79-f272ff00571f\") " pod="openshift-image-registry/image-registry-f6dccbfd7-k4p6p" Apr 22 15:18:05.501418 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.501406 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/79ee7cdf-2a19-48bb-ba79-f272ff00571f-ca-trust-extracted\") pod \"image-registry-f6dccbfd7-k4p6p\" (UID: \"79ee7cdf-2a19-48bb-ba79-f272ff00571f\") " pod="openshift-image-registry/image-registry-f6dccbfd7-k4p6p" Apr 22 15:18:05.501620 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.501435 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/79ee7cdf-2a19-48bb-ba79-f272ff00571f-image-registry-private-configuration\") pod \"image-registry-f6dccbfd7-k4p6p\" (UID: \"79ee7cdf-2a19-48bb-ba79-f272ff00571f\") " pod="openshift-image-registry/image-registry-f6dccbfd7-k4p6p" Apr 22 15:18:05.501620 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.501453 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c8hz\" (UniqueName: \"kubernetes.io/projected/79ee7cdf-2a19-48bb-ba79-f272ff00571f-kube-api-access-4c8hz\") pod \"image-registry-f6dccbfd7-k4p6p\" (UID: \"79ee7cdf-2a19-48bb-ba79-f272ff00571f\") " pod="openshift-image-registry/image-registry-f6dccbfd7-k4p6p" Apr 22 15:18:05.501620 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.501547 2559 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/79ee7cdf-2a19-48bb-ba79-f272ff00571f-registry-certificates\") pod \"image-registry-f6dccbfd7-k4p6p\" (UID: \"79ee7cdf-2a19-48bb-ba79-f272ff00571f\") " pod="openshift-image-registry/image-registry-f6dccbfd7-k4p6p" Apr 22 15:18:05.501620 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.501567 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/79ee7cdf-2a19-48bb-ba79-f272ff00571f-installation-pull-secrets\") pod \"image-registry-f6dccbfd7-k4p6p\" (UID: \"79ee7cdf-2a19-48bb-ba79-f272ff00571f\") " pod="openshift-image-registry/image-registry-f6dccbfd7-k4p6p" Apr 22 15:18:05.501620 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.501585 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/79ee7cdf-2a19-48bb-ba79-f272ff00571f-registry-tls\") pod \"image-registry-f6dccbfd7-k4p6p\" (UID: \"79ee7cdf-2a19-48bb-ba79-f272ff00571f\") " pod="openshift-image-registry/image-registry-f6dccbfd7-k4p6p" Apr 22 15:18:05.501620 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.501606 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/dd601ac1-c12d-43f5-88ff-09b6c5652c26-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-mdwnn\" (UID: \"dd601ac1-c12d-43f5-88ff-09b6c5652c26\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mdwnn" Apr 22 15:18:05.501815 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.501626 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/79ee7cdf-2a19-48bb-ba79-f272ff00571f-bound-sa-token\") pod \"image-registry-f6dccbfd7-k4p6p\" (UID: \"79ee7cdf-2a19-48bb-ba79-f272ff00571f\") " pod="openshift-image-registry/image-registry-f6dccbfd7-k4p6p" Apr 22 15:18:05.503991 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.503969 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/dd601ac1-c12d-43f5-88ff-09b6c5652c26-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-mdwnn\" (UID: \"dd601ac1-c12d-43f5-88ff-09b6c5652c26\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mdwnn" Apr 22 15:18:05.540934 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.540911 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-g58gj" Apr 22 15:18:05.602735 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.602706 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/79ee7cdf-2a19-48bb-ba79-f272ff00571f-registry-tls\") pod \"image-registry-f6dccbfd7-k4p6p\" (UID: \"79ee7cdf-2a19-48bb-ba79-f272ff00571f\") " pod="openshift-image-registry/image-registry-f6dccbfd7-k4p6p" Apr 22 15:18:05.602884 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.602742 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/79ee7cdf-2a19-48bb-ba79-f272ff00571f-bound-sa-token\") pod \"image-registry-f6dccbfd7-k4p6p\" (UID: \"79ee7cdf-2a19-48bb-ba79-f272ff00571f\") " pod="openshift-image-registry/image-registry-f6dccbfd7-k4p6p" Apr 22 15:18:05.602884 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.602782 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/79ee7cdf-2a19-48bb-ba79-f272ff00571f-trusted-ca\") pod \"image-registry-f6dccbfd7-k4p6p\" (UID: \"79ee7cdf-2a19-48bb-ba79-f272ff00571f\") " pod="openshift-image-registry/image-registry-f6dccbfd7-k4p6p" Apr 22 15:18:05.602884 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.602850 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/79ee7cdf-2a19-48bb-ba79-f272ff00571f-ca-trust-extracted\") pod \"image-registry-f6dccbfd7-k4p6p\" (UID: \"79ee7cdf-2a19-48bb-ba79-f272ff00571f\") " pod="openshift-image-registry/image-registry-f6dccbfd7-k4p6p" Apr 22 15:18:05.603044 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.602889 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/79ee7cdf-2a19-48bb-ba79-f272ff00571f-image-registry-private-configuration\") pod \"image-registry-f6dccbfd7-k4p6p\" (UID: \"79ee7cdf-2a19-48bb-ba79-f272ff00571f\") " pod="openshift-image-registry/image-registry-f6dccbfd7-k4p6p" Apr 22 15:18:05.603044 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.602919 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4c8hz\" (UniqueName: \"kubernetes.io/projected/79ee7cdf-2a19-48bb-ba79-f272ff00571f-kube-api-access-4c8hz\") pod \"image-registry-f6dccbfd7-k4p6p\" (UID: \"79ee7cdf-2a19-48bb-ba79-f272ff00571f\") " pod="openshift-image-registry/image-registry-f6dccbfd7-k4p6p" Apr 22 15:18:05.603044 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.602961 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/79ee7cdf-2a19-48bb-ba79-f272ff00571f-registry-certificates\") pod \"image-registry-f6dccbfd7-k4p6p\" (UID: \"79ee7cdf-2a19-48bb-ba79-f272ff00571f\") " 
pod="openshift-image-registry/image-registry-f6dccbfd7-k4p6p" Apr 22 15:18:05.603044 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.602986 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/79ee7cdf-2a19-48bb-ba79-f272ff00571f-installation-pull-secrets\") pod \"image-registry-f6dccbfd7-k4p6p\" (UID: \"79ee7cdf-2a19-48bb-ba79-f272ff00571f\") " pod="openshift-image-registry/image-registry-f6dccbfd7-k4p6p" Apr 22 15:18:05.603628 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.603352 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/79ee7cdf-2a19-48bb-ba79-f272ff00571f-ca-trust-extracted\") pod \"image-registry-f6dccbfd7-k4p6p\" (UID: \"79ee7cdf-2a19-48bb-ba79-f272ff00571f\") " pod="openshift-image-registry/image-registry-f6dccbfd7-k4p6p" Apr 22 15:18:05.604044 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.604016 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/79ee7cdf-2a19-48bb-ba79-f272ff00571f-registry-certificates\") pod \"image-registry-f6dccbfd7-k4p6p\" (UID: \"79ee7cdf-2a19-48bb-ba79-f272ff00571f\") " pod="openshift-image-registry/image-registry-f6dccbfd7-k4p6p" Apr 22 15:18:05.604601 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.604419 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/79ee7cdf-2a19-48bb-ba79-f272ff00571f-trusted-ca\") pod \"image-registry-f6dccbfd7-k4p6p\" (UID: \"79ee7cdf-2a19-48bb-ba79-f272ff00571f\") " pod="openshift-image-registry/image-registry-f6dccbfd7-k4p6p" Apr 22 15:18:05.606104 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.606087 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/79ee7cdf-2a19-48bb-ba79-f272ff00571f-registry-tls\") pod \"image-registry-f6dccbfd7-k4p6p\" (UID: \"79ee7cdf-2a19-48bb-ba79-f272ff00571f\") " pod="openshift-image-registry/image-registry-f6dccbfd7-k4p6p" Apr 22 15:18:05.606865 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.606808 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/79ee7cdf-2a19-48bb-ba79-f272ff00571f-installation-pull-secrets\") pod \"image-registry-f6dccbfd7-k4p6p\" (UID: \"79ee7cdf-2a19-48bb-ba79-f272ff00571f\") " pod="openshift-image-registry/image-registry-f6dccbfd7-k4p6p" Apr 22 15:18:05.606865 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.606849 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/79ee7cdf-2a19-48bb-ba79-f272ff00571f-image-registry-private-configuration\") pod \"image-registry-f6dccbfd7-k4p6p\" (UID: \"79ee7cdf-2a19-48bb-ba79-f272ff00571f\") " pod="openshift-image-registry/image-registry-f6dccbfd7-k4p6p" Apr 22 15:18:05.618799 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.618773 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c8hz\" (UniqueName: \"kubernetes.io/projected/79ee7cdf-2a19-48bb-ba79-f272ff00571f-kube-api-access-4c8hz\") pod \"image-registry-f6dccbfd7-k4p6p\" (UID: \"79ee7cdf-2a19-48bb-ba79-f272ff00571f\") " pod="openshift-image-registry/image-registry-f6dccbfd7-k4p6p" Apr 22 15:18:05.619775 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.619754 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/79ee7cdf-2a19-48bb-ba79-f272ff00571f-bound-sa-token\") pod \"image-registry-f6dccbfd7-k4p6p\" (UID: \"79ee7cdf-2a19-48bb-ba79-f272ff00571f\") " pod="openshift-image-registry/image-registry-f6dccbfd7-k4p6p" Apr 22 
15:18:05.633661 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.633640 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mdwnn" Apr 22 15:18:05.667231 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.666545 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-g58gj"] Apr 22 15:18:05.668653 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:18:05.668611 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5a2dfa4_9885_4f86_9d28_6d2f547d5123.slice/crio-8a085685e941a06498e86bfa36122f953288c7fc337a293f62a9bb8965f89725 WatchSource:0}: Error finding container 8a085685e941a06498e86bfa36122f953288c7fc337a293f62a9bb8965f89725: Status 404 returned error can't find the container with id 8a085685e941a06498e86bfa36122f953288c7fc337a293f62a9bb8965f89725 Apr 22 15:18:05.670573 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.670554 2559 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 15:18:05.728054 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.728030 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-f6dccbfd7-k4p6p" Apr 22 15:18:05.763409 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.763309 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mdwnn"] Apr 22 15:18:05.767206 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:18:05.767159 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd601ac1_c12d_43f5_88ff_09b6c5652c26.slice/crio-50aeae116c3c9351d242c4f20328ba1f2c94789382842e564870926906a72325 WatchSource:0}: Error finding container 50aeae116c3c9351d242c4f20328ba1f2c94789382842e564870926906a72325: Status 404 returned error can't find the container with id 50aeae116c3c9351d242c4f20328ba1f2c94789382842e564870926906a72325 Apr 22 15:18:05.853238 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:05.853210 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-f6dccbfd7-k4p6p"] Apr 22 15:18:05.856086 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:18:05.856058 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79ee7cdf_2a19_48bb_ba79_f272ff00571f.slice/crio-53aeb77ebedd4fbc41e8ae296ad4137d9b4bd1c221dac3f311fb065ca6636340 WatchSource:0}: Error finding container 53aeb77ebedd4fbc41e8ae296ad4137d9b4bd1c221dac3f311fb065ca6636340: Status 404 returned error can't find the container with id 53aeb77ebedd4fbc41e8ae296ad4137d9b4bd1c221dac3f311fb065ca6636340 Apr 22 15:18:06.179294 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:06.179205 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-f6dccbfd7-k4p6p" event={"ID":"79ee7cdf-2a19-48bb-ba79-f272ff00571f","Type":"ContainerStarted","Data":"db19e20aac37419a18047c20de0ed004c5cf9684c6511165adadd378a6a18dc5"} Apr 22 15:18:06.179294 
ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:06.179255 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-f6dccbfd7-k4p6p" event={"ID":"79ee7cdf-2a19-48bb-ba79-f272ff00571f","Type":"ContainerStarted","Data":"53aeb77ebedd4fbc41e8ae296ad4137d9b4bd1c221dac3f311fb065ca6636340"} Apr 22 15:18:06.179534 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:06.179355 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-f6dccbfd7-k4p6p" Apr 22 15:18:06.180437 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:06.180404 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mdwnn" event={"ID":"dd601ac1-c12d-43f5-88ff-09b6c5652c26","Type":"ContainerStarted","Data":"50aeae116c3c9351d242c4f20328ba1f2c94789382842e564870926906a72325"} Apr 22 15:18:06.181807 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:06.181780 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-g58gj" event={"ID":"c5a2dfa4-9885-4f86-9d28-6d2f547d5123","Type":"ContainerStarted","Data":"0ed6c9c0d5433bb17b00c3e4fe798b15457007137cfa047df5f07b0389192bc5"} Apr 22 15:18:06.181907 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:06.181816 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-g58gj" event={"ID":"c5a2dfa4-9885-4f86-9d28-6d2f547d5123","Type":"ContainerStarted","Data":"8a085685e941a06498e86bfa36122f953288c7fc337a293f62a9bb8965f89725"} Apr 22 15:18:06.199277 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:06.199228 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-f6dccbfd7-k4p6p" podStartSLOduration=1.1992123399999999 podStartE2EDuration="1.19921234s" podCreationTimestamp="2026-04-22 15:18:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:18:06.19766426 +0000 UTC m=+560.057545587" watchObservedRunningTime="2026-04-22 15:18:06.19921234 +0000 UTC m=+560.059093652" Apr 22 15:18:07.185747 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:07.185713 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mdwnn" event={"ID":"dd601ac1-c12d-43f5-88ff-09b6c5652c26","Type":"ContainerStarted","Data":"1046d798d5306e41fe1192c4048db24cab05245f6abc628492316d6e9087bce2"} Apr 22 15:18:07.186187 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:07.185953 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mdwnn" Apr 22 15:18:07.187447 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:07.187417 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-g58gj" event={"ID":"c5a2dfa4-9885-4f86-9d28-6d2f547d5123","Type":"ContainerStarted","Data":"f4af8aab55566afbdec6300c7323fabf28a14bd429a6e244cdbd8794100c363d"} Apr 22 15:18:07.191303 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:07.191279 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mdwnn" Apr 22 15:18:07.200985 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:07.200939 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-mdwnn" podStartSLOduration=1.286101511 podStartE2EDuration="2.200923244s" podCreationTimestamp="2026-04-22 15:18:05 +0000 UTC" firstStartedPulling="2026-04-22 15:18:05.769087527 +0000 UTC m=+559.628968820" lastFinishedPulling="2026-04-22 15:18:06.683909256 +0000 UTC m=+560.543790553" observedRunningTime="2026-04-22 15:18:07.199743667 +0000 UTC 
m=+561.059624982" watchObservedRunningTime="2026-04-22 15:18:07.200923244 +0000 UTC m=+561.060804560" Apr 22 15:18:08.192567 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:08.192527 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-g58gj" event={"ID":"c5a2dfa4-9885-4f86-9d28-6d2f547d5123","Type":"ContainerStarted","Data":"934a6ffd93f06891e5f21a2f0bd396076b1e69d8e2c35216d2f8ea21aa152b84"} Apr 22 15:18:08.216619 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:08.216565 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-g58gj" podStartSLOduration=1.23831525 podStartE2EDuration="3.216552442s" podCreationTimestamp="2026-04-22 15:18:05 +0000 UTC" firstStartedPulling="2026-04-22 15:18:05.73240756 +0000 UTC m=+559.592288853" lastFinishedPulling="2026-04-22 15:18:07.710644748 +0000 UTC m=+561.570526045" observedRunningTime="2026-04-22 15:18:08.215157146 +0000 UTC m=+562.075038460" watchObservedRunningTime="2026-04-22 15:18:08.216552442 +0000 UTC m=+562.076433819" Apr 22 15:18:13.439211 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.439171 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-g982r"] Apr 22 15:18:13.442578 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.442548 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-hbqsr"] Apr 22 15:18:13.442722 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.442701 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-g982r" Apr 22 15:18:13.444720 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.444704 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-hbqsr" Apr 22 15:18:13.446096 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.446074 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 15:18:13.446096 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.446086 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 15:18:13.446501 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.446463 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 22 15:18:13.446616 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.446468 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 22 15:18:13.447095 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.447059 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 15:18:13.447203 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.447066 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 15:18:13.447329 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.447308 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 15:18:13.448113 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.448093 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-8dvjq\"" Apr 22 15:18:13.448113 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.448103 2559 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-nlltv\"" Apr 22 15:18:13.448274 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.448161 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 22 15:18:13.448274 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.448204 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 15:18:13.455861 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.455843 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-g982r"] Apr 22 15:18:13.565982 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.565948 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crfjm\" (UniqueName: \"kubernetes.io/projected/76a5b841-272a-4d24-ad48-081271fe8012-kube-api-access-crfjm\") pod \"kube-state-metrics-69db897b98-g982r\" (UID: \"76a5b841-272a-4d24-ad48-081271fe8012\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-g982r" Apr 22 15:18:13.566164 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.565992 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/76a5b841-272a-4d24-ad48-081271fe8012-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-g982r\" (UID: \"76a5b841-272a-4d24-ad48-081271fe8012\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-g982r" Apr 22 15:18:13.566164 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.566022 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5252\" (UniqueName: 
\"kubernetes.io/projected/62d408bc-6296-49dc-b54d-357d46ff5596-kube-api-access-b5252\") pod \"node-exporter-hbqsr\" (UID: \"62d408bc-6296-49dc-b54d-357d46ff5596\") " pod="openshift-monitoring/node-exporter-hbqsr" Apr 22 15:18:13.566164 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.566053 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/76a5b841-272a-4d24-ad48-081271fe8012-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-g982r\" (UID: \"76a5b841-272a-4d24-ad48-081271fe8012\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-g982r" Apr 22 15:18:13.566164 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.566077 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/62d408bc-6296-49dc-b54d-357d46ff5596-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hbqsr\" (UID: \"62d408bc-6296-49dc-b54d-357d46ff5596\") " pod="openshift-monitoring/node-exporter-hbqsr" Apr 22 15:18:13.566164 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.566103 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/62d408bc-6296-49dc-b54d-357d46ff5596-sys\") pod \"node-exporter-hbqsr\" (UID: \"62d408bc-6296-49dc-b54d-357d46ff5596\") " pod="openshift-monitoring/node-exporter-hbqsr" Apr 22 15:18:13.566164 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.566137 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/76a5b841-272a-4d24-ad48-081271fe8012-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-g982r\" (UID: \"76a5b841-272a-4d24-ad48-081271fe8012\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-g982r" Apr 22 15:18:13.566164 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.566164 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/62d408bc-6296-49dc-b54d-357d46ff5596-node-exporter-wtmp\") pod \"node-exporter-hbqsr\" (UID: \"62d408bc-6296-49dc-b54d-357d46ff5596\") " pod="openshift-monitoring/node-exporter-hbqsr" Apr 22 15:18:13.566440 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.566197 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/62d408bc-6296-49dc-b54d-357d46ff5596-metrics-client-ca\") pod \"node-exporter-hbqsr\" (UID: \"62d408bc-6296-49dc-b54d-357d46ff5596\") " pod="openshift-monitoring/node-exporter-hbqsr" Apr 22 15:18:13.566440 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.566228 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/62d408bc-6296-49dc-b54d-357d46ff5596-root\") pod \"node-exporter-hbqsr\" (UID: \"62d408bc-6296-49dc-b54d-357d46ff5596\") " pod="openshift-monitoring/node-exporter-hbqsr" Apr 22 15:18:13.566440 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.566242 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/62d408bc-6296-49dc-b54d-357d46ff5596-node-exporter-tls\") pod \"node-exporter-hbqsr\" (UID: \"62d408bc-6296-49dc-b54d-357d46ff5596\") " pod="openshift-monitoring/node-exporter-hbqsr" Apr 22 15:18:13.566440 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.566257 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: 
\"kubernetes.io/configmap/76a5b841-272a-4d24-ad48-081271fe8012-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-g982r\" (UID: \"76a5b841-272a-4d24-ad48-081271fe8012\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-g982r" Apr 22 15:18:13.566440 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.566328 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/62d408bc-6296-49dc-b54d-357d46ff5596-node-exporter-textfile\") pod \"node-exporter-hbqsr\" (UID: \"62d408bc-6296-49dc-b54d-357d46ff5596\") " pod="openshift-monitoring/node-exporter-hbqsr" Apr 22 15:18:13.566440 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.566349 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/62d408bc-6296-49dc-b54d-357d46ff5596-node-exporter-accelerators-collector-config\") pod \"node-exporter-hbqsr\" (UID: \"62d408bc-6296-49dc-b54d-357d46ff5596\") " pod="openshift-monitoring/node-exporter-hbqsr" Apr 22 15:18:13.566440 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.566421 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/76a5b841-272a-4d24-ad48-081271fe8012-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-g982r\" (UID: \"76a5b841-272a-4d24-ad48-081271fe8012\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-g982r" Apr 22 15:18:13.667648 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.667620 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crfjm\" (UniqueName: \"kubernetes.io/projected/76a5b841-272a-4d24-ad48-081271fe8012-kube-api-access-crfjm\") pod \"kube-state-metrics-69db897b98-g982r\" (UID: 
\"76a5b841-272a-4d24-ad48-081271fe8012\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-g982r" Apr 22 15:18:13.667794 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.667652 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/76a5b841-272a-4d24-ad48-081271fe8012-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-g982r\" (UID: \"76a5b841-272a-4d24-ad48-081271fe8012\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-g982r" Apr 22 15:18:13.667794 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.667677 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b5252\" (UniqueName: \"kubernetes.io/projected/62d408bc-6296-49dc-b54d-357d46ff5596-kube-api-access-b5252\") pod \"node-exporter-hbqsr\" (UID: \"62d408bc-6296-49dc-b54d-357d46ff5596\") " pod="openshift-monitoring/node-exporter-hbqsr" Apr 22 15:18:13.667794 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.667707 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/76a5b841-272a-4d24-ad48-081271fe8012-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-g982r\" (UID: \"76a5b841-272a-4d24-ad48-081271fe8012\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-g982r" Apr 22 15:18:13.667794 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.667733 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/62d408bc-6296-49dc-b54d-357d46ff5596-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hbqsr\" (UID: \"62d408bc-6296-49dc-b54d-357d46ff5596\") " pod="openshift-monitoring/node-exporter-hbqsr" Apr 22 15:18:13.667794 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.667758 2559 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/62d408bc-6296-49dc-b54d-357d46ff5596-sys\") pod \"node-exporter-hbqsr\" (UID: \"62d408bc-6296-49dc-b54d-357d46ff5596\") " pod="openshift-monitoring/node-exporter-hbqsr" Apr 22 15:18:13.667794 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.667780 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/76a5b841-272a-4d24-ad48-081271fe8012-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-g982r\" (UID: \"76a5b841-272a-4d24-ad48-081271fe8012\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-g982r" Apr 22 15:18:13.668113 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.667838 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/62d408bc-6296-49dc-b54d-357d46ff5596-sys\") pod \"node-exporter-hbqsr\" (UID: \"62d408bc-6296-49dc-b54d-357d46ff5596\") " pod="openshift-monitoring/node-exporter-hbqsr" Apr 22 15:18:13.668113 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.667910 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/62d408bc-6296-49dc-b54d-357d46ff5596-node-exporter-wtmp\") pod \"node-exporter-hbqsr\" (UID: \"62d408bc-6296-49dc-b54d-357d46ff5596\") " pod="openshift-monitoring/node-exporter-hbqsr" Apr 22 15:18:13.668113 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.667935 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/62d408bc-6296-49dc-b54d-357d46ff5596-metrics-client-ca\") pod \"node-exporter-hbqsr\" (UID: \"62d408bc-6296-49dc-b54d-357d46ff5596\") " pod="openshift-monitoring/node-exporter-hbqsr" Apr 22 15:18:13.668113 ip-10-0-141-167 kubenswrapper[2559]: I0422 
15:18:13.667962 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/62d408bc-6296-49dc-b54d-357d46ff5596-root\") pod \"node-exporter-hbqsr\" (UID: \"62d408bc-6296-49dc-b54d-357d46ff5596\") " pod="openshift-monitoring/node-exporter-hbqsr" Apr 22 15:18:13.668113 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:18:13.667966 2559 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 22 15:18:13.668113 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.667985 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/62d408bc-6296-49dc-b54d-357d46ff5596-node-exporter-tls\") pod \"node-exporter-hbqsr\" (UID: \"62d408bc-6296-49dc-b54d-357d46ff5596\") " pod="openshift-monitoring/node-exporter-hbqsr" Apr 22 15:18:13.668113 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.668010 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/76a5b841-272a-4d24-ad48-081271fe8012-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-g982r\" (UID: \"76a5b841-272a-4d24-ad48-081271fe8012\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-g982r" Apr 22 15:18:13.668113 ip-10-0-141-167 kubenswrapper[2559]: E0422 15:18:13.668036 2559 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76a5b841-272a-4d24-ad48-081271fe8012-kube-state-metrics-tls podName:76a5b841-272a-4d24-ad48-081271fe8012 nodeName:}" failed. No retries permitted until 2026-04-22 15:18:14.168016886 +0000 UTC m=+568.027898191 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/76a5b841-272a-4d24-ad48-081271fe8012-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-g982r" (UID: "76a5b841-272a-4d24-ad48-081271fe8012") : secret "kube-state-metrics-tls" not found Apr 22 15:18:13.668113 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.668098 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/62d408bc-6296-49dc-b54d-357d46ff5596-node-exporter-textfile\") pod \"node-exporter-hbqsr\" (UID: \"62d408bc-6296-49dc-b54d-357d46ff5596\") " pod="openshift-monitoring/node-exporter-hbqsr" Apr 22 15:18:13.668592 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.668129 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/62d408bc-6296-49dc-b54d-357d46ff5596-node-exporter-accelerators-collector-config\") pod \"node-exporter-hbqsr\" (UID: \"62d408bc-6296-49dc-b54d-357d46ff5596\") " pod="openshift-monitoring/node-exporter-hbqsr" Apr 22 15:18:13.668592 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.668163 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/76a5b841-272a-4d24-ad48-081271fe8012-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-g982r\" (UID: \"76a5b841-272a-4d24-ad48-081271fe8012\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-g982r" Apr 22 15:18:13.668592 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.668243 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/62d408bc-6296-49dc-b54d-357d46ff5596-root\") pod \"node-exporter-hbqsr\" (UID: \"62d408bc-6296-49dc-b54d-357d46ff5596\") " pod="openshift-monitoring/node-exporter-hbqsr" Apr 22 
15:18:13.668592 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.668439 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/76a5b841-272a-4d24-ad48-081271fe8012-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-g982r\" (UID: \"76a5b841-272a-4d24-ad48-081271fe8012\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-g982r" Apr 22 15:18:13.668795 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.668600 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/62d408bc-6296-49dc-b54d-357d46ff5596-node-exporter-wtmp\") pod \"node-exporter-hbqsr\" (UID: \"62d408bc-6296-49dc-b54d-357d46ff5596\") " pod="openshift-monitoring/node-exporter-hbqsr" Apr 22 15:18:13.668878 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.668854 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/76a5b841-272a-4d24-ad48-081271fe8012-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-g982r\" (UID: \"76a5b841-272a-4d24-ad48-081271fe8012\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-g982r" Apr 22 15:18:13.668937 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.668885 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/76a5b841-272a-4d24-ad48-081271fe8012-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-g982r\" (UID: \"76a5b841-272a-4d24-ad48-081271fe8012\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-g982r" Apr 22 15:18:13.669409 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.669386 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/62d408bc-6296-49dc-b54d-357d46ff5596-metrics-client-ca\") pod \"node-exporter-hbqsr\" (UID: \"62d408bc-6296-49dc-b54d-357d46ff5596\") " pod="openshift-monitoring/node-exporter-hbqsr" Apr 22 15:18:13.669517 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.669388 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/62d408bc-6296-49dc-b54d-357d46ff5596-node-exporter-accelerators-collector-config\") pod \"node-exporter-hbqsr\" (UID: \"62d408bc-6296-49dc-b54d-357d46ff5596\") " pod="openshift-monitoring/node-exporter-hbqsr" Apr 22 15:18:13.669608 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.669589 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/62d408bc-6296-49dc-b54d-357d46ff5596-node-exporter-textfile\") pod \"node-exporter-hbqsr\" (UID: \"62d408bc-6296-49dc-b54d-357d46ff5596\") " pod="openshift-monitoring/node-exporter-hbqsr" Apr 22 15:18:13.670448 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.670422 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/62d408bc-6296-49dc-b54d-357d46ff5596-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hbqsr\" (UID: \"62d408bc-6296-49dc-b54d-357d46ff5596\") " pod="openshift-monitoring/node-exporter-hbqsr" Apr 22 15:18:13.670573 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.670561 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/62d408bc-6296-49dc-b54d-357d46ff5596-node-exporter-tls\") pod \"node-exporter-hbqsr\" (UID: \"62d408bc-6296-49dc-b54d-357d46ff5596\") " pod="openshift-monitoring/node-exporter-hbqsr" Apr 22 15:18:13.670830 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.670808 2559 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/76a5b841-272a-4d24-ad48-081271fe8012-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-g982r\" (UID: \"76a5b841-272a-4d24-ad48-081271fe8012\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-g982r" Apr 22 15:18:13.675873 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.675849 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-crfjm\" (UniqueName: \"kubernetes.io/projected/76a5b841-272a-4d24-ad48-081271fe8012-kube-api-access-crfjm\") pod \"kube-state-metrics-69db897b98-g982r\" (UID: \"76a5b841-272a-4d24-ad48-081271fe8012\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-g982r" Apr 22 15:18:13.676576 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.676545 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5252\" (UniqueName: \"kubernetes.io/projected/62d408bc-6296-49dc-b54d-357d46ff5596-kube-api-access-b5252\") pod \"node-exporter-hbqsr\" (UID: \"62d408bc-6296-49dc-b54d-357d46ff5596\") " pod="openshift-monitoring/node-exporter-hbqsr" Apr 22 15:18:13.761602 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:13.761544 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-hbqsr" Apr 22 15:18:13.771002 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:18:13.770973 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62d408bc_6296_49dc_b54d_357d46ff5596.slice/crio-0292900bd065db4f03401d95abb6ef24558d22aac9a18767b8c5e5ff0f1ff643 WatchSource:0}: Error finding container 0292900bd065db4f03401d95abb6ef24558d22aac9a18767b8c5e5ff0f1ff643: Status 404 returned error can't find the container with id 0292900bd065db4f03401d95abb6ef24558d22aac9a18767b8c5e5ff0f1ff643 Apr 22 15:18:14.172256 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:14.172156 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/76a5b841-272a-4d24-ad48-081271fe8012-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-g982r\" (UID: \"76a5b841-272a-4d24-ad48-081271fe8012\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-g982r" Apr 22 15:18:14.174654 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:14.174631 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/76a5b841-272a-4d24-ad48-081271fe8012-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-g982r\" (UID: \"76a5b841-272a-4d24-ad48-081271fe8012\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-g982r" Apr 22 15:18:14.208988 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:14.208952 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hbqsr" event={"ID":"62d408bc-6296-49dc-b54d-357d46ff5596","Type":"ContainerStarted","Data":"0292900bd065db4f03401d95abb6ef24558d22aac9a18767b8c5e5ff0f1ff643"} Apr 22 15:18:14.355543 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:14.355448 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-g982r" Apr 22 15:18:14.570272 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:14.570220 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-g982r"] Apr 22 15:18:14.572429 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:18:14.572404 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76a5b841_272a_4d24_ad48_081271fe8012.slice/crio-13f6c6794c549089a22debc240168a0b5fd26ac82d3b7cba1c8cc6c0f15c319c WatchSource:0}: Error finding container 13f6c6794c549089a22debc240168a0b5fd26ac82d3b7cba1c8cc6c0f15c319c: Status 404 returned error can't find the container with id 13f6c6794c549089a22debc240168a0b5fd26ac82d3b7cba1c8cc6c0f15c319c Apr 22 15:18:15.214442 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:15.214405 2559 generic.go:358] "Generic (PLEG): container finished" podID="62d408bc-6296-49dc-b54d-357d46ff5596" containerID="95a3ae1a5499bd7b41c0dae0ad1061ef38cc520ccd792efcbb0423563bc66e1b" exitCode=0 Apr 22 15:18:15.214636 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:15.214477 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hbqsr" event={"ID":"62d408bc-6296-49dc-b54d-357d46ff5596","Type":"ContainerDied","Data":"95a3ae1a5499bd7b41c0dae0ad1061ef38cc520ccd792efcbb0423563bc66e1b"} Apr 22 15:18:15.215574 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:15.215549 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-g982r" event={"ID":"76a5b841-272a-4d24-ad48-081271fe8012","Type":"ContainerStarted","Data":"13f6c6794c549089a22debc240168a0b5fd26ac82d3b7cba1c8cc6c0f15c319c"} Apr 22 15:18:16.220067 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.220031 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hbqsr" 
event={"ID":"62d408bc-6296-49dc-b54d-357d46ff5596","Type":"ContainerStarted","Data":"b38e202c6c6950169f1e14b9d20e79d069e193436beb641d5cfca574ef34c922"} Apr 22 15:18:16.220067 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.220065 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hbqsr" event={"ID":"62d408bc-6296-49dc-b54d-357d46ff5596","Type":"ContainerStarted","Data":"4dbe5254ef945ca364fca2eb47d83aa2390ec7c4405cac7b1374c426b00b07b7"} Apr 22 15:18:16.221892 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.221862 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-g982r" event={"ID":"76a5b841-272a-4d24-ad48-081271fe8012","Type":"ContainerStarted","Data":"b94b6380b8d8f24930187f0ac0f81126b2d1b959961886188e9231a3766ab327"} Apr 22 15:18:16.221892 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.221889 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-g982r" event={"ID":"76a5b841-272a-4d24-ad48-081271fe8012","Type":"ContainerStarted","Data":"c9fb1d4cc2816e086fd15f18dfd1120715925f185eb1a8527bb54718ae46c542"} Apr 22 15:18:16.222036 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.221898 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-g982r" event={"ID":"76a5b841-272a-4d24-ad48-081271fe8012","Type":"ContainerStarted","Data":"cb9a6276e3d5af9fd4b38b612656d7576d5a297ee81fbe80c6ca6eff68a2c58f"} Apr 22 15:18:16.239296 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.239252 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-hbqsr" podStartSLOduration=2.535813858 podStartE2EDuration="3.239240874s" podCreationTimestamp="2026-04-22 15:18:13 +0000 UTC" firstStartedPulling="2026-04-22 15:18:13.773139531 +0000 UTC m=+567.633020831" lastFinishedPulling="2026-04-22 15:18:14.47656655 
+0000 UTC m=+568.336447847" observedRunningTime="2026-04-22 15:18:16.23813305 +0000 UTC m=+570.098014364" watchObservedRunningTime="2026-04-22 15:18:16.239240874 +0000 UTC m=+570.099122188" Apr 22 15:18:16.254822 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.254780 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-g982r" podStartSLOduration=2.137477939 podStartE2EDuration="3.254768381s" podCreationTimestamp="2026-04-22 15:18:13 +0000 UTC" firstStartedPulling="2026-04-22 15:18:14.574180865 +0000 UTC m=+568.434062157" lastFinishedPulling="2026-04-22 15:18:15.6914713 +0000 UTC m=+569.551352599" observedRunningTime="2026-04-22 15:18:16.253848683 +0000 UTC m=+570.113729997" watchObservedRunningTime="2026-04-22 15:18:16.254768381 +0000 UTC m=+570.114649698" Apr 22 15:18:16.421667 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.421637 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-79b5647b94-kphgj"] Apr 22 15:18:16.424176 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.424156 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-79b5647b94-kphgj" Apr 22 15:18:16.428529 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.428478 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 22 15:18:16.428644 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.428540 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 22 15:18:16.428644 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.428589 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 22 15:18:16.428773 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.428764 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 22 15:18:16.429152 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.429093 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 22 15:18:16.429266 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.429172 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-rzr94\"" Apr 22 15:18:16.429266 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.429253 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-5473qje1b0vo1\"" Apr 22 15:18:16.441078 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.441059 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-79b5647b94-kphgj"] Apr 22 15:18:16.494925 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.494849 2559 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/fcad5d89-8252-4742-a4c2-55888f7bb135-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-79b5647b94-kphgj\" (UID: \"fcad5d89-8252-4742-a4c2-55888f7bb135\") " pod="openshift-monitoring/thanos-querier-79b5647b94-kphgj" Apr 22 15:18:16.494925 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.494877 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fcad5d89-8252-4742-a4c2-55888f7bb135-metrics-client-ca\") pod \"thanos-querier-79b5647b94-kphgj\" (UID: \"fcad5d89-8252-4742-a4c2-55888f7bb135\") " pod="openshift-monitoring/thanos-querier-79b5647b94-kphgj" Apr 22 15:18:16.494925 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.494899 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fcad5d89-8252-4742-a4c2-55888f7bb135-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-79b5647b94-kphgj\" (UID: \"fcad5d89-8252-4742-a4c2-55888f7bb135\") " pod="openshift-monitoring/thanos-querier-79b5647b94-kphgj" Apr 22 15:18:16.495152 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.494934 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fcad5d89-8252-4742-a4c2-55888f7bb135-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-79b5647b94-kphgj\" (UID: \"fcad5d89-8252-4742-a4c2-55888f7bb135\") " pod="openshift-monitoring/thanos-querier-79b5647b94-kphgj" Apr 22 15:18:16.495152 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.494958 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/fcad5d89-8252-4742-a4c2-55888f7bb135-secret-thanos-querier-tls\") pod \"thanos-querier-79b5647b94-kphgj\" (UID: \"fcad5d89-8252-4742-a4c2-55888f7bb135\") " pod="openshift-monitoring/thanos-querier-79b5647b94-kphgj" Apr 22 15:18:16.495152 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.495013 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/fcad5d89-8252-4742-a4c2-55888f7bb135-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-79b5647b94-kphgj\" (UID: \"fcad5d89-8252-4742-a4c2-55888f7bb135\") " pod="openshift-monitoring/thanos-querier-79b5647b94-kphgj" Apr 22 15:18:16.495152 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.495043 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fcad5d89-8252-4742-a4c2-55888f7bb135-secret-grpc-tls\") pod \"thanos-querier-79b5647b94-kphgj\" (UID: \"fcad5d89-8252-4742-a4c2-55888f7bb135\") " pod="openshift-monitoring/thanos-querier-79b5647b94-kphgj" Apr 22 15:18:16.495152 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.495058 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r46n\" (UniqueName: \"kubernetes.io/projected/fcad5d89-8252-4742-a4c2-55888f7bb135-kube-api-access-2r46n\") pod \"thanos-querier-79b5647b94-kphgj\" (UID: \"fcad5d89-8252-4742-a4c2-55888f7bb135\") " pod="openshift-monitoring/thanos-querier-79b5647b94-kphgj" Apr 22 15:18:16.596152 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.596112 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: 
\"kubernetes.io/secret/fcad5d89-8252-4742-a4c2-55888f7bb135-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-79b5647b94-kphgj\" (UID: \"fcad5d89-8252-4742-a4c2-55888f7bb135\") " pod="openshift-monitoring/thanos-querier-79b5647b94-kphgj" Apr 22 15:18:16.596152 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.596154 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fcad5d89-8252-4742-a4c2-55888f7bb135-metrics-client-ca\") pod \"thanos-querier-79b5647b94-kphgj\" (UID: \"fcad5d89-8252-4742-a4c2-55888f7bb135\") " pod="openshift-monitoring/thanos-querier-79b5647b94-kphgj" Apr 22 15:18:16.596412 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.596181 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fcad5d89-8252-4742-a4c2-55888f7bb135-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-79b5647b94-kphgj\" (UID: \"fcad5d89-8252-4742-a4c2-55888f7bb135\") " pod="openshift-monitoring/thanos-querier-79b5647b94-kphgj" Apr 22 15:18:16.596412 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.596233 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fcad5d89-8252-4742-a4c2-55888f7bb135-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-79b5647b94-kphgj\" (UID: \"fcad5d89-8252-4742-a4c2-55888f7bb135\") " pod="openshift-monitoring/thanos-querier-79b5647b94-kphgj" Apr 22 15:18:16.596412 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.596260 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/fcad5d89-8252-4742-a4c2-55888f7bb135-secret-thanos-querier-tls\") pod \"thanos-querier-79b5647b94-kphgj\" (UID: 
\"fcad5d89-8252-4742-a4c2-55888f7bb135\") " pod="openshift-monitoring/thanos-querier-79b5647b94-kphgj" Apr 22 15:18:16.596412 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.596306 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/fcad5d89-8252-4742-a4c2-55888f7bb135-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-79b5647b94-kphgj\" (UID: \"fcad5d89-8252-4742-a4c2-55888f7bb135\") " pod="openshift-monitoring/thanos-querier-79b5647b94-kphgj" Apr 22 15:18:16.596412 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.596330 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fcad5d89-8252-4742-a4c2-55888f7bb135-secret-grpc-tls\") pod \"thanos-querier-79b5647b94-kphgj\" (UID: \"fcad5d89-8252-4742-a4c2-55888f7bb135\") " pod="openshift-monitoring/thanos-querier-79b5647b94-kphgj" Apr 22 15:18:16.596412 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.596351 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2r46n\" (UniqueName: \"kubernetes.io/projected/fcad5d89-8252-4742-a4c2-55888f7bb135-kube-api-access-2r46n\") pod \"thanos-querier-79b5647b94-kphgj\" (UID: \"fcad5d89-8252-4742-a4c2-55888f7bb135\") " pod="openshift-monitoring/thanos-querier-79b5647b94-kphgj" Apr 22 15:18:16.597179 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.597011 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fcad5d89-8252-4742-a4c2-55888f7bb135-metrics-client-ca\") pod \"thanos-querier-79b5647b94-kphgj\" (UID: \"fcad5d89-8252-4742-a4c2-55888f7bb135\") " pod="openshift-monitoring/thanos-querier-79b5647b94-kphgj" Apr 22 15:18:16.599461 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.599435 2559 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/fcad5d89-8252-4742-a4c2-55888f7bb135-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-79b5647b94-kphgj\" (UID: \"fcad5d89-8252-4742-a4c2-55888f7bb135\") " pod="openshift-monitoring/thanos-querier-79b5647b94-kphgj" Apr 22 15:18:16.599880 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.599860 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/fcad5d89-8252-4742-a4c2-55888f7bb135-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-79b5647b94-kphgj\" (UID: \"fcad5d89-8252-4742-a4c2-55888f7bb135\") " pod="openshift-monitoring/thanos-querier-79b5647b94-kphgj" Apr 22 15:18:16.599962 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.599944 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fcad5d89-8252-4742-a4c2-55888f7bb135-secret-grpc-tls\") pod \"thanos-querier-79b5647b94-kphgj\" (UID: \"fcad5d89-8252-4742-a4c2-55888f7bb135\") " pod="openshift-monitoring/thanos-querier-79b5647b94-kphgj" Apr 22 15:18:16.600050 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.600029 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/fcad5d89-8252-4742-a4c2-55888f7bb135-secret-thanos-querier-tls\") pod \"thanos-querier-79b5647b94-kphgj\" (UID: \"fcad5d89-8252-4742-a4c2-55888f7bb135\") " pod="openshift-monitoring/thanos-querier-79b5647b94-kphgj" Apr 22 15:18:16.600248 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.600230 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fcad5d89-8252-4742-a4c2-55888f7bb135-secret-thanos-querier-kube-rbac-proxy\") pod 
\"thanos-querier-79b5647b94-kphgj\" (UID: \"fcad5d89-8252-4742-a4c2-55888f7bb135\") " pod="openshift-monitoring/thanos-querier-79b5647b94-kphgj" Apr 22 15:18:16.600304 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.600249 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fcad5d89-8252-4742-a4c2-55888f7bb135-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-79b5647b94-kphgj\" (UID: \"fcad5d89-8252-4742-a4c2-55888f7bb135\") " pod="openshift-monitoring/thanos-querier-79b5647b94-kphgj" Apr 22 15:18:16.604116 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.604093 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r46n\" (UniqueName: \"kubernetes.io/projected/fcad5d89-8252-4742-a4c2-55888f7bb135-kube-api-access-2r46n\") pod \"thanos-querier-79b5647b94-kphgj\" (UID: \"fcad5d89-8252-4742-a4c2-55888f7bb135\") " pod="openshift-monitoring/thanos-querier-79b5647b94-kphgj" Apr 22 15:18:16.734215 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.734186 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-79b5647b94-kphgj" Apr 22 15:18:16.856836 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:16.856799 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-79b5647b94-kphgj"] Apr 22 15:18:16.859835 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:18:16.859810 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcad5d89_8252_4742_a4c2_55888f7bb135.slice/crio-105411d37f88fb1e3a8912adf67ab0ff2bdd01842a8f586a90b506b472283e82 WatchSource:0}: Error finding container 105411d37f88fb1e3a8912adf67ab0ff2bdd01842a8f586a90b506b472283e82: Status 404 returned error can't find the container with id 105411d37f88fb1e3a8912adf67ab0ff2bdd01842a8f586a90b506b472283e82 Apr 22 15:18:17.225926 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:17.225844 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-79b5647b94-kphgj" event={"ID":"fcad5d89-8252-4742-a4c2-55888f7bb135","Type":"ContainerStarted","Data":"105411d37f88fb1e3a8912adf67ab0ff2bdd01842a8f586a90b506b472283e82"} Apr 22 15:18:17.831547 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:17.831510 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-576679f874-8p2ck"] Apr 22 15:18:17.833996 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:17.833976 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-576679f874-8p2ck" Apr 22 15:18:17.836449 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:17.836426 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 22 15:18:17.836616 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:17.836517 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 22 15:18:17.837265 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:17.837245 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 22 15:18:17.837384 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:17.837288 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 15:18:17.837384 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:17.837244 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-69qfc\"" Apr 22 15:18:17.837604 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:17.837583 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-523mmfh8ume0a\"" Apr 22 15:18:17.844860 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:17.844836 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-576679f874-8p2ck"] Apr 22 15:18:17.907947 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:17.907907 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/6b775346-d2f3-419f-95a2-2571413caefb-metrics-server-audit-profiles\") pod \"metrics-server-576679f874-8p2ck\" (UID: 
\"6b775346-d2f3-419f-95a2-2571413caefb\") " pod="openshift-monitoring/metrics-server-576679f874-8p2ck" Apr 22 15:18:17.907947 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:17.907951 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b775346-d2f3-419f-95a2-2571413caefb-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-576679f874-8p2ck\" (UID: \"6b775346-d2f3-419f-95a2-2571413caefb\") " pod="openshift-monitoring/metrics-server-576679f874-8p2ck" Apr 22 15:18:17.908163 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:17.907980 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/6b775346-d2f3-419f-95a2-2571413caefb-audit-log\") pod \"metrics-server-576679f874-8p2ck\" (UID: \"6b775346-d2f3-419f-95a2-2571413caefb\") " pod="openshift-monitoring/metrics-server-576679f874-8p2ck" Apr 22 15:18:17.908163 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:17.908130 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/6b775346-d2f3-419f-95a2-2571413caefb-secret-metrics-server-client-certs\") pod \"metrics-server-576679f874-8p2ck\" (UID: \"6b775346-d2f3-419f-95a2-2571413caefb\") " pod="openshift-monitoring/metrics-server-576679f874-8p2ck" Apr 22 15:18:17.908284 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:17.908169 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/6b775346-d2f3-419f-95a2-2571413caefb-secret-metrics-server-tls\") pod \"metrics-server-576679f874-8p2ck\" (UID: \"6b775346-d2f3-419f-95a2-2571413caefb\") " pod="openshift-monitoring/metrics-server-576679f874-8p2ck" Apr 22 15:18:17.908284 
ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:17.908198 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b775346-d2f3-419f-95a2-2571413caefb-client-ca-bundle\") pod \"metrics-server-576679f874-8p2ck\" (UID: \"6b775346-d2f3-419f-95a2-2571413caefb\") " pod="openshift-monitoring/metrics-server-576679f874-8p2ck" Apr 22 15:18:17.908284 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:17.908231 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7z5g\" (UniqueName: \"kubernetes.io/projected/6b775346-d2f3-419f-95a2-2571413caefb-kube-api-access-f7z5g\") pod \"metrics-server-576679f874-8p2ck\" (UID: \"6b775346-d2f3-419f-95a2-2571413caefb\") " pod="openshift-monitoring/metrics-server-576679f874-8p2ck" Apr 22 15:18:18.009070 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:18.009031 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b775346-d2f3-419f-95a2-2571413caefb-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-576679f874-8p2ck\" (UID: \"6b775346-d2f3-419f-95a2-2571413caefb\") " pod="openshift-monitoring/metrics-server-576679f874-8p2ck" Apr 22 15:18:18.009250 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:18.009091 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/6b775346-d2f3-419f-95a2-2571413caefb-audit-log\") pod \"metrics-server-576679f874-8p2ck\" (UID: \"6b775346-d2f3-419f-95a2-2571413caefb\") " pod="openshift-monitoring/metrics-server-576679f874-8p2ck" Apr 22 15:18:18.009250 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:18.009164 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: 
\"kubernetes.io/secret/6b775346-d2f3-419f-95a2-2571413caefb-secret-metrics-server-client-certs\") pod \"metrics-server-576679f874-8p2ck\" (UID: \"6b775346-d2f3-419f-95a2-2571413caefb\") " pod="openshift-monitoring/metrics-server-576679f874-8p2ck" Apr 22 15:18:18.009250 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:18.009191 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/6b775346-d2f3-419f-95a2-2571413caefb-secret-metrics-server-tls\") pod \"metrics-server-576679f874-8p2ck\" (UID: \"6b775346-d2f3-419f-95a2-2571413caefb\") " pod="openshift-monitoring/metrics-server-576679f874-8p2ck" Apr 22 15:18:18.009250 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:18.009219 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b775346-d2f3-419f-95a2-2571413caefb-client-ca-bundle\") pod \"metrics-server-576679f874-8p2ck\" (UID: \"6b775346-d2f3-419f-95a2-2571413caefb\") " pod="openshift-monitoring/metrics-server-576679f874-8p2ck" Apr 22 15:18:18.009250 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:18.009244 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7z5g\" (UniqueName: \"kubernetes.io/projected/6b775346-d2f3-419f-95a2-2571413caefb-kube-api-access-f7z5g\") pod \"metrics-server-576679f874-8p2ck\" (UID: \"6b775346-d2f3-419f-95a2-2571413caefb\") " pod="openshift-monitoring/metrics-server-576679f874-8p2ck" Apr 22 15:18:18.009544 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:18.009275 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/6b775346-d2f3-419f-95a2-2571413caefb-metrics-server-audit-profiles\") pod \"metrics-server-576679f874-8p2ck\" (UID: \"6b775346-d2f3-419f-95a2-2571413caefb\") " 
pod="openshift-monitoring/metrics-server-576679f874-8p2ck" Apr 22 15:18:18.009731 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:18.009706 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/6b775346-d2f3-419f-95a2-2571413caefb-audit-log\") pod \"metrics-server-576679f874-8p2ck\" (UID: \"6b775346-d2f3-419f-95a2-2571413caefb\") " pod="openshift-monitoring/metrics-server-576679f874-8p2ck" Apr 22 15:18:18.010268 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:18.010242 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b775346-d2f3-419f-95a2-2571413caefb-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-576679f874-8p2ck\" (UID: \"6b775346-d2f3-419f-95a2-2571413caefb\") " pod="openshift-monitoring/metrics-server-576679f874-8p2ck" Apr 22 15:18:18.010364 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:18.010282 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/6b775346-d2f3-419f-95a2-2571413caefb-metrics-server-audit-profiles\") pod \"metrics-server-576679f874-8p2ck\" (UID: \"6b775346-d2f3-419f-95a2-2571413caefb\") " pod="openshift-monitoring/metrics-server-576679f874-8p2ck" Apr 22 15:18:18.012274 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:18.012253 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b775346-d2f3-419f-95a2-2571413caefb-client-ca-bundle\") pod \"metrics-server-576679f874-8p2ck\" (UID: \"6b775346-d2f3-419f-95a2-2571413caefb\") " pod="openshift-monitoring/metrics-server-576679f874-8p2ck" Apr 22 15:18:18.012365 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:18.012348 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: 
\"kubernetes.io/secret/6b775346-d2f3-419f-95a2-2571413caefb-secret-metrics-server-client-certs\") pod \"metrics-server-576679f874-8p2ck\" (UID: \"6b775346-d2f3-419f-95a2-2571413caefb\") " pod="openshift-monitoring/metrics-server-576679f874-8p2ck" Apr 22 15:18:18.012819 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:18.012790 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/6b775346-d2f3-419f-95a2-2571413caefb-secret-metrics-server-tls\") pod \"metrics-server-576679f874-8p2ck\" (UID: \"6b775346-d2f3-419f-95a2-2571413caefb\") " pod="openshift-monitoring/metrics-server-576679f874-8p2ck" Apr 22 15:18:18.016838 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:18.016807 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7z5g\" (UniqueName: \"kubernetes.io/projected/6b775346-d2f3-419f-95a2-2571413caefb-kube-api-access-f7z5g\") pod \"metrics-server-576679f874-8p2ck\" (UID: \"6b775346-d2f3-419f-95a2-2571413caefb\") " pod="openshift-monitoring/metrics-server-576679f874-8p2ck" Apr 22 15:18:18.147370 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:18.147263 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-576679f874-8p2ck" Apr 22 15:18:18.715173 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:18.715150 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-576679f874-8p2ck"] Apr 22 15:18:18.717651 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:18:18.717621 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b775346_d2f3_419f_95a2_2571413caefb.slice/crio-de1c8c6e7b543f970c469fb3434355aeff8ae347a32d20299b551bcd91f2f1d0 WatchSource:0}: Error finding container de1c8c6e7b543f970c469fb3434355aeff8ae347a32d20299b551bcd91f2f1d0: Status 404 returned error can't find the container with id de1c8c6e7b543f970c469fb3434355aeff8ae347a32d20299b551bcd91f2f1d0 Apr 22 15:18:19.239010 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:19.238917 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-79b5647b94-kphgj" event={"ID":"fcad5d89-8252-4742-a4c2-55888f7bb135","Type":"ContainerStarted","Data":"dd11173315d8c5313a0b50e2661ee66319079043be4435c21e552c699c69c81e"} Apr 22 15:18:19.239010 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:19.238971 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-79b5647b94-kphgj" event={"ID":"fcad5d89-8252-4742-a4c2-55888f7bb135","Type":"ContainerStarted","Data":"2c654e589541ebff5c3e23298d0b9d39d7dd2c7a0d3e0d892a78802418bb1191"} Apr 22 15:18:19.239010 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:19.238987 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-79b5647b94-kphgj" event={"ID":"fcad5d89-8252-4742-a4c2-55888f7bb135","Type":"ContainerStarted","Data":"4b71c9577af9cffcdd93241952c08becc4178da97d7102e3d79bfd0eddf6942a"} Apr 22 15:18:19.240574 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:19.240528 2559 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/metrics-server-576679f874-8p2ck" event={"ID":"6b775346-d2f3-419f-95a2-2571413caefb","Type":"ContainerStarted","Data":"de1c8c6e7b543f970c469fb3434355aeff8ae347a32d20299b551bcd91f2f1d0"} Apr 22 15:18:20.244916 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:20.244819 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-576679f874-8p2ck" event={"ID":"6b775346-d2f3-419f-95a2-2571413caefb","Type":"ContainerStarted","Data":"b6c65ed424fbd25853f472c5cb8699aca9b5f11472d61bd663c806d6209d2090"} Apr 22 15:18:20.247658 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:20.247627 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-79b5647b94-kphgj" event={"ID":"fcad5d89-8252-4742-a4c2-55888f7bb135","Type":"ContainerStarted","Data":"2038f663d6641797161dd7a18c9ebd5099457075281908139db4273349e2a79a"} Apr 22 15:18:20.247658 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:20.247656 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-79b5647b94-kphgj" event={"ID":"fcad5d89-8252-4742-a4c2-55888f7bb135","Type":"ContainerStarted","Data":"38eab4ea65be96019fa7ce5b17e93f95dbfaf3af0bdfc2b663fcd739f593d78e"} Apr 22 15:18:20.247814 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:20.247665 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-79b5647b94-kphgj" event={"ID":"fcad5d89-8252-4742-a4c2-55888f7bb135","Type":"ContainerStarted","Data":"edcc856314e6e0f881378d1a8536faf96f0fa40deabefffe9a3feb7337d296fd"} Apr 22 15:18:20.247814 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:20.247796 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-79b5647b94-kphgj" Apr 22 15:18:20.262787 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:20.262737 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/metrics-server-576679f874-8p2ck" podStartSLOduration=2.007197194 podStartE2EDuration="3.262721124s" podCreationTimestamp="2026-04-22 15:18:17 +0000 UTC" firstStartedPulling="2026-04-22 15:18:18.719374005 +0000 UTC m=+572.579255300" lastFinishedPulling="2026-04-22 15:18:19.974897923 +0000 UTC m=+573.834779230" observedRunningTime="2026-04-22 15:18:20.262198203 +0000 UTC m=+574.122079519" watchObservedRunningTime="2026-04-22 15:18:20.262721124 +0000 UTC m=+574.122602441" Apr 22 15:18:20.282730 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:20.282686 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-79b5647b94-kphgj" podStartSLOduration=1.7719139240000001 podStartE2EDuration="4.282671549s" podCreationTimestamp="2026-04-22 15:18:16 +0000 UTC" firstStartedPulling="2026-04-22 15:18:16.861789978 +0000 UTC m=+570.721671271" lastFinishedPulling="2026-04-22 15:18:19.372547596 +0000 UTC m=+573.232428896" observedRunningTime="2026-04-22 15:18:20.281779403 +0000 UTC m=+574.141660730" watchObservedRunningTime="2026-04-22 15:18:20.282671549 +0000 UTC m=+574.142552864" Apr 22 15:18:26.256720 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:26.256689 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-79b5647b94-kphgj" Apr 22 15:18:27.191788 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:27.191758 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-f6dccbfd7-k4p6p" Apr 22 15:18:38.148317 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:38.148273 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-576679f874-8p2ck" Apr 22 15:18:38.148317 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:38.148317 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-monitoring/metrics-server-576679f874-8p2ck" Apr 22 15:18:46.621689 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:46.621659 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4bfdn_34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee/ovn-acl-logging/0.log" Apr 22 15:18:46.622111 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:46.621780 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4bfdn_34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee/ovn-acl-logging/0.log" Apr 22 15:18:58.153434 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:58.153405 2559 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-576679f874-8p2ck" Apr 22 15:18:58.157109 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:18:58.157089 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-576679f874-8p2ck" Apr 22 15:20:33.123128 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:20:33.123094 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-76ngx"] Apr 22 15:20:33.125561 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:20:33.125540 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-76ngx" Apr 22 15:20:33.128032 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:20:33.128013 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 15:20:33.138507 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:20:33.136273 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-76ngx"] Apr 22 15:20:33.234628 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:20:33.234601 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/33bb40fc-fef3-4c17-a146-bb0689189a67-kubelet-config\") pod \"global-pull-secret-syncer-76ngx\" (UID: \"33bb40fc-fef3-4c17-a146-bb0689189a67\") " pod="kube-system/global-pull-secret-syncer-76ngx" Apr 22 15:20:33.234765 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:20:33.234636 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/33bb40fc-fef3-4c17-a146-bb0689189a67-original-pull-secret\") pod \"global-pull-secret-syncer-76ngx\" (UID: \"33bb40fc-fef3-4c17-a146-bb0689189a67\") " pod="kube-system/global-pull-secret-syncer-76ngx" Apr 22 15:20:33.234765 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:20:33.234730 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/33bb40fc-fef3-4c17-a146-bb0689189a67-dbus\") pod \"global-pull-secret-syncer-76ngx\" (UID: \"33bb40fc-fef3-4c17-a146-bb0689189a67\") " pod="kube-system/global-pull-secret-syncer-76ngx" Apr 22 15:20:33.335511 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:20:33.335467 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/33bb40fc-fef3-4c17-a146-bb0689189a67-dbus\") pod \"global-pull-secret-syncer-76ngx\" (UID: \"33bb40fc-fef3-4c17-a146-bb0689189a67\") " pod="kube-system/global-pull-secret-syncer-76ngx" Apr 22 15:20:33.335644 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:20:33.335531 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/33bb40fc-fef3-4c17-a146-bb0689189a67-kubelet-config\") pod \"global-pull-secret-syncer-76ngx\" (UID: \"33bb40fc-fef3-4c17-a146-bb0689189a67\") " pod="kube-system/global-pull-secret-syncer-76ngx" Apr 22 15:20:33.335644 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:20:33.335553 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/33bb40fc-fef3-4c17-a146-bb0689189a67-original-pull-secret\") pod \"global-pull-secret-syncer-76ngx\" (UID: \"33bb40fc-fef3-4c17-a146-bb0689189a67\") " pod="kube-system/global-pull-secret-syncer-76ngx" Apr 22 15:20:33.335711 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:20:33.335643 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/33bb40fc-fef3-4c17-a146-bb0689189a67-dbus\") pod \"global-pull-secret-syncer-76ngx\" (UID: \"33bb40fc-fef3-4c17-a146-bb0689189a67\") " pod="kube-system/global-pull-secret-syncer-76ngx" Apr 22 15:20:33.335711 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:20:33.335646 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/33bb40fc-fef3-4c17-a146-bb0689189a67-kubelet-config\") pod \"global-pull-secret-syncer-76ngx\" (UID: \"33bb40fc-fef3-4c17-a146-bb0689189a67\") " pod="kube-system/global-pull-secret-syncer-76ngx" Apr 22 15:20:33.337698 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:20:33.337672 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/33bb40fc-fef3-4c17-a146-bb0689189a67-original-pull-secret\") pod \"global-pull-secret-syncer-76ngx\" (UID: \"33bb40fc-fef3-4c17-a146-bb0689189a67\") " pod="kube-system/global-pull-secret-syncer-76ngx" Apr 22 15:20:33.434867 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:20:33.434798 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-76ngx" Apr 22 15:20:33.547146 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:20:33.547119 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-76ngx"] Apr 22 15:20:33.550309 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:20:33.550279 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33bb40fc_fef3_4c17_a146_bb0689189a67.slice/crio-7c3e5d7696e3ac373f887f53029344fe1474c15702828a0347d835f4d389efc1 WatchSource:0}: Error finding container 7c3e5d7696e3ac373f887f53029344fe1474c15702828a0347d835f4d389efc1: Status 404 returned error can't find the container with id 7c3e5d7696e3ac373f887f53029344fe1474c15702828a0347d835f4d389efc1 Apr 22 15:20:33.614673 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:20:33.614643 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-76ngx" event={"ID":"33bb40fc-fef3-4c17-a146-bb0689189a67","Type":"ContainerStarted","Data":"7c3e5d7696e3ac373f887f53029344fe1474c15702828a0347d835f4d389efc1"} Apr 22 15:20:38.632264 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:20:38.632228 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-76ngx" event={"ID":"33bb40fc-fef3-4c17-a146-bb0689189a67","Type":"ContainerStarted","Data":"8a396a4f0c89183c53ef625382d25951cf918c54e8657b1e8f4ac3be48712436"} Apr 22 15:20:38.653599 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:20:38.653538 2559 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-76ngx" podStartSLOduration=1.649972808 podStartE2EDuration="5.653522952s" podCreationTimestamp="2026-04-22 15:20:33 +0000 UTC" firstStartedPulling="2026-04-22 15:20:33.551931262 +0000 UTC m=+707.411812556" lastFinishedPulling="2026-04-22 15:20:37.555481388 +0000 UTC m=+711.415362700" observedRunningTime="2026-04-22 15:20:38.651784032 +0000 UTC m=+712.511665347" watchObservedRunningTime="2026-04-22 15:20:38.653522952 +0000 UTC m=+712.513404266" Apr 22 15:22:15.366867 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:15.366789 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djxpcj"] Apr 22 15:22:15.369893 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:15.369876 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djxpcj" Apr 22 15:22:15.372945 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:15.372926 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 15:22:15.373933 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:15.373907 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-trsvj\"" Apr 22 15:22:15.374016 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:15.373907 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 15:22:15.386937 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:15.386914 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djxpcj"] Apr 22 15:22:15.503392 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:15.503354 2559 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p56d\" (UniqueName: \"kubernetes.io/projected/940598c6-8f7d-42c9-8b52-428b8b6702af-kube-api-access-5p56d\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djxpcj\" (UID: \"940598c6-8f7d-42c9-8b52-428b8b6702af\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djxpcj" Apr 22 15:22:15.503589 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:15.503413 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/940598c6-8f7d-42c9-8b52-428b8b6702af-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djxpcj\" (UID: \"940598c6-8f7d-42c9-8b52-428b8b6702af\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djxpcj" Apr 22 15:22:15.503589 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:15.503504 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/940598c6-8f7d-42c9-8b52-428b8b6702af-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djxpcj\" (UID: \"940598c6-8f7d-42c9-8b52-428b8b6702af\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djxpcj" Apr 22 15:22:15.603879 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:15.603846 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/940598c6-8f7d-42c9-8b52-428b8b6702af-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djxpcj\" (UID: \"940598c6-8f7d-42c9-8b52-428b8b6702af\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djxpcj" Apr 22 15:22:15.604051 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:15.603898 2559 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/940598c6-8f7d-42c9-8b52-428b8b6702af-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djxpcj\" (UID: \"940598c6-8f7d-42c9-8b52-428b8b6702af\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djxpcj" Apr 22 15:22:15.604051 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:15.603929 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5p56d\" (UniqueName: \"kubernetes.io/projected/940598c6-8f7d-42c9-8b52-428b8b6702af-kube-api-access-5p56d\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djxpcj\" (UID: \"940598c6-8f7d-42c9-8b52-428b8b6702af\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djxpcj" Apr 22 15:22:15.604279 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:15.604256 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/940598c6-8f7d-42c9-8b52-428b8b6702af-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djxpcj\" (UID: \"940598c6-8f7d-42c9-8b52-428b8b6702af\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djxpcj" Apr 22 15:22:15.604352 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:15.604289 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/940598c6-8f7d-42c9-8b52-428b8b6702af-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djxpcj\" (UID: \"940598c6-8f7d-42c9-8b52-428b8b6702af\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djxpcj" Apr 22 15:22:15.612836 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:15.612812 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p56d\" 
(UniqueName: \"kubernetes.io/projected/940598c6-8f7d-42c9-8b52-428b8b6702af-kube-api-access-5p56d\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djxpcj\" (UID: \"940598c6-8f7d-42c9-8b52-428b8b6702af\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djxpcj" Apr 22 15:22:15.679152 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:15.679098 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djxpcj" Apr 22 15:22:15.795594 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:15.795569 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djxpcj"] Apr 22 15:22:15.797957 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:22:15.797930 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod940598c6_8f7d_42c9_8b52_428b8b6702af.slice/crio-1510927c6fd5f1f59340393b214ce4b25601ece058e0e732e9f27925cc15d61f WatchSource:0}: Error finding container 1510927c6fd5f1f59340393b214ce4b25601ece058e0e732e9f27925cc15d61f: Status 404 returned error can't find the container with id 1510927c6fd5f1f59340393b214ce4b25601ece058e0e732e9f27925cc15d61f Apr 22 15:22:15.899635 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:15.899593 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djxpcj" event={"ID":"940598c6-8f7d-42c9-8b52-428b8b6702af","Type":"ContainerStarted","Data":"1510927c6fd5f1f59340393b214ce4b25601ece058e0e732e9f27925cc15d61f"} Apr 22 15:22:21.918882 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:21.918850 2559 generic.go:358] "Generic (PLEG): container finished" podID="940598c6-8f7d-42c9-8b52-428b8b6702af" containerID="f6ffa73f828af6de279075c57e85495395f9be26048ca4e38b0ad30f3491d7bc" 
exitCode=0 Apr 22 15:22:21.919286 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:21.918904 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djxpcj" event={"ID":"940598c6-8f7d-42c9-8b52-428b8b6702af","Type":"ContainerDied","Data":"f6ffa73f828af6de279075c57e85495395f9be26048ca4e38b0ad30f3491d7bc"} Apr 22 15:22:23.926938 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:23.926912 2559 generic.go:358] "Generic (PLEG): container finished" podID="940598c6-8f7d-42c9-8b52-428b8b6702af" containerID="1db319dc04d348de49070f8cdacbd83a3900ea2abbaef2711d2a31c3c38c92c9" exitCode=0 Apr 22 15:22:23.927234 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:23.926989 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djxpcj" event={"ID":"940598c6-8f7d-42c9-8b52-428b8b6702af","Type":"ContainerDied","Data":"1db319dc04d348de49070f8cdacbd83a3900ea2abbaef2711d2a31c3c38c92c9"} Apr 22 15:22:32.958710 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:32.958669 2559 generic.go:358] "Generic (PLEG): container finished" podID="940598c6-8f7d-42c9-8b52-428b8b6702af" containerID="6a1f0093f6df93e26f9a3f077916031ed72bf29ff95fe58e46310aaf32f5165a" exitCode=0 Apr 22 15:22:32.959080 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:32.958738 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djxpcj" event={"ID":"940598c6-8f7d-42c9-8b52-428b8b6702af","Type":"ContainerDied","Data":"6a1f0093f6df93e26f9a3f077916031ed72bf29ff95fe58e46310aaf32f5165a"} Apr 22 15:22:34.076336 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:34.076314 2559 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djxpcj" Apr 22 15:22:34.153806 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:34.153774 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5p56d\" (UniqueName: \"kubernetes.io/projected/940598c6-8f7d-42c9-8b52-428b8b6702af-kube-api-access-5p56d\") pod \"940598c6-8f7d-42c9-8b52-428b8b6702af\" (UID: \"940598c6-8f7d-42c9-8b52-428b8b6702af\") " Apr 22 15:22:34.153957 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:34.153850 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/940598c6-8f7d-42c9-8b52-428b8b6702af-bundle\") pod \"940598c6-8f7d-42c9-8b52-428b8b6702af\" (UID: \"940598c6-8f7d-42c9-8b52-428b8b6702af\") " Apr 22 15:22:34.153957 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:34.153876 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/940598c6-8f7d-42c9-8b52-428b8b6702af-util\") pod \"940598c6-8f7d-42c9-8b52-428b8b6702af\" (UID: \"940598c6-8f7d-42c9-8b52-428b8b6702af\") " Apr 22 15:22:34.154538 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:34.154511 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/940598c6-8f7d-42c9-8b52-428b8b6702af-bundle" (OuterVolumeSpecName: "bundle") pod "940598c6-8f7d-42c9-8b52-428b8b6702af" (UID: "940598c6-8f7d-42c9-8b52-428b8b6702af"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 15:22:34.155928 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:34.155901 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/940598c6-8f7d-42c9-8b52-428b8b6702af-kube-api-access-5p56d" (OuterVolumeSpecName: "kube-api-access-5p56d") pod "940598c6-8f7d-42c9-8b52-428b8b6702af" (UID: "940598c6-8f7d-42c9-8b52-428b8b6702af"). InnerVolumeSpecName "kube-api-access-5p56d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:22:34.159370 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:34.159344 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/940598c6-8f7d-42c9-8b52-428b8b6702af-util" (OuterVolumeSpecName: "util") pod "940598c6-8f7d-42c9-8b52-428b8b6702af" (UID: "940598c6-8f7d-42c9-8b52-428b8b6702af"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 15:22:34.255108 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:34.255020 2559 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/940598c6-8f7d-42c9-8b52-428b8b6702af-bundle\") on node \"ip-10-0-141-167.ec2.internal\" DevicePath \"\"" Apr 22 15:22:34.255108 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:34.255059 2559 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/940598c6-8f7d-42c9-8b52-428b8b6702af-util\") on node \"ip-10-0-141-167.ec2.internal\" DevicePath \"\"" Apr 22 15:22:34.255108 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:34.255070 2559 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5p56d\" (UniqueName: \"kubernetes.io/projected/940598c6-8f7d-42c9-8b52-428b8b6702af-kube-api-access-5p56d\") on node \"ip-10-0-141-167.ec2.internal\" DevicePath \"\"" Apr 22 15:22:34.965852 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:34.965814 2559 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djxpcj" event={"ID":"940598c6-8f7d-42c9-8b52-428b8b6702af","Type":"ContainerDied","Data":"1510927c6fd5f1f59340393b214ce4b25601ece058e0e732e9f27925cc15d61f"} Apr 22 15:22:34.965852 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:34.965856 2559 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1510927c6fd5f1f59340393b214ce4b25601ece058e0e732e9f27925cc15d61f" Apr 22 15:22:34.966093 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:34.965829 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19djxpcj" Apr 22 15:22:38.404117 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:38.404091 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-9kt6n"] Apr 22 15:22:38.404506 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:38.404362 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="940598c6-8f7d-42c9-8b52-428b8b6702af" containerName="util" Apr 22 15:22:38.404506 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:38.404372 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="940598c6-8f7d-42c9-8b52-428b8b6702af" containerName="util" Apr 22 15:22:38.404506 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:38.404395 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="940598c6-8f7d-42c9-8b52-428b8b6702af" containerName="extract" Apr 22 15:22:38.404506 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:38.404400 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="940598c6-8f7d-42c9-8b52-428b8b6702af" containerName="extract" Apr 22 15:22:38.404506 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:38.404407 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="940598c6-8f7d-42c9-8b52-428b8b6702af" containerName="pull" Apr 22 15:22:38.404506 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:38.404412 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="940598c6-8f7d-42c9-8b52-428b8b6702af" containerName="pull" Apr 22 15:22:38.404506 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:38.404454 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="940598c6-8f7d-42c9-8b52-428b8b6702af" containerName="extract" Apr 22 15:22:38.453905 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:38.453867 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-9kt6n"] Apr 22 15:22:38.454047 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:38.453986 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-9kt6n" Apr 22 15:22:38.459729 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:38.459704 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 22 15:22:38.459872 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:38.459745 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 22 15:22:38.459994 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:38.459976 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-7n45b\"" Apr 22 15:22:38.485936 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:38.485904 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c67a5ebd-7fd8-4a89-909c-f53bbc002fc4-tmp\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-9kt6n\" (UID: 
\"c67a5ebd-7fd8-4a89-909c-f53bbc002fc4\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-9kt6n" Apr 22 15:22:38.486065 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:38.485942 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd96t\" (UniqueName: \"kubernetes.io/projected/c67a5ebd-7fd8-4a89-909c-f53bbc002fc4-kube-api-access-hd96t\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-9kt6n\" (UID: \"c67a5ebd-7fd8-4a89-909c-f53bbc002fc4\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-9kt6n" Apr 22 15:22:38.586247 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:38.586217 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c67a5ebd-7fd8-4a89-909c-f53bbc002fc4-tmp\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-9kt6n\" (UID: \"c67a5ebd-7fd8-4a89-909c-f53bbc002fc4\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-9kt6n" Apr 22 15:22:38.586374 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:38.586259 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hd96t\" (UniqueName: \"kubernetes.io/projected/c67a5ebd-7fd8-4a89-909c-f53bbc002fc4-kube-api-access-hd96t\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-9kt6n\" (UID: \"c67a5ebd-7fd8-4a89-909c-f53bbc002fc4\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-9kt6n" Apr 22 15:22:38.586593 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:38.586574 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c67a5ebd-7fd8-4a89-909c-f53bbc002fc4-tmp\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-9kt6n\" (UID: \"c67a5ebd-7fd8-4a89-909c-f53bbc002fc4\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-9kt6n" Apr 22 15:22:38.595668 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:38.595638 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd96t\" (UniqueName: \"kubernetes.io/projected/c67a5ebd-7fd8-4a89-909c-f53bbc002fc4-kube-api-access-hd96t\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-9kt6n\" (UID: \"c67a5ebd-7fd8-4a89-909c-f53bbc002fc4\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-9kt6n" Apr 22 15:22:38.762684 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:38.762651 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-9kt6n" Apr 22 15:22:38.897570 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:38.897541 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-9kt6n"] Apr 22 15:22:38.899548 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:22:38.899517 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc67a5ebd_7fd8_4a89_909c_f53bbc002fc4.slice/crio-4ce17f462a676f95c85d7d163d02e13c9bf32b8ff8489939ab9e0287bd89379b WatchSource:0}: Error finding container 4ce17f462a676f95c85d7d163d02e13c9bf32b8ff8489939ab9e0287bd89379b: Status 404 returned error can't find the container with id 4ce17f462a676f95c85d7d163d02e13c9bf32b8ff8489939ab9e0287bd89379b Apr 22 15:22:38.978857 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:38.978821 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-9kt6n" event={"ID":"c67a5ebd-7fd8-4a89-909c-f53bbc002fc4","Type":"ContainerStarted","Data":"4ce17f462a676f95c85d7d163d02e13c9bf32b8ff8489939ab9e0287bd89379b"} Apr 22 15:22:41.991227 
ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:41.991190 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-9kt6n" event={"ID":"c67a5ebd-7fd8-4a89-909c-f53bbc002fc4","Type":"ContainerStarted","Data":"5296c635bdd5cbf1882b0c60c4dfad7a68bf744bf80db7606a67c65e5e10c70d"} Apr 22 15:22:42.012867 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:22:42.012822 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-9kt6n" podStartSLOduration=1.886230878 podStartE2EDuration="4.012806179s" podCreationTimestamp="2026-04-22 15:22:38 +0000 UTC" firstStartedPulling="2026-04-22 15:22:38.901942288 +0000 UTC m=+832.761823581" lastFinishedPulling="2026-04-22 15:22:41.028517584 +0000 UTC m=+834.888398882" observedRunningTime="2026-04-22 15:22:42.01223608 +0000 UTC m=+835.872117395" watchObservedRunningTime="2026-04-22 15:22:42.012806179 +0000 UTC m=+835.872687493" Apr 22 15:23:02.225313 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:02.225272 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-4zhq7"] Apr 22 15:23:02.228999 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:02.228981 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-4zhq7" Apr 22 15:23:02.270539 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:02.270512 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-4zhq7"] Apr 22 15:23:02.283884 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:02.283858 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 22 15:23:02.285083 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:02.285063 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 22 15:23:02.285763 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:02.285744 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-85kwh\"" Apr 22 15:23:02.389880 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:02.389845 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kssj5\" (UniqueName: \"kubernetes.io/projected/0edf6c9c-cce2-4742-a911-f47476d063ee-kube-api-access-kssj5\") pod \"cert-manager-79c8d999ff-4zhq7\" (UID: \"0edf6c9c-cce2-4742-a911-f47476d063ee\") " pod="cert-manager/cert-manager-79c8d999ff-4zhq7" Apr 22 15:23:02.389880 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:02.389881 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0edf6c9c-cce2-4742-a911-f47476d063ee-bound-sa-token\") pod \"cert-manager-79c8d999ff-4zhq7\" (UID: \"0edf6c9c-cce2-4742-a911-f47476d063ee\") " pod="cert-manager/cert-manager-79c8d999ff-4zhq7" Apr 22 15:23:02.490371 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:02.490281 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kssj5\" (UniqueName: 
\"kubernetes.io/projected/0edf6c9c-cce2-4742-a911-f47476d063ee-kube-api-access-kssj5\") pod \"cert-manager-79c8d999ff-4zhq7\" (UID: \"0edf6c9c-cce2-4742-a911-f47476d063ee\") " pod="cert-manager/cert-manager-79c8d999ff-4zhq7" Apr 22 15:23:02.490371 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:02.490321 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0edf6c9c-cce2-4742-a911-f47476d063ee-bound-sa-token\") pod \"cert-manager-79c8d999ff-4zhq7\" (UID: \"0edf6c9c-cce2-4742-a911-f47476d063ee\") " pod="cert-manager/cert-manager-79c8d999ff-4zhq7" Apr 22 15:23:02.498700 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:02.498671 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0edf6c9c-cce2-4742-a911-f47476d063ee-bound-sa-token\") pod \"cert-manager-79c8d999ff-4zhq7\" (UID: \"0edf6c9c-cce2-4742-a911-f47476d063ee\") " pod="cert-manager/cert-manager-79c8d999ff-4zhq7" Apr 22 15:23:02.498904 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:02.498886 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kssj5\" (UniqueName: \"kubernetes.io/projected/0edf6c9c-cce2-4742-a911-f47476d063ee-kube-api-access-kssj5\") pod \"cert-manager-79c8d999ff-4zhq7\" (UID: \"0edf6c9c-cce2-4742-a911-f47476d063ee\") " pod="cert-manager/cert-manager-79c8d999ff-4zhq7" Apr 22 15:23:02.549966 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:02.549916 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-4zhq7" Apr 22 15:23:02.670501 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:02.670436 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-4zhq7"] Apr 22 15:23:02.672920 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:23:02.672893 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0edf6c9c_cce2_4742_a911_f47476d063ee.slice/crio-213ad2682267b21ad765f5667fc0e5f3128c2fd4a290540f47b436c8fbd463c2 WatchSource:0}: Error finding container 213ad2682267b21ad765f5667fc0e5f3128c2fd4a290540f47b436c8fbd463c2: Status 404 returned error can't find the container with id 213ad2682267b21ad765f5667fc0e5f3128c2fd4a290540f47b436c8fbd463c2 Apr 22 15:23:03.054773 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:03.054737 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-4zhq7" event={"ID":"0edf6c9c-cce2-4742-a911-f47476d063ee","Type":"ContainerStarted","Data":"213ad2682267b21ad765f5667fc0e5f3128c2fd4a290540f47b436c8fbd463c2"} Apr 22 15:23:06.067035 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:06.066997 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-4zhq7" event={"ID":"0edf6c9c-cce2-4742-a911-f47476d063ee","Type":"ContainerStarted","Data":"75561d83ce019091c4021816cbe596fc5c0a3816340cad2dabb35f0d38096f54"} Apr 22 15:23:06.083922 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:06.083810 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-4zhq7" podStartSLOduration=1.127260653 podStartE2EDuration="4.08379479s" podCreationTimestamp="2026-04-22 15:23:02 +0000 UTC" firstStartedPulling="2026-04-22 15:23:02.675154557 +0000 UTC m=+856.535035851" lastFinishedPulling="2026-04-22 15:23:05.631688679 +0000 UTC m=+859.491569988" 
observedRunningTime="2026-04-22 15:23:06.083227325 +0000 UTC m=+859.943108641" watchObservedRunningTime="2026-04-22 15:23:06.08379479 +0000 UTC m=+859.943676106" Apr 22 15:23:06.511133 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:06.511100 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78efxm8b"] Apr 22 15:23:06.513687 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:06.513669 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78efxm8b" Apr 22 15:23:06.518411 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:06.518387 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 15:23:06.518553 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:06.518536 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 15:23:06.519193 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:06.519172 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-trsvj\"" Apr 22 15:23:06.525554 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:06.525534 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78efxm8b"] Apr 22 15:23:06.623233 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:06.623205 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf0b1819-d932-48a0-a732-6a7db51b62be-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78efxm8b\" (UID: \"bf0b1819-d932-48a0-a732-6a7db51b62be\") " 
pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78efxm8b" Apr 22 15:23:06.623373 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:06.623267 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cqtd\" (UniqueName: \"kubernetes.io/projected/bf0b1819-d932-48a0-a732-6a7db51b62be-kube-api-access-4cqtd\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78efxm8b\" (UID: \"bf0b1819-d932-48a0-a732-6a7db51b62be\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78efxm8b" Apr 22 15:23:06.623373 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:06.623291 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf0b1819-d932-48a0-a732-6a7db51b62be-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78efxm8b\" (UID: \"bf0b1819-d932-48a0-a732-6a7db51b62be\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78efxm8b" Apr 22 15:23:06.724598 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:06.724563 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4cqtd\" (UniqueName: \"kubernetes.io/projected/bf0b1819-d932-48a0-a732-6a7db51b62be-kube-api-access-4cqtd\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78efxm8b\" (UID: \"bf0b1819-d932-48a0-a732-6a7db51b62be\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78efxm8b" Apr 22 15:23:06.724775 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:06.724610 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf0b1819-d932-48a0-a732-6a7db51b62be-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78efxm8b\" (UID: \"bf0b1819-d932-48a0-a732-6a7db51b62be\") " 
pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78efxm8b" Apr 22 15:23:06.724775 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:06.724686 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf0b1819-d932-48a0-a732-6a7db51b62be-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78efxm8b\" (UID: \"bf0b1819-d932-48a0-a732-6a7db51b62be\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78efxm8b" Apr 22 15:23:06.725014 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:06.724988 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf0b1819-d932-48a0-a732-6a7db51b62be-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78efxm8b\" (UID: \"bf0b1819-d932-48a0-a732-6a7db51b62be\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78efxm8b" Apr 22 15:23:06.725059 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:06.725013 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf0b1819-d932-48a0-a732-6a7db51b62be-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78efxm8b\" (UID: \"bf0b1819-d932-48a0-a732-6a7db51b62be\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78efxm8b" Apr 22 15:23:06.732984 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:06.732956 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cqtd\" (UniqueName: \"kubernetes.io/projected/bf0b1819-d932-48a0-a732-6a7db51b62be-kube-api-access-4cqtd\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78efxm8b\" (UID: \"bf0b1819-d932-48a0-a732-6a7db51b62be\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78efxm8b" Apr 22 
15:23:06.823047 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:06.822978 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78efxm8b" Apr 22 15:23:06.937917 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:06.937879 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78efxm8b"] Apr 22 15:23:06.941108 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:23:06.941082 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf0b1819_d932_48a0_a732_6a7db51b62be.slice/crio-1bc5d988a276088f87837b4e64650e8d7934641253a46c4e43f08a57e8e1a2df WatchSource:0}: Error finding container 1bc5d988a276088f87837b4e64650e8d7934641253a46c4e43f08a57e8e1a2df: Status 404 returned error can't find the container with id 1bc5d988a276088f87837b4e64650e8d7934641253a46c4e43f08a57e8e1a2df Apr 22 15:23:06.942823 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:06.942808 2559 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 15:23:07.071120 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:07.071089 2559 generic.go:358] "Generic (PLEG): container finished" podID="bf0b1819-d932-48a0-a732-6a7db51b62be" containerID="1ce73cfcc1986d8f59977f907fa30d3816b0f6eb4c5a78c7c22c564fec15dec4" exitCode=0 Apr 22 15:23:07.071529 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:07.071171 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78efxm8b" event={"ID":"bf0b1819-d932-48a0-a732-6a7db51b62be","Type":"ContainerDied","Data":"1ce73cfcc1986d8f59977f907fa30d3816b0f6eb4c5a78c7c22c564fec15dec4"} Apr 22 15:23:07.071529 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:07.071204 2559 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78efxm8b" event={"ID":"bf0b1819-d932-48a0-a732-6a7db51b62be","Type":"ContainerStarted","Data":"1bc5d988a276088f87837b4e64650e8d7934641253a46c4e43f08a57e8e1a2df"} Apr 22 15:23:09.079280 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:09.079193 2559 generic.go:358] "Generic (PLEG): container finished" podID="bf0b1819-d932-48a0-a732-6a7db51b62be" containerID="73aa774e77d718f7c40db0b97e614bb3a7ffa7f1825b52882f0b715ab68d7f92" exitCode=0 Apr 22 15:23:09.079716 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:09.079280 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78efxm8b" event={"ID":"bf0b1819-d932-48a0-a732-6a7db51b62be","Type":"ContainerDied","Data":"73aa774e77d718f7c40db0b97e614bb3a7ffa7f1825b52882f0b715ab68d7f92"} Apr 22 15:23:10.084705 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:10.084671 2559 generic.go:358] "Generic (PLEG): container finished" podID="bf0b1819-d932-48a0-a732-6a7db51b62be" containerID="a905592844896637327efdef4e968c5f99849af6fe1ac193e01c94ee67614d2f" exitCode=0 Apr 22 15:23:10.085038 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:10.084746 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78efxm8b" event={"ID":"bf0b1819-d932-48a0-a732-6a7db51b62be","Type":"ContainerDied","Data":"a905592844896637327efdef4e968c5f99849af6fe1ac193e01c94ee67614d2f"} Apr 22 15:23:11.208564 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:11.208541 2559 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78efxm8b" Apr 22 15:23:11.363277 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:11.363186 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf0b1819-d932-48a0-a732-6a7db51b62be-bundle\") pod \"bf0b1819-d932-48a0-a732-6a7db51b62be\" (UID: \"bf0b1819-d932-48a0-a732-6a7db51b62be\") " Apr 22 15:23:11.363406 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:11.363279 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cqtd\" (UniqueName: \"kubernetes.io/projected/bf0b1819-d932-48a0-a732-6a7db51b62be-kube-api-access-4cqtd\") pod \"bf0b1819-d932-48a0-a732-6a7db51b62be\" (UID: \"bf0b1819-d932-48a0-a732-6a7db51b62be\") " Apr 22 15:23:11.363406 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:11.363324 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf0b1819-d932-48a0-a732-6a7db51b62be-util\") pod \"bf0b1819-d932-48a0-a732-6a7db51b62be\" (UID: \"bf0b1819-d932-48a0-a732-6a7db51b62be\") " Apr 22 15:23:11.363669 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:11.363644 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf0b1819-d932-48a0-a732-6a7db51b62be-bundle" (OuterVolumeSpecName: "bundle") pod "bf0b1819-d932-48a0-a732-6a7db51b62be" (UID: "bf0b1819-d932-48a0-a732-6a7db51b62be"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 15:23:11.365310 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:11.365282 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf0b1819-d932-48a0-a732-6a7db51b62be-kube-api-access-4cqtd" (OuterVolumeSpecName: "kube-api-access-4cqtd") pod "bf0b1819-d932-48a0-a732-6a7db51b62be" (UID: "bf0b1819-d932-48a0-a732-6a7db51b62be"). InnerVolumeSpecName "kube-api-access-4cqtd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:23:11.370392 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:11.370368 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf0b1819-d932-48a0-a732-6a7db51b62be-util" (OuterVolumeSpecName: "util") pod "bf0b1819-d932-48a0-a732-6a7db51b62be" (UID: "bf0b1819-d932-48a0-a732-6a7db51b62be"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 15:23:11.463851 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:11.463814 2559 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4cqtd\" (UniqueName: \"kubernetes.io/projected/bf0b1819-d932-48a0-a732-6a7db51b62be-kube-api-access-4cqtd\") on node \"ip-10-0-141-167.ec2.internal\" DevicePath \"\"" Apr 22 15:23:11.463851 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:11.463845 2559 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf0b1819-d932-48a0-a732-6a7db51b62be-util\") on node \"ip-10-0-141-167.ec2.internal\" DevicePath \"\"" Apr 22 15:23:11.464038 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:11.463859 2559 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf0b1819-d932-48a0-a732-6a7db51b62be-bundle\") on node \"ip-10-0-141-167.ec2.internal\" DevicePath \"\"" Apr 22 15:23:12.092402 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:12.092370 2559 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78efxm8b" event={"ID":"bf0b1819-d932-48a0-a732-6a7db51b62be","Type":"ContainerDied","Data":"1bc5d988a276088f87837b4e64650e8d7934641253a46c4e43f08a57e8e1a2df"} Apr 22 15:23:12.092402 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:12.092402 2559 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bc5d988a276088f87837b4e64650e8d7934641253a46c4e43f08a57e8e1a2df" Apr 22 15:23:12.092615 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:12.092422 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78efxm8b" Apr 22 15:23:18.340230 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:18.340196 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-jjsvm"] Apr 22 15:23:18.340628 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:18.340534 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf0b1819-d932-48a0-a732-6a7db51b62be" containerName="util" Apr 22 15:23:18.340628 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:18.340545 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf0b1819-d932-48a0-a732-6a7db51b62be" containerName="util" Apr 22 15:23:18.340628 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:18.340561 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf0b1819-d932-48a0-a732-6a7db51b62be" containerName="extract" Apr 22 15:23:18.340628 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:18.340566 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf0b1819-d932-48a0-a732-6a7db51b62be" containerName="extract" Apr 22 15:23:18.340628 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:18.340575 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="bf0b1819-d932-48a0-a732-6a7db51b62be" containerName="pull" Apr 22 15:23:18.340628 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:18.340580 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf0b1819-d932-48a0-a732-6a7db51b62be" containerName="pull" Apr 22 15:23:18.340811 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:18.340634 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="bf0b1819-d932-48a0-a732-6a7db51b62be" containerName="extract" Apr 22 15:23:18.343228 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:18.343210 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-jjsvm" Apr 22 15:23:18.345894 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:18.345842 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"kube-root-ca.crt\"" Apr 22 15:23:18.346031 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:18.346016 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"jobset-operator-dockercfg-hc8z5\"" Apr 22 15:23:18.346819 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:18.346803 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"openshift-service-ca.crt\"" Apr 22 15:23:18.354114 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:18.354085 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-jjsvm"] Apr 22 15:23:18.419120 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:18.419088 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/672711fe-41ee-42d5-b5be-01cf8ebf3933-tmp\") pod \"jobset-operator-747c5859c7-jjsvm\" (UID: \"672711fe-41ee-42d5-b5be-01cf8ebf3933\") " 
pod="openshift-jobset-operator/jobset-operator-747c5859c7-jjsvm" Apr 22 15:23:18.419284 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:18.419126 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wfn5\" (UniqueName: \"kubernetes.io/projected/672711fe-41ee-42d5-b5be-01cf8ebf3933-kube-api-access-9wfn5\") pod \"jobset-operator-747c5859c7-jjsvm\" (UID: \"672711fe-41ee-42d5-b5be-01cf8ebf3933\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-jjsvm" Apr 22 15:23:18.519569 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:18.519530 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/672711fe-41ee-42d5-b5be-01cf8ebf3933-tmp\") pod \"jobset-operator-747c5859c7-jjsvm\" (UID: \"672711fe-41ee-42d5-b5be-01cf8ebf3933\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-jjsvm" Apr 22 15:23:18.519569 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:18.519570 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9wfn5\" (UniqueName: \"kubernetes.io/projected/672711fe-41ee-42d5-b5be-01cf8ebf3933-kube-api-access-9wfn5\") pod \"jobset-operator-747c5859c7-jjsvm\" (UID: \"672711fe-41ee-42d5-b5be-01cf8ebf3933\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-jjsvm" Apr 22 15:23:18.519946 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:18.519926 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/672711fe-41ee-42d5-b5be-01cf8ebf3933-tmp\") pod \"jobset-operator-747c5859c7-jjsvm\" (UID: \"672711fe-41ee-42d5-b5be-01cf8ebf3933\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-jjsvm" Apr 22 15:23:18.528373 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:18.528344 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wfn5\" (UniqueName: 
\"kubernetes.io/projected/672711fe-41ee-42d5-b5be-01cf8ebf3933-kube-api-access-9wfn5\") pod \"jobset-operator-747c5859c7-jjsvm\" (UID: \"672711fe-41ee-42d5-b5be-01cf8ebf3933\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-jjsvm" Apr 22 15:23:18.652218 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:18.652144 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-jjsvm" Apr 22 15:23:18.777389 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:18.777356 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-jjsvm"] Apr 22 15:23:18.780873 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:23:18.780844 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod672711fe_41ee_42d5_b5be_01cf8ebf3933.slice/crio-a070ccecf6e04479cce0a34fedd3d98918d7f3e4f4908b608d36bda86994596c WatchSource:0}: Error finding container a070ccecf6e04479cce0a34fedd3d98918d7f3e4f4908b608d36bda86994596c: Status 404 returned error can't find the container with id a070ccecf6e04479cce0a34fedd3d98918d7f3e4f4908b608d36bda86994596c Apr 22 15:23:19.114309 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:19.114274 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-jjsvm" event={"ID":"672711fe-41ee-42d5-b5be-01cf8ebf3933","Type":"ContainerStarted","Data":"a070ccecf6e04479cce0a34fedd3d98918d7f3e4f4908b608d36bda86994596c"} Apr 22 15:23:21.123241 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:21.123194 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-jjsvm" event={"ID":"672711fe-41ee-42d5-b5be-01cf8ebf3933","Type":"ContainerStarted","Data":"7560a0f2ca468e9a4bd02521760742491054f82371908bde7475e03c23f9545b"} Apr 22 15:23:21.142543 ip-10-0-141-167 kubenswrapper[2559]: 
I0422 15:23:21.142474 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-jobset-operator/jobset-operator-747c5859c7-jjsvm" podStartSLOduration=1.296200163 podStartE2EDuration="3.142457844s" podCreationTimestamp="2026-04-22 15:23:18 +0000 UTC" firstStartedPulling="2026-04-22 15:23:18.782774972 +0000 UTC m=+872.642656264" lastFinishedPulling="2026-04-22 15:23:20.629032637 +0000 UTC m=+874.488913945" observedRunningTime="2026-04-22 15:23:21.141756349 +0000 UTC m=+875.001637664" watchObservedRunningTime="2026-04-22 15:23:21.142457844 +0000 UTC m=+875.002339160" Apr 22 15:23:29.529904 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:29.529868 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-5d86bd95b-82mcg"] Apr 22 15:23:29.532465 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:29.532446 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-5d86bd95b-82mcg" Apr 22 15:23:29.534867 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:29.534847 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"jobset-manager-config\"" Apr 22 15:23:29.535877 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:29.535859 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"metrics-server-cert\"" Apr 22 15:23:29.535993 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:29.535974 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"jobset-controller-manager-dockercfg-dgcgn\"" Apr 22 15:23:29.536083 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:29.536007 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"webhook-server-cert\"" Apr 22 15:23:29.541462 ip-10-0-141-167 kubenswrapper[2559]: I0422 
15:23:29.541443 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-5d86bd95b-82mcg"] Apr 22 15:23:29.607123 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:29.607094 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7kt8\" (UniqueName: \"kubernetes.io/projected/0ba5e2f3-2767-40eb-8524-bdf904ee150e-kube-api-access-v7kt8\") pod \"jobset-controller-manager-5d86bd95b-82mcg\" (UID: \"0ba5e2f3-2767-40eb-8524-bdf904ee150e\") " pod="openshift-jobset-operator/jobset-controller-manager-5d86bd95b-82mcg" Apr 22 15:23:29.607258 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:29.607218 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0ba5e2f3-2767-40eb-8524-bdf904ee150e-metrics-certs\") pod \"jobset-controller-manager-5d86bd95b-82mcg\" (UID: \"0ba5e2f3-2767-40eb-8524-bdf904ee150e\") " pod="openshift-jobset-operator/jobset-controller-manager-5d86bd95b-82mcg" Apr 22 15:23:29.607331 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:29.607267 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/0ba5e2f3-2767-40eb-8524-bdf904ee150e-manager-config\") pod \"jobset-controller-manager-5d86bd95b-82mcg\" (UID: \"0ba5e2f3-2767-40eb-8524-bdf904ee150e\") " pod="openshift-jobset-operator/jobset-controller-manager-5d86bd95b-82mcg" Apr 22 15:23:29.607331 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:29.607312 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ba5e2f3-2767-40eb-8524-bdf904ee150e-cert\") pod \"jobset-controller-manager-5d86bd95b-82mcg\" (UID: \"0ba5e2f3-2767-40eb-8524-bdf904ee150e\") " 
pod="openshift-jobset-operator/jobset-controller-manager-5d86bd95b-82mcg" Apr 22 15:23:29.707820 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:29.707787 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0ba5e2f3-2767-40eb-8524-bdf904ee150e-metrics-certs\") pod \"jobset-controller-manager-5d86bd95b-82mcg\" (UID: \"0ba5e2f3-2767-40eb-8524-bdf904ee150e\") " pod="openshift-jobset-operator/jobset-controller-manager-5d86bd95b-82mcg" Apr 22 15:23:29.707820 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:29.707822 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/0ba5e2f3-2767-40eb-8524-bdf904ee150e-manager-config\") pod \"jobset-controller-manager-5d86bd95b-82mcg\" (UID: \"0ba5e2f3-2767-40eb-8524-bdf904ee150e\") " pod="openshift-jobset-operator/jobset-controller-manager-5d86bd95b-82mcg" Apr 22 15:23:29.708032 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:29.707848 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ba5e2f3-2767-40eb-8524-bdf904ee150e-cert\") pod \"jobset-controller-manager-5d86bd95b-82mcg\" (UID: \"0ba5e2f3-2767-40eb-8524-bdf904ee150e\") " pod="openshift-jobset-operator/jobset-controller-manager-5d86bd95b-82mcg" Apr 22 15:23:29.708032 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:29.707896 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v7kt8\" (UniqueName: \"kubernetes.io/projected/0ba5e2f3-2767-40eb-8524-bdf904ee150e-kube-api-access-v7kt8\") pod \"jobset-controller-manager-5d86bd95b-82mcg\" (UID: \"0ba5e2f3-2767-40eb-8524-bdf904ee150e\") " pod="openshift-jobset-operator/jobset-controller-manager-5d86bd95b-82mcg" Apr 22 15:23:29.708634 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:29.708603 2559 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/0ba5e2f3-2767-40eb-8524-bdf904ee150e-manager-config\") pod \"jobset-controller-manager-5d86bd95b-82mcg\" (UID: \"0ba5e2f3-2767-40eb-8524-bdf904ee150e\") " pod="openshift-jobset-operator/jobset-controller-manager-5d86bd95b-82mcg" Apr 22 15:23:29.710361 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:29.710331 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ba5e2f3-2767-40eb-8524-bdf904ee150e-cert\") pod \"jobset-controller-manager-5d86bd95b-82mcg\" (UID: \"0ba5e2f3-2767-40eb-8524-bdf904ee150e\") " pod="openshift-jobset-operator/jobset-controller-manager-5d86bd95b-82mcg" Apr 22 15:23:29.710450 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:29.710371 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0ba5e2f3-2767-40eb-8524-bdf904ee150e-metrics-certs\") pod \"jobset-controller-manager-5d86bd95b-82mcg\" (UID: \"0ba5e2f3-2767-40eb-8524-bdf904ee150e\") " pod="openshift-jobset-operator/jobset-controller-manager-5d86bd95b-82mcg" Apr 22 15:23:29.717358 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:29.717328 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7kt8\" (UniqueName: \"kubernetes.io/projected/0ba5e2f3-2767-40eb-8524-bdf904ee150e-kube-api-access-v7kt8\") pod \"jobset-controller-manager-5d86bd95b-82mcg\" (UID: \"0ba5e2f3-2767-40eb-8524-bdf904ee150e\") " pod="openshift-jobset-operator/jobset-controller-manager-5d86bd95b-82mcg" Apr 22 15:23:29.862390 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:29.862294 2559 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-5d86bd95b-82mcg" Apr 22 15:23:29.977744 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:29.977705 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-5d86bd95b-82mcg"] Apr 22 15:23:29.980934 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:23:29.980906 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ba5e2f3_2767_40eb_8524_bdf904ee150e.slice/crio-e35b875ccc4a488adcd8de87eb7da9a9c198096f9cd0f80fc86c366df43cccec WatchSource:0}: Error finding container e35b875ccc4a488adcd8de87eb7da9a9c198096f9cd0f80fc86c366df43cccec: Status 404 returned error can't find the container with id e35b875ccc4a488adcd8de87eb7da9a9c198096f9cd0f80fc86c366df43cccec Apr 22 15:23:30.151943 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:30.151874 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-5d86bd95b-82mcg" event={"ID":"0ba5e2f3-2767-40eb-8524-bdf904ee150e","Type":"ContainerStarted","Data":"e35b875ccc4a488adcd8de87eb7da9a9c198096f9cd0f80fc86c366df43cccec"} Apr 22 15:23:34.165726 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:34.165690 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-5d86bd95b-82mcg" event={"ID":"0ba5e2f3-2767-40eb-8524-bdf904ee150e","Type":"ContainerStarted","Data":"bdf12cebbbd093060c9447f754b1309ef873c767002d23d9a4fafe3adaba58a8"} Apr 22 15:23:34.166121 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:34.165811 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-jobset-operator/jobset-controller-manager-5d86bd95b-82mcg" Apr 22 15:23:34.182958 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:34.182915 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-jobset-operator/jobset-controller-manager-5d86bd95b-82mcg" podStartSLOduration=1.171816096 podStartE2EDuration="5.182900268s" podCreationTimestamp="2026-04-22 15:23:29 +0000 UTC" firstStartedPulling="2026-04-22 15:23:29.982588424 +0000 UTC m=+883.842469717" lastFinishedPulling="2026-04-22 15:23:33.993672596 +0000 UTC m=+887.853553889" observedRunningTime="2026-04-22 15:23:34.181725422 +0000 UTC m=+888.041606736" watchObservedRunningTime="2026-04-22 15:23:34.182900268 +0000 UTC m=+888.042781583" Apr 22 15:23:45.174866 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:45.174831 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-jobset-operator/jobset-controller-manager-5d86bd95b-82mcg" Apr 22 15:23:46.645943 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:46.645906 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4bfdn_34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee/ovn-acl-logging/0.log" Apr 22 15:23:46.646436 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:23:46.646418 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4bfdn_34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee/ovn-acl-logging/0.log" Apr 22 15:28:46.669351 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:28:46.669319 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4bfdn_34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee/ovn-acl-logging/0.log" Apr 22 15:28:46.669909 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:28:46.669413 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4bfdn_34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee/ovn-acl-logging/0.log" Apr 22 15:29:09.561888 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:29:09.561852 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-tdbgv/progression-job-failure-node-0-0-rstz4"] Apr 22 15:29:09.565067 ip-10-0-141-167 
kubenswrapper[2559]: I0422 15:29:09.565049 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-tdbgv/progression-job-failure-node-0-0-rstz4" Apr 22 15:29:09.567656 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:29:09.567633 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-tdbgv\"/\"kube-root-ca.crt\"" Apr 22 15:29:09.567780 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:29:09.567715 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-tdbgv\"/\"default-dockercfg-sbhg8\"" Apr 22 15:29:09.567780 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:29:09.567758 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-tdbgv\"/\"openshift-service-ca.crt\"" Apr 22 15:29:09.575548 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:29:09.575527 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-tdbgv/progression-job-failure-node-0-0-rstz4"] Apr 22 15:29:09.742344 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:29:09.742302 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6blg2\" (UniqueName: \"kubernetes.io/projected/341bb41b-6ab1-45d2-b935-fab4ff25870d-kube-api-access-6blg2\") pod \"progression-job-failure-node-0-0-rstz4\" (UID: \"341bb41b-6ab1-45d2-b935-fab4ff25870d\") " pod="rhai-e2e-progression-tdbgv/progression-job-failure-node-0-0-rstz4" Apr 22 15:29:09.843800 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:29:09.843728 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6blg2\" (UniqueName: \"kubernetes.io/projected/341bb41b-6ab1-45d2-b935-fab4ff25870d-kube-api-access-6blg2\") pod \"progression-job-failure-node-0-0-rstz4\" (UID: \"341bb41b-6ab1-45d2-b935-fab4ff25870d\") " pod="rhai-e2e-progression-tdbgv/progression-job-failure-node-0-0-rstz4" 
Apr 22 15:29:09.851832 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:29:09.851804 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6blg2\" (UniqueName: \"kubernetes.io/projected/341bb41b-6ab1-45d2-b935-fab4ff25870d-kube-api-access-6blg2\") pod \"progression-job-failure-node-0-0-rstz4\" (UID: \"341bb41b-6ab1-45d2-b935-fab4ff25870d\") " pod="rhai-e2e-progression-tdbgv/progression-job-failure-node-0-0-rstz4" Apr 22 15:29:09.875218 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:29:09.875187 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-tdbgv/progression-job-failure-node-0-0-rstz4" Apr 22 15:29:09.991992 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:29:09.991968 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-tdbgv/progression-job-failure-node-0-0-rstz4"] Apr 22 15:29:09.994220 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:29:09.994190 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod341bb41b_6ab1_45d2_b935_fab4ff25870d.slice/crio-6841fa570a7120d8e7181de4240d0c8543f647b78917f4c98100e4940e1dd1f9 WatchSource:0}: Error finding container 6841fa570a7120d8e7181de4240d0c8543f647b78917f4c98100e4940e1dd1f9: Status 404 returned error can't find the container with id 6841fa570a7120d8e7181de4240d0c8543f647b78917f4c98100e4940e1dd1f9 Apr 22 15:29:09.996575 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:29:09.996559 2559 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 15:29:10.244788 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:29:10.244752 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-tdbgv/progression-job-failure-node-0-0-rstz4" event={"ID":"341bb41b-6ab1-45d2-b935-fab4ff25870d","Type":"ContainerStarted","Data":"6841fa570a7120d8e7181de4240d0c8543f647b78917f4c98100e4940e1dd1f9"} Apr 22 
15:31:03.653370 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:31:03.653284 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-tdbgv/progression-job-failure-node-0-0-rstz4" event={"ID":"341bb41b-6ab1-45d2-b935-fab4ff25870d","Type":"ContainerStarted","Data":"8f16d84c3d6afb9d83eb5957970cd897de4b518a11371b0ca56a9af7abb39307"} Apr 22 15:31:03.653864 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:31:03.653405 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-tdbgv/progression-job-failure-node-0-0-rstz4" Apr 22 15:31:03.674769 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:31:03.674688 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-tdbgv/progression-job-failure-node-0-0-rstz4" podStartSLOduration=1.389507807 podStartE2EDuration="1m54.674672438s" podCreationTimestamp="2026-04-22 15:29:09 +0000 UTC" firstStartedPulling="2026-04-22 15:29:09.996701725 +0000 UTC m=+1223.856583018" lastFinishedPulling="2026-04-22 15:31:03.281866345 +0000 UTC m=+1337.141747649" observedRunningTime="2026-04-22 15:31:03.674397191 +0000 UTC m=+1337.534278506" watchObservedRunningTime="2026-04-22 15:31:03.674672438 +0000 UTC m=+1337.534553754" Apr 22 15:31:05.660247 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:31:05.660219 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="rhai-e2e-progression-tdbgv/progression-job-failure-node-0-0-rstz4" Apr 22 15:31:12.658110 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:31:12.658048 2559 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-tdbgv/progression-job-failure-node-0-0-rstz4" podUID="341bb41b-6ab1-45d2-b935-fab4ff25870d" containerName="node" probeResult="failure" output="Get \"http://10.134.0.25:28080/metrics\": dial tcp 10.134.0.25:28080: connect: connection refused" Apr 22 15:31:12.685921 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:31:12.685890 2559 generic.go:358] 
"Generic (PLEG): container finished" podID="341bb41b-6ab1-45d2-b935-fab4ff25870d" containerID="8f16d84c3d6afb9d83eb5957970cd897de4b518a11371b0ca56a9af7abb39307" exitCode=1 Apr 22 15:31:12.686065 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:31:12.685930 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-tdbgv/progression-job-failure-node-0-0-rstz4" event={"ID":"341bb41b-6ab1-45d2-b935-fab4ff25870d","Type":"ContainerDied","Data":"8f16d84c3d6afb9d83eb5957970cd897de4b518a11371b0ca56a9af7abb39307"} Apr 22 15:31:13.808652 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:31:13.808629 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-tdbgv/progression-job-failure-node-0-0-rstz4" Apr 22 15:31:13.914044 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:31:13.914016 2559 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6blg2\" (UniqueName: \"kubernetes.io/projected/341bb41b-6ab1-45d2-b935-fab4ff25870d-kube-api-access-6blg2\") pod \"341bb41b-6ab1-45d2-b935-fab4ff25870d\" (UID: \"341bb41b-6ab1-45d2-b935-fab4ff25870d\") " Apr 22 15:31:13.916166 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:31:13.916142 2559 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/341bb41b-6ab1-45d2-b935-fab4ff25870d-kube-api-access-6blg2" (OuterVolumeSpecName: "kube-api-access-6blg2") pod "341bb41b-6ab1-45d2-b935-fab4ff25870d" (UID: "341bb41b-6ab1-45d2-b935-fab4ff25870d"). InnerVolumeSpecName "kube-api-access-6blg2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:31:14.014991 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:31:14.014899 2559 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6blg2\" (UniqueName: \"kubernetes.io/projected/341bb41b-6ab1-45d2-b935-fab4ff25870d-kube-api-access-6blg2\") on node \"ip-10-0-141-167.ec2.internal\" DevicePath \"\"" Apr 22 15:31:14.694018 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:31:14.693986 2559 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-tdbgv/progression-job-failure-node-0-0-rstz4" Apr 22 15:31:14.694018 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:31:14.694004 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-tdbgv/progression-job-failure-node-0-0-rstz4" event={"ID":"341bb41b-6ab1-45d2-b935-fab4ff25870d","Type":"ContainerDied","Data":"6841fa570a7120d8e7181de4240d0c8543f647b78917f4c98100e4940e1dd1f9"} Apr 22 15:31:14.694222 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:31:14.694032 2559 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6841fa570a7120d8e7181de4240d0c8543f647b78917f4c98100e4940e1dd1f9" Apr 22 15:31:32.788053 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:31:32.788015 2559 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-tdbgv/progression-job-failure-node-0-0-rstz4"] Apr 22 15:31:32.791242 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:31:32.791217 2559 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-tdbgv/progression-job-failure-node-0-0-rstz4"] Apr 22 15:31:34.712818 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:31:34.712784 2559 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="341bb41b-6ab1-45d2-b935-fab4ff25870d" path="/var/lib/kubelet/pods/341bb41b-6ab1-45d2-b935-fab4ff25870d/volumes" Apr 22 15:31:58.043270 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:31:58.043242 2559 log.go:25] 
"Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-76ngx_33bb40fc-fef3-4c17-a146-bb0689189a67/global-pull-secret-syncer/0.log" Apr 22 15:31:58.129542 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:31:58.129513 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-7lh45_bf285aa1-3778-4781-989b-35abe7a6ed8f/konnectivity-agent/0.log" Apr 22 15:31:58.247189 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:31:58.247159 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-167.ec2.internal_bc91d8338dec73ddfaae16ceba9bdb9c/haproxy/0.log" Apr 22 15:32:01.619680 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:01.619649 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-g982r_76a5b841-272a-4d24-ad48-081271fe8012/kube-state-metrics/0.log" Apr 22 15:32:01.641540 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:01.641480 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-g982r_76a5b841-272a-4d24-ad48-081271fe8012/kube-rbac-proxy-main/0.log" Apr 22 15:32:01.663011 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:01.662987 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-g982r_76a5b841-272a-4d24-ad48-081271fe8012/kube-rbac-proxy-self/0.log" Apr 22 15:32:01.700243 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:01.700223 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-576679f874-8p2ck_6b775346-d2f3-419f-95a2-2571413caefb/metrics-server/0.log" Apr 22 15:32:01.947650 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:01.947624 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hbqsr_62d408bc-6296-49dc-b54d-357d46ff5596/node-exporter/0.log" Apr 22 15:32:01.970256 
ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:01.970237 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hbqsr_62d408bc-6296-49dc-b54d-357d46ff5596/kube-rbac-proxy/0.log"
Apr 22 15:32:01.994134 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:01.994113 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hbqsr_62d408bc-6296-49dc-b54d-357d46ff5596/init-textfile/0.log"
Apr 22 15:32:02.444440 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:02.444418 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-mdwnn_dd601ac1-c12d-43f5-88ff-09b6c5652c26/prometheus-operator-admission-webhook/0.log"
Apr 22 15:32:02.604480 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:02.604448 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-79b5647b94-kphgj_fcad5d89-8252-4742-a4c2-55888f7bb135/thanos-query/0.log"
Apr 22 15:32:02.640818 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:02.640796 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-79b5647b94-kphgj_fcad5d89-8252-4742-a4c2-55888f7bb135/kube-rbac-proxy-web/0.log"
Apr 22 15:32:02.665703 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:02.665681 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-79b5647b94-kphgj_fcad5d89-8252-4742-a4c2-55888f7bb135/kube-rbac-proxy/0.log"
Apr 22 15:32:02.689281 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:02.689260 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-79b5647b94-kphgj_fcad5d89-8252-4742-a4c2-55888f7bb135/prom-label-proxy/0.log"
Apr 22 15:32:02.716977 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:02.716955 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-79b5647b94-kphgj_fcad5d89-8252-4742-a4c2-55888f7bb135/kube-rbac-proxy-rules/0.log"
Apr 22 15:32:02.741749 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:02.741729 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-79b5647b94-kphgj_fcad5d89-8252-4742-a4c2-55888f7bb135/kube-rbac-proxy-metrics/0.log"
Apr 22 15:32:04.717544 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:04.717515 2559 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nrz24/perf-node-gather-daemonset-bvmcd"]
Apr 22 15:32:04.717958 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:04.717845 2559 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="341bb41b-6ab1-45d2-b935-fab4ff25870d" containerName="node"
Apr 22 15:32:04.717958 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:04.717856 2559 state_mem.go:107] "Deleted CPUSet assignment" podUID="341bb41b-6ab1-45d2-b935-fab4ff25870d" containerName="node"
Apr 22 15:32:04.717958 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:04.717908 2559 memory_manager.go:356] "RemoveStaleState removing state" podUID="341bb41b-6ab1-45d2-b935-fab4ff25870d" containerName="node"
Apr 22 15:32:04.719984 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:04.719964 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-bvmcd"
Apr 22 15:32:04.722344 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:04.722323 2559 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-nrz24\"/\"default-dockercfg-klhx2\""
Apr 22 15:32:04.723170 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:04.723155 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nrz24\"/\"kube-root-ca.crt\""
Apr 22 15:32:04.723226 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:04.723208 2559 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nrz24\"/\"openshift-service-ca.crt\""
Apr 22 15:32:04.727911 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:04.727885 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nrz24/perf-node-gather-daemonset-bvmcd"]
Apr 22 15:32:04.732322 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:04.732303 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3ec23977-4867-485b-85e7-80fcd22bc287-sys\") pod \"perf-node-gather-daemonset-bvmcd\" (UID: \"3ec23977-4867-485b-85e7-80fcd22bc287\") " pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-bvmcd"
Apr 22 15:32:04.732419 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:04.732331 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3ec23977-4867-485b-85e7-80fcd22bc287-proc\") pod \"perf-node-gather-daemonset-bvmcd\" (UID: \"3ec23977-4867-485b-85e7-80fcd22bc287\") " pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-bvmcd"
Apr 22 15:32:04.732419 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:04.732360 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbmsb\" (UniqueName: \"kubernetes.io/projected/3ec23977-4867-485b-85e7-80fcd22bc287-kube-api-access-cbmsb\") pod \"perf-node-gather-daemonset-bvmcd\" (UID: \"3ec23977-4867-485b-85e7-80fcd22bc287\") " pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-bvmcd"
Apr 22 15:32:04.732419 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:04.732380 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3ec23977-4867-485b-85e7-80fcd22bc287-podres\") pod \"perf-node-gather-daemonset-bvmcd\" (UID: \"3ec23977-4867-485b-85e7-80fcd22bc287\") " pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-bvmcd"
Apr 22 15:32:04.732558 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:04.732447 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3ec23977-4867-485b-85e7-80fcd22bc287-lib-modules\") pod \"perf-node-gather-daemonset-bvmcd\" (UID: \"3ec23977-4867-485b-85e7-80fcd22bc287\") " pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-bvmcd"
Apr 22 15:32:04.833343 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:04.833309 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3ec23977-4867-485b-85e7-80fcd22bc287-sys\") pod \"perf-node-gather-daemonset-bvmcd\" (UID: \"3ec23977-4867-485b-85e7-80fcd22bc287\") " pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-bvmcd"
Apr 22 15:32:04.833343 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:04.833346 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3ec23977-4867-485b-85e7-80fcd22bc287-proc\") pod \"perf-node-gather-daemonset-bvmcd\" (UID: \"3ec23977-4867-485b-85e7-80fcd22bc287\") " pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-bvmcd"
Apr 22 15:32:04.833593 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:04.833386 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbmsb\" (UniqueName: \"kubernetes.io/projected/3ec23977-4867-485b-85e7-80fcd22bc287-kube-api-access-cbmsb\") pod \"perf-node-gather-daemonset-bvmcd\" (UID: \"3ec23977-4867-485b-85e7-80fcd22bc287\") " pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-bvmcd"
Apr 22 15:32:04.833593 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:04.833412 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3ec23977-4867-485b-85e7-80fcd22bc287-podres\") pod \"perf-node-gather-daemonset-bvmcd\" (UID: \"3ec23977-4867-485b-85e7-80fcd22bc287\") " pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-bvmcd"
Apr 22 15:32:04.833593 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:04.833425 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3ec23977-4867-485b-85e7-80fcd22bc287-sys\") pod \"perf-node-gather-daemonset-bvmcd\" (UID: \"3ec23977-4867-485b-85e7-80fcd22bc287\") " pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-bvmcd"
Apr 22 15:32:04.833593 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:04.833444 2559 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3ec23977-4867-485b-85e7-80fcd22bc287-lib-modules\") pod \"perf-node-gather-daemonset-bvmcd\" (UID: \"3ec23977-4867-485b-85e7-80fcd22bc287\") " pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-bvmcd"
Apr 22 15:32:04.833593 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:04.833511 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3ec23977-4867-485b-85e7-80fcd22bc287-proc\") pod \"perf-node-gather-daemonset-bvmcd\" (UID: \"3ec23977-4867-485b-85e7-80fcd22bc287\") " pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-bvmcd"
Apr 22 15:32:04.833593 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:04.833570 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3ec23977-4867-485b-85e7-80fcd22bc287-podres\") pod \"perf-node-gather-daemonset-bvmcd\" (UID: \"3ec23977-4867-485b-85e7-80fcd22bc287\") " pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-bvmcd"
Apr 22 15:32:04.833593 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:04.833590 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3ec23977-4867-485b-85e7-80fcd22bc287-lib-modules\") pod \"perf-node-gather-daemonset-bvmcd\" (UID: \"3ec23977-4867-485b-85e7-80fcd22bc287\") " pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-bvmcd"
Apr 22 15:32:04.841402 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:04.841382 2559 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbmsb\" (UniqueName: \"kubernetes.io/projected/3ec23977-4867-485b-85e7-80fcd22bc287-kube-api-access-cbmsb\") pod \"perf-node-gather-daemonset-bvmcd\" (UID: \"3ec23977-4867-485b-85e7-80fcd22bc287\") " pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-bvmcd"
Apr 22 15:32:05.031000 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:05.030910 2559 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-bvmcd"
Apr 22 15:32:05.147290 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:05.147248 2559 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nrz24/perf-node-gather-daemonset-bvmcd"]
Apr 22 15:32:05.150767 ip-10-0-141-167 kubenswrapper[2559]: W0422 15:32:05.150739 2559 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3ec23977_4867_485b_85e7_80fcd22bc287.slice/crio-dab9a9f9f90f9da3a6d3a717ecdc24e6e9e6a54c286183b2f3341bb78f9e2b43 WatchSource:0}: Error finding container dab9a9f9f90f9da3a6d3a717ecdc24e6e9e6a54c286183b2f3341bb78f9e2b43: Status 404 returned error can't find the container with id dab9a9f9f90f9da3a6d3a717ecdc24e6e9e6a54c286183b2f3341bb78f9e2b43
Apr 22 15:32:05.674036 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:05.674010 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-cmsst_57d3cf97-71a9-44b9-80ac-439b558c8604/dns/0.log"
Apr 22 15:32:05.694205 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:05.694181 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-cmsst_57d3cf97-71a9-44b9-80ac-439b558c8604/kube-rbac-proxy/0.log"
Apr 22 15:32:05.816224 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:05.816190 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-p6k5v_d54da891-e22c-4f12-9dfb-e2ada408f1ea/dns-node-resolver/0.log"
Apr 22 15:32:05.862417 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:05.862385 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-bvmcd" event={"ID":"3ec23977-4867-485b-85e7-80fcd22bc287","Type":"ContainerStarted","Data":"03645966f1eede21ccf3bd8229abcbc94a1e77faa030fa55fa67807c80a9a5cf"}
Apr 22 15:32:05.862417 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:05.862415 2559 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-bvmcd" event={"ID":"3ec23977-4867-485b-85e7-80fcd22bc287","Type":"ContainerStarted","Data":"dab9a9f9f90f9da3a6d3a717ecdc24e6e9e6a54c286183b2f3341bb78f9e2b43"}
Apr 22 15:32:05.862656 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:05.862431 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-bvmcd"
Apr 22 15:32:05.878847 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:05.878799 2559 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-bvmcd" podStartSLOduration=1.878783417 podStartE2EDuration="1.878783417s" podCreationTimestamp="2026-04-22 15:32:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:32:05.876957048 +0000 UTC m=+1399.736838364" watchObservedRunningTime="2026-04-22 15:32:05.878783417 +0000 UTC m=+1399.738664764"
Apr 22 15:32:06.221180 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:06.221148 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-f6dccbfd7-k4p6p_79ee7cdf-2a19-48bb-ba79-f272ff00571f/registry/0.log"
Apr 22 15:32:06.265605 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:06.265578 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-tqshj_69f41c04-4f2b-476d-aac0-a60437dfe0c5/node-ca/0.log"
Apr 22 15:32:07.001023 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:07.000994 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-589c889464-99f7x_b6892fe2-bf35-4568-aa5b-cb281e67c570/router/0.log"
Apr 22 15:32:07.344565 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:07.344470 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-ffzqh_b464f475-8761-403b-bc74-03f05e7d52c5/serve-healthcheck-canary/0.log"
Apr 22 15:32:07.788593 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:07.788565 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-g58gj_c5a2dfa4-9885-4f86-9d28-6d2f547d5123/kube-rbac-proxy/0.log"
Apr 22 15:32:07.810117 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:07.810096 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-g58gj_c5a2dfa4-9885-4f86-9d28-6d2f547d5123/exporter/0.log"
Apr 22 15:32:07.831414 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:07.831395 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-g58gj_c5a2dfa4-9885-4f86-9d28-6d2f547d5123/extractor/0.log"
Apr 22 15:32:09.453375 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:09.453340 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-controller-manager-5d86bd95b-82mcg_0ba5e2f3-2767-40eb-8524-bdf904ee150e/manager/0.log"
Apr 22 15:32:09.477800 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:09.477776 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-operator-747c5859c7-jjsvm_672711fe-41ee-42d5-b5be-01cf8ebf3933/jobset-operator/0.log"
Apr 22 15:32:11.875464 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:11.875430 2559 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-bvmcd"
Apr 22 15:32:12.562705 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:12.562670 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-ngnjp_e9b83bb0-5336-43e1-8a72-f561808f0834/migrator/0.log"
Apr 22 15:32:12.583721 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:12.583697 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-ngnjp_e9b83bb0-5336-43e1-8a72-f561808f0834/graceful-termination/0.log"
Apr 22 15:32:13.965595 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:13.965565 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2scbl_89d42380-0740-498a-b66f-bfbe3da83afe/kube-multus-additional-cni-plugins/0.log"
Apr 22 15:32:13.986778 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:13.986751 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2scbl_89d42380-0740-498a-b66f-bfbe3da83afe/egress-router-binary-copy/0.log"
Apr 22 15:32:14.008330 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:14.008311 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2scbl_89d42380-0740-498a-b66f-bfbe3da83afe/cni-plugins/0.log"
Apr 22 15:32:14.028918 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:14.028899 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2scbl_89d42380-0740-498a-b66f-bfbe3da83afe/bond-cni-plugin/0.log"
Apr 22 15:32:14.050152 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:14.050134 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2scbl_89d42380-0740-498a-b66f-bfbe3da83afe/routeoverride-cni/0.log"
Apr 22 15:32:14.077563 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:14.077536 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2scbl_89d42380-0740-498a-b66f-bfbe3da83afe/whereabouts-cni-bincopy/0.log"
Apr 22 15:32:14.093708 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:14.093656 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2scbl_89d42380-0740-498a-b66f-bfbe3da83afe/whereabouts-cni/0.log"
Apr 22 15:32:14.483859 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:14.483834 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xsl6l_1ba654ca-0d27-4e4d-b6e5-6f5d49a54cde/kube-multus/0.log"
Apr 22 15:32:14.505807 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:14.505784 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-8hzhw_a5671169-29ab-4157-acb0-207c8e83501a/network-metrics-daemon/0.log"
Apr 22 15:32:14.525584 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:14.525564 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-8hzhw_a5671169-29ab-4157-acb0-207c8e83501a/kube-rbac-proxy/0.log"
Apr 22 15:32:15.678709 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:15.678677 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4bfdn_34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee/ovn-controller/0.log"
Apr 22 15:32:15.700816 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:15.700794 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4bfdn_34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee/ovn-acl-logging/0.log"
Apr 22 15:32:15.707634 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:15.707614 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4bfdn_34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee/ovn-acl-logging/1.log"
Apr 22 15:32:15.730334 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:15.730263 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4bfdn_34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee/kube-rbac-proxy-node/0.log"
Apr 22 15:32:15.749816 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:15.749784 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4bfdn_34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 15:32:15.767572 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:15.767550 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4bfdn_34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee/northd/0.log"
Apr 22 15:32:15.786550 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:15.786514 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4bfdn_34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee/nbdb/0.log"
Apr 22 15:32:15.805827 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:15.805804 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4bfdn_34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee/sbdb/0.log"
Apr 22 15:32:15.896224 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:15.896202 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4bfdn_34b4944f-8ab6-4c2c-9fe3-0a1b6099a7ee/ovnkube-controller/0.log"
Apr 22 15:32:17.196642 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:17.196606 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-w22tj_7fff4b63-00a5-4c92-8135-75133481b228/network-check-target-container/0.log"
Apr 22 15:32:18.107873 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:18.107845 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-pppj5_60e4ccd4-2561-4992-af9d-bf593b0e0685/iptables-alerter/0.log"
Apr 22 15:32:18.748943 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:18.748919 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-85mbp_e378e9f4-6a56-4fe5-81f1-3a9d2996d62d/tuned/0.log"
Apr 22 15:32:20.516609 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:20.516579 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-qm2z5_f963f6da-e36a-4b61-ade2-ceb2efdb3eb2/cluster-samples-operator/0.log"
Apr 22 15:32:20.535855 ip-10-0-141-167 kubenswrapper[2559]: I0422 15:32:20.535832 2559 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-qm2z5_f963f6da-e36a-4b61-ade2-ceb2efdb3eb2/cluster-samples-operator-watch/0.log"