Apr 17 14:07:05.978685 ip-10-0-140-205 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 14:07:06.449038 ip-10-0-140-205 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 14:07:06.449038 ip-10-0-140-205 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 14:07:06.449038 ip-10-0-140-205 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 14:07:06.449038 ip-10-0-140-205 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 14:07:06.449038 ip-10-0-140-205 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
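The deprecation warnings above follow a fixed shape, so they can be pulled out of a journal dump mechanically. A minimal sketch (the helper name and regex are ours, not part of the kubelet):

```python
import re

# Matches kubelet deprecation warnings of the form seen in the log above:
# "Flag --some-flag has been deprecated, ..."
DEPRECATED_RE = re.compile(r"Flag (--[\w-]+) has been deprecated")

def deprecated_flags(lines):
    """Return the deprecated kubelet flag names mentioned in journal lines."""
    return [m.group(1) for line in lines for m in DEPRECATED_RE.finditer(line)]

sample = [
    'Apr 17 14:07:06.449038 ip-10-0-140-205 kubenswrapper[2578]: '
    'Flag --minimum-container-ttl-duration has been deprecated, '
    'Use --eviction-hard or --eviction-soft instead.',
]
print(deprecated_flags(sample))  # ['--minimum-container-ttl-duration']
```

In practice the input would come from something like `journalctl -u kubelet`; as the messages themselves say, the long-term fix is to move these settings into the file passed via `--config`.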
Apr 17 14:07:06.451757 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.451668 2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 14:07:06.454819 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454804 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 14:07:06.454861 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454821 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:07:06.454861 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454825 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:07:06.454861 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454829 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:07:06.454861 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454832 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 14:07:06.454861 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454836 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:07:06.454861 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454838 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:07:06.454861 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454841 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:07:06.454861 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454844 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 14:07:06.454861 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454846 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:07:06.454861 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454849 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:07:06.454861 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454852 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:07:06.454861 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454855 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:07:06.454861 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454858 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 14:07:06.454861 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454861 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:07:06.454861 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454864 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:07:06.454861 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454866 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 14:07:06.454861 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454869 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 14:07:06.455273 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454876 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:07:06.455273 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454879 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 14:07:06.455273 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454882 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:07:06.455273 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454884 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 14:07:06.455273 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454887 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:07:06.455273 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454889 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 14:07:06.455273 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454892 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:07:06.455273 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454894 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:07:06.455273 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454897 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 14:07:06.455273 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454899 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:07:06.455273 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454902 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 14:07:06.455273 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454904 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:07:06.455273 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454907 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 14:07:06.455273 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454910 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:07:06.455273 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454912 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:07:06.455273 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454914 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:07:06.455273 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454917 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:07:06.455273 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454919 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:07:06.455273 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454922 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:07:06.455273 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454924 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:07:06.455806 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454927 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:07:06.455806 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454930 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:07:06.455806 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454934 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 14:07:06.455806 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454937 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 14:07:06.455806 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454941 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 14:07:06.455806 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454944 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:07:06.455806 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454954 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:07:06.455806 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454957 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:07:06.455806 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454960 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:07:06.455806 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454963 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:07:06.455806 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454966 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:07:06.455806 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454968 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 14:07:06.455806 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454971 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:07:06.455806 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454974 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:07:06.455806 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454977 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:07:06.455806 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454980 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:07:06.455806 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454983 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 14:07:06.455806 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454986 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:07:06.455806 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454989 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:07:06.456271 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454991 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:07:06.456271 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454994 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 14:07:06.456271 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454996 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:07:06.456271 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.454999 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:07:06.456271 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.455001 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:07:06.456271 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.455004 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:07:06.456271 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.455006 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 14:07:06.456271 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.455009 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 14:07:06.456271 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.455011 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:07:06.456271 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.455014 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 14:07:06.456271 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.455016 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 14:07:06.456271 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.455019 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:07:06.456271 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.455022 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:07:06.456271 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.455024 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:07:06.456271 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.455028 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 14:07:06.456271 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.455030 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 14:07:06.456271 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.455033 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:07:06.456271 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.455036 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:07:06.456271 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.455038 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 14:07:06.456741 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.455041 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:07:06.456741 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.455043 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:07:06.456741 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.455046 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:07:06.456741 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.455050 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:07:06.456741 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.455052 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 14:07:06.456741 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.455063 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:07:06.456741 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.455066 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:07:06.456741 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.455069 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:07:06.456741 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.455077 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:07:06.456741 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.455081 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 14:07:06.457594 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457582 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 14:07:06.457594 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457593 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:07:06.457653 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457596 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:07:06.457653 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457600 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 14:07:06.457653 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457603 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:07:06.457653 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457606 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:07:06.457653 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457608 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:07:06.457653 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457611 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:07:06.457653 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457614 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 14:07:06.457653 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457617 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:07:06.457653 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457619 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:07:06.457653 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457622 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:07:06.457653 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457625 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:07:06.457653 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457628 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:07:06.457653 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457631 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 14:07:06.457653 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457633 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:07:06.457653 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457636 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:07:06.457653 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457638 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:07:06.457653 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457641 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:07:06.457653 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457644 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:07:06.457653 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457646 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:07:06.457653 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457649 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 14:07:06.458144 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457652 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 14:07:06.458144 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457654 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:07:06.458144 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457657 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:07:06.458144 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457660 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:07:06.458144 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457663 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:07:06.458144 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457667 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 14:07:06.458144 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457671 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 14:07:06.458144 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457674 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:07:06.458144 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457678 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:07:06.458144 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457681 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:07:06.458144 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457684 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 14:07:06.458144 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457686 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:07:06.458144 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457690 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:07:06.458144 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457693 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:07:06.458144 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457696 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 14:07:06.458144 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457698 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 14:07:06.458144 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457701 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 14:07:06.458144 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457704 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:07:06.458144 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457707 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 14:07:06.458144 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457709 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:07:06.458656 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457712 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:07:06.458656 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457714 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:07:06.458656 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457717 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:07:06.458656 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457719 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:07:06.458656 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457722 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:07:06.458656 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457725 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:07:06.458656 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457727 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:07:06.458656 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457729 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 14:07:06.458656 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457732 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:07:06.458656 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457734 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:07:06.458656 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457737 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:07:06.458656 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457740 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:07:06.458656 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457743 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 14:07:06.458656 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457746 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:07:06.458656 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457749 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 14:07:06.458656 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457751 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 14:07:06.458656 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457754 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 14:07:06.458656 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457756 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 14:07:06.458656 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457759 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:07:06.459169 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457761 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:07:06.459169 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457764 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:07:06.459169 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457767 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:07:06.459169 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457769 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 14:07:06.459169 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457772 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:07:06.459169 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457775 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:07:06.459169 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457777 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 14:07:06.459169 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457779 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 14:07:06.459169 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457782 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 14:07:06.459169 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457784 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:07:06.459169 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457787 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:07:06.459169 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457789 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 14:07:06.459169 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457792 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:07:06.459169 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457794 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:07:06.459169 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457797 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:07:06.459169 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457799 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:07:06.459169 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457801 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:07:06.459169 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457804 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:07:06.459169 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457807 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 14:07:06.459169 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457810 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:07:06.459675 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457812 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 14:07:06.459675 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457815 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:07:06.459675 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457819 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:07:06.459675 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457823 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 14:07:06.459675 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.457826 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:07:06.459675 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.457893 2578 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 14:07:06.459675 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.457901 2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 14:07:06.459675 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.457908 2578 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 14:07:06.459675 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.457912 2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 14:07:06.459675 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.457917 2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 14:07:06.459675 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.457920 2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 14:07:06.459675 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.457925 2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 14:07:06.459675 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.457929 2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 14:07:06.459675 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.457933 2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 14:07:06.459675 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.457935 2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 14:07:06.459675 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.457940 2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 14:07:06.459675 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.457943 2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 14:07:06.459675 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.457946 2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 14:07:06.459675 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.457949 2578 flags.go:64] FLAG: --cgroup-root=""
Apr 17 14:07:06.459675 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.457953 2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 14:07:06.459675 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.457956 2578 flags.go:64] FLAG: --client-ca-file=""
Apr 17 14:07:06.459675 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.457959 2578 flags.go:64] FLAG: --cloud-config=""
Apr 17 14:07:06.459675 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.457961 2578 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 14:07:06.460309 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.457964 2578 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 14:07:06.460309 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.457968 2578 flags.go:64] FLAG: --cluster-domain=""
Apr 17 14:07:06.460309 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.457971 2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 14:07:06.460309 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.457975 2578 flags.go:64] FLAG: --config-dir=""
Apr 17 14:07:06.460309 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.457977 2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 14:07:06.460309 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.457981 2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 14:07:06.460309 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.457985 2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 14:07:06.460309 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.457988 2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 14:07:06.460309 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.457991 2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 14:07:06.460309 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.457995 2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 14:07:06.460309 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.457998 2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 14:07:06.460309 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458001 2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 14:07:06.460309 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458004 2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 14:07:06.460309 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458007 2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 14:07:06.460309 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458011 2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 14:07:06.460309 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458015 2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 14:07:06.460309 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458018 2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 14:07:06.460309 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458021 2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 14:07:06.460309 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458024 2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 14:07:06.460309 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458028 2578 flags.go:64] FLAG: --enable-server="true"
Apr 17 14:07:06.460309 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458031 2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 14:07:06.460309 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458036 2578 flags.go:64] FLAG: --event-burst="100"
Apr 17 14:07:06.460309 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458039 2578 flags.go:64] FLAG: --event-qps="50"
Apr 17 14:07:06.460309 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458042 2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 14:07:06.460309 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458045 2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 14:07:06.460923 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458048 2578 flags.go:64] FLAG: --eviction-hard=""
Apr 17 14:07:06.460923 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458052 2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 14:07:06.460923 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458055 2578 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 14:07:06.460923 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458058 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 14:07:06.460923 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458061 2578 flags.go:64] FLAG: --eviction-soft=""
Apr 17 14:07:06.460923 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458064 2578 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 14:07:06.460923 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458067 2578 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 14:07:06.460923 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458070 2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 14:07:06.460923 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458073 2578 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 14:07:06.460923 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458076 2578 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 14:07:06.460923 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458079 2578 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 14:07:06.460923 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458082 2578 flags.go:64] FLAG: --feature-gates=""
Apr 17 14:07:06.460923 ip-10-0-140-205
kubenswrapper[2578]: I0417 14:07:06.458086 2578 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 14:07:06.460923 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458088 2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 14:07:06.460923 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458094 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 14:07:06.460923 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458098 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 14:07:06.460923 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458101 2578 flags.go:64] FLAG: --healthz-port="10248" Apr 17 14:07:06.460923 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458104 2578 flags.go:64] FLAG: --help="false" Apr 17 14:07:06.460923 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458108 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-140-205.ec2.internal" Apr 17 14:07:06.460923 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458111 2578 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 14:07:06.460923 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458114 2578 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 14:07:06.460923 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458117 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 14:07:06.460923 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458121 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 14:07:06.460923 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458124 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 14:07:06.461499 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458127 2578 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 14:07:06.461499 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458130 2578 flags.go:64] FLAG: 
--image-service-endpoint="" Apr 17 14:07:06.461499 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458133 2578 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 14:07:06.461499 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458136 2578 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 14:07:06.461499 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458139 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 14:07:06.461499 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458143 2578 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 14:07:06.461499 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458145 2578 flags.go:64] FLAG: --kube-reserved="" Apr 17 14:07:06.461499 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458149 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 14:07:06.461499 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458152 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 14:07:06.461499 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458156 2578 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 14:07:06.461499 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458159 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 14:07:06.461499 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458162 2578 flags.go:64] FLAG: --lock-file="" Apr 17 14:07:06.461499 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458165 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 14:07:06.461499 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458167 2578 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 14:07:06.461499 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458170 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 14:07:06.461499 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458176 2578 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 14:07:06.461499 ip-10-0-140-205 kubenswrapper[2578]: 
I0417 14:07:06.458179 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 14:07:06.461499 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458182 2578 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 14:07:06.461499 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458185 2578 flags.go:64] FLAG: --logging-format="text" Apr 17 14:07:06.461499 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458187 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 14:07:06.461499 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458191 2578 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 14:07:06.461499 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458194 2578 flags.go:64] FLAG: --manifest-url="" Apr 17 14:07:06.461499 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458198 2578 flags.go:64] FLAG: --manifest-url-header="" Apr 17 14:07:06.461499 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458203 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 14:07:06.461499 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458206 2578 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 14:07:06.462133 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458210 2578 flags.go:64] FLAG: --max-pods="110" Apr 17 14:07:06.462133 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458214 2578 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 14:07:06.462133 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458217 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 14:07:06.462133 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458221 2578 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 14:07:06.462133 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458225 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 14:07:06.462133 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458228 2578 flags.go:64] FLAG: 
--minimum-image-ttl-duration="2m0s" Apr 17 14:07:06.462133 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458231 2578 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 14:07:06.462133 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458234 2578 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 14:07:06.462133 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458241 2578 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 14:07:06.462133 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458244 2578 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 14:07:06.462133 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458247 2578 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 14:07:06.462133 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458250 2578 flags.go:64] FLAG: --pod-cidr="" Apr 17 14:07:06.462133 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458253 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 14:07:06.462133 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458259 2578 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 14:07:06.462133 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458262 2578 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 14:07:06.462133 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458265 2578 flags.go:64] FLAG: --pods-per-core="0" Apr 17 14:07:06.462133 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458268 2578 flags.go:64] FLAG: --port="10250" Apr 17 14:07:06.462133 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458272 2578 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 14:07:06.462133 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458275 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-07c3bbe852092e1f8" Apr 17 14:07:06.462133 ip-10-0-140-205 kubenswrapper[2578]: I0417 
14:07:06.458278 2578 flags.go:64] FLAG: --qos-reserved="" Apr 17 14:07:06.462133 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458281 2578 flags.go:64] FLAG: --read-only-port="10255" Apr 17 14:07:06.462133 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458284 2578 flags.go:64] FLAG: --register-node="true" Apr 17 14:07:06.462133 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458287 2578 flags.go:64] FLAG: --register-schedulable="true" Apr 17 14:07:06.462133 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458290 2578 flags.go:64] FLAG: --register-with-taints="" Apr 17 14:07:06.462723 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458293 2578 flags.go:64] FLAG: --registry-burst="10" Apr 17 14:07:06.462723 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458296 2578 flags.go:64] FLAG: --registry-qps="5" Apr 17 14:07:06.462723 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458299 2578 flags.go:64] FLAG: --reserved-cpus="" Apr 17 14:07:06.462723 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458302 2578 flags.go:64] FLAG: --reserved-memory="" Apr 17 14:07:06.462723 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458306 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 14:07:06.462723 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458311 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 14:07:06.462723 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458314 2578 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 14:07:06.462723 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458316 2578 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 14:07:06.462723 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458319 2578 flags.go:64] FLAG: --runonce="false" Apr 17 14:07:06.462723 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458322 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 14:07:06.462723 ip-10-0-140-205 kubenswrapper[2578]: I0417 
14:07:06.458326 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 14:07:06.462723 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458329 2578 flags.go:64] FLAG: --seccomp-default="false" Apr 17 14:07:06.462723 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458332 2578 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 14:07:06.462723 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458335 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 14:07:06.462723 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458338 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 14:07:06.462723 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458341 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 14:07:06.462723 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458344 2578 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 14:07:06.462723 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458347 2578 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 14:07:06.462723 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458350 2578 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 14:07:06.462723 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458354 2578 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 14:07:06.462723 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458357 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 14:07:06.462723 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458360 2578 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 14:07:06.462723 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458362 2578 flags.go:64] FLAG: --system-cgroups="" Apr 17 14:07:06.462723 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458365 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 14:07:06.462723 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458370 2578 flags.go:64] FLAG: 
--system-reserved-cgroup="" Apr 17 14:07:06.463333 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458373 2578 flags.go:64] FLAG: --tls-cert-file="" Apr 17 14:07:06.463333 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458377 2578 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 14:07:06.463333 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458381 2578 flags.go:64] FLAG: --tls-min-version="" Apr 17 14:07:06.463333 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458383 2578 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 14:07:06.463333 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458386 2578 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 14:07:06.463333 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458389 2578 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 14:07:06.463333 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458392 2578 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 14:07:06.463333 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458395 2578 flags.go:64] FLAG: --v="2" Apr 17 14:07:06.463333 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458399 2578 flags.go:64] FLAG: --version="false" Apr 17 14:07:06.463333 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458403 2578 flags.go:64] FLAG: --vmodule="" Apr 17 14:07:06.463333 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458407 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 14:07:06.463333 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.458411 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 14:07:06.463333 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458526 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 14:07:06.463333 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458531 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 
14:07:06.463333 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458534 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 14:07:06.463333 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458538 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 14:07:06.463333 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458541 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 14:07:06.463333 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458543 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 14:07:06.463333 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458546 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 14:07:06.463333 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458550 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 14:07:06.463333 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458553 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 14:07:06.463333 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458556 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 14:07:06.463333 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458558 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 14:07:06.463897 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458561 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 14:07:06.463897 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458563 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 14:07:06.463897 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458566 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 14:07:06.463897 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458569 2578 feature_gate.go:328] 
unrecognized feature gate: MultiArchInstallAzure Apr 17 14:07:06.463897 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458571 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 14:07:06.463897 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458574 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 14:07:06.463897 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458576 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 14:07:06.463897 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458579 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 14:07:06.463897 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458581 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 14:07:06.463897 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458584 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 14:07:06.463897 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458587 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 14:07:06.463897 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458589 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 14:07:06.463897 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458592 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 14:07:06.463897 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458594 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 14:07:06.463897 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458597 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 14:07:06.463897 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458599 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 14:07:06.463897 ip-10-0-140-205 
kubenswrapper[2578]: W0417 14:07:06.458602 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 14:07:06.463897 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458605 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 14:07:06.463897 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458607 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 14:07:06.463897 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458609 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 14:07:06.464485 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458612 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 14:07:06.464485 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458616 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 14:07:06.464485 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458618 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 14:07:06.464485 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458621 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 14:07:06.464485 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458624 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 14:07:06.464485 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458626 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 14:07:06.464485 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458629 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 14:07:06.464485 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458631 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 14:07:06.464485 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458635 2578 feature_gate.go:328] unrecognized feature gate: 
AWSServiceLBNetworkSecurityGroup Apr 17 14:07:06.464485 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458638 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 14:07:06.464485 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458640 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 14:07:06.464485 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458643 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 14:07:06.464485 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458645 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 14:07:06.464485 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458648 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 14:07:06.464485 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458651 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 14:07:06.464485 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458653 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 14:07:06.464485 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458656 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 14:07:06.464485 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458659 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 14:07:06.464485 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458662 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 14:07:06.464485 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458664 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 14:07:06.464984 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458667 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 14:07:06.464984 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458669 2578 
feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 14:07:06.464984 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458672 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 14:07:06.464984 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458674 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 14:07:06.464984 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458677 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 14:07:06.464984 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458680 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 14:07:06.464984 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458682 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 14:07:06.464984 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458685 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 14:07:06.464984 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458687 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 14:07:06.464984 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458690 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 14:07:06.464984 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458692 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 14:07:06.464984 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458694 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 14:07:06.464984 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458697 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 14:07:06.464984 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458700 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 
17 14:07:06.464984 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458703 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 14:07:06.464984 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458706 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 14:07:06.464984 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458708 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 14:07:06.464984 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458711 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 14:07:06.464984 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458713 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 14:07:06.464984 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458716 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 14:07:06.465749 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458721 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 14:07:06.465749 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458725 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 14:07:06.465749 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458729 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:07:06.465749 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458732 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:07:06.465749 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458735 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:07:06.465749 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458739 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:07:06.465749 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458742 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:07:06.465749 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458745 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:07:06.465749 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458747 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:07:06.465749 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458750 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:07:06.465749 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458753 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 14:07:06.465749 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458755 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:07:06.465749 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458758 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:07:06.465749 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458761 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:07:06.465749 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.458765 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:07:06.466320 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.459446 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 14:07:06.467113 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.467093 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 14:07:06.467155 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.467114 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 14:07:06.467185 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467165 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 14:07:06.467185 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467170 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:07:06.467185 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467173 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:07:06.467185 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467177 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:07:06.467185 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467180 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:07:06.467185 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467183 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:07:06.467185 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467186 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:07:06.467185 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467189 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:07:06.467387 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467192 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:07:06.467387 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467195 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:07:06.467387 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467197 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 14:07:06.467387 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467200 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:07:06.467387 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467203 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:07:06.467387 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467205 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:07:06.467387 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467208 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:07:06.467387 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467212 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:07:06.467387 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467214 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 14:07:06.467387 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467217 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 14:07:06.467387 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467220 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 14:07:06.467387 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467222 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:07:06.467387 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467225 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 14:07:06.467387 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467228 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 14:07:06.467387 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467231 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:07:06.467387 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467233 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:07:06.467387 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467235 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:07:06.467387 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467239 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:07:06.467387 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467241 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:07:06.467387 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467243 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:07:06.467889 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467246 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:07:06.467889 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467249 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:07:06.467889 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467252 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:07:06.467889 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467254 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:07:06.467889 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467257 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:07:06.467889 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467260 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 14:07:06.467889 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467262 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:07:06.467889 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467264 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:07:06.467889 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467267 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:07:06.467889 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467270 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 14:07:06.467889 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467272 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:07:06.467889 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467275 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:07:06.467889 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467278 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:07:06.467889 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467281 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:07:06.467889 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467284 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:07:06.467889 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467287 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:07:06.467889 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467290 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:07:06.467889 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467292 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 14:07:06.467889 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467295 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:07:06.467889 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467298 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 14:07:06.468379 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467301 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:07:06.468379 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467303 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:07:06.468379 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467306 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:07:06.468379 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467308 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 14:07:06.468379 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467311 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 14:07:06.468379 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467313 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:07:06.468379 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467316 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:07:06.468379 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467319 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 14:07:06.468379 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467322 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:07:06.468379 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467325 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:07:06.468379 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467327 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 14:07:06.468379 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467330 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 14:07:06.468379 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467332 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 14:07:06.468379 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467335 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 14:07:06.468379 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467337 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:07:06.468379 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467340 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:07:06.468379 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467343 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:07:06.468379 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467345 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 14:07:06.468379 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467348 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:07:06.468379 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467350 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:07:06.468923 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467355 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 14:07:06.468923 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467359 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 14:07:06.468923 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467363 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:07:06.468923 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467366 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 14:07:06.468923 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467369 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 14:07:06.468923 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467373 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:07:06.468923 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467376 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:07:06.468923 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467378 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 14:07:06.468923 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467388 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:07:06.468923 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467392 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:07:06.468923 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467396 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 14:07:06.468923 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467399 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:07:06.468923 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467402 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:07:06.468923 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467405 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:07:06.468923 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467407 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 14:07:06.468923 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467410 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:07:06.468923 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467413 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 14:07:06.468923 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467415 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:07:06.469396 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.467420 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 14:07:06.469396 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467535 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 14:07:06.469396 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467541 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 14:07:06.469396 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467544 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 14:07:06.469396 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467547 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 14:07:06.469396 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467550 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 14:07:06.469396 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467553 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 14:07:06.469396 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467555 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 14:07:06.469396 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467558 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 14:07:06.469396 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467561 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 14:07:06.469396 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467563 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 14:07:06.469396 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467566 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 14:07:06.469396 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467569 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 14:07:06.469396 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467571 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 14:07:06.469396 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467574 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 14:07:06.469779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467577 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 14:07:06.469779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467579 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 14:07:06.469779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467582 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 14:07:06.469779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467584 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 14:07:06.469779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467587 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 14:07:06.469779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467590 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 14:07:06.469779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467593 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 14:07:06.469779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467595 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 14:07:06.469779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467603 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 14:07:06.469779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467606 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 14:07:06.469779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467609 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 14:07:06.469779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467611 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 14:07:06.469779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467614 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 14:07:06.469779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467617 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 14:07:06.469779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467619 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 14:07:06.469779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467623 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 14:07:06.469779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467627 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 14:07:06.469779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467629 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 14:07:06.469779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467632 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 14:07:06.470253 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467634 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 14:07:06.470253 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467637 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 14:07:06.470253 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467639 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 14:07:06.470253 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467642 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 14:07:06.470253 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467644 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 14:07:06.470253 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467647 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 14:07:06.470253 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467649 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 14:07:06.470253 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467652 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 14:07:06.470253 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467654 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 14:07:06.470253 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467657 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 14:07:06.470253 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467659 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 14:07:06.470253 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467662 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 14:07:06.470253 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467664 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 14:07:06.470253 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467666 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 14:07:06.470253 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467669 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 14:07:06.470253 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467671 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 14:07:06.470253 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467674 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 14:07:06.470253 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467676 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 14:07:06.470253 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467679 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 14:07:06.470253 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467681 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 14:07:06.470779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467684 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 14:07:06.470779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467688 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 14:07:06.470779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467699 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 14:07:06.470779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467702 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 14:07:06.470779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467705 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 14:07:06.470779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467708 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 14:07:06.470779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467711 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 14:07:06.470779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467714 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 14:07:06.470779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467717 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 14:07:06.470779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467720 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 14:07:06.470779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467723 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 14:07:06.470779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467726 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 14:07:06.470779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467728 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 14:07:06.470779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467731 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 14:07:06.470779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467734 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 14:07:06.470779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467737 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 14:07:06.470779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467739 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 14:07:06.470779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467742 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 14:07:06.470779 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467744 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 14:07:06.471251 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467747 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 14:07:06.471251 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467750 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 14:07:06.471251 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467752 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 14:07:06.471251 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467755 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 14:07:06.471251 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467757 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 14:07:06.471251 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467760 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 14:07:06.471251 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467762 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 14:07:06.471251 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467765 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 14:07:06.471251 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467767 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 14:07:06.471251 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467770 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 14:07:06.471251 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467772 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 14:07:06.471251 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467775 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 14:07:06.471251 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467778 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 14:07:06.471251 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:06.467780 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 14:07:06.471251 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.467785 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 14:07:06.471251 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.468484 2578 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 14:07:06.471666 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.470538 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 14:07:06.471666 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.471565 2578 server.go:1019] "Starting client certificate rotation"
Apr 17 14:07:06.471666 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.471656 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 14:07:06.472063 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.472050 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 14:07:06.500791 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.500771 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 14:07:06.503417 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.503397 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 14:07:06.519137 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.519117 2578 log.go:25] "Validated CRI v1 runtime API"
Apr 17 14:07:06.525289 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.525271 2578 log.go:25] "Validated CRI v1 image API"
Apr 17 14:07:06.526508 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.526493 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 14:07:06.530013 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.529990 2578 fs.go:135] Filesystem UUIDs: map[640e7988-40ce-434b-a6ab-17933672997b:/dev/nvme0n1p4 6543c7ad-e442-4b04-a19b-342c558a166b:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 17 14:07:06.530096 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.530010 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 14:07:06.530147 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.530101 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 14:07:06.536027 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.535905 2578 manager.go:217] Machine: {Timestamp:2026-04-17 14:07:06.534531446 +0000 UTC m=+0.432864731 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3145798 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec25bb0ef8d801d1ed325f9d4f67aaef SystemUUID:ec25bb0e-f8d8-01d1-ed32-5f9d4f67aaef BootID:9d56e801-8451-46fd-980e-3a8d64084136 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:70:c2:85:2d:f7 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:70:c2:85:2d:f7 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:26:04:42:61:0a:74 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 14:07:06.536027 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.536015 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 14:07:06.536200 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.536124 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 14:07:06.538115 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.538089 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 14:07:06.538314 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.538118 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-140-205.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 14:07:06.538956 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.538941 2578 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 14:07:06.539017 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.538963 2578 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 14:07:06.539017 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.538983 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 14:07:06.539821 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.539791 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 14:07:06.541318 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.541305 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 17 14:07:06.541449 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.541421 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 14:07:06.544149 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.544139 2578 kubelet.go:491] "Attempting to sync node with API server" Apr 17 14:07:06.544192 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.544153 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 14:07:06.544192 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.544166 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 14:07:06.544192 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.544175 2578 kubelet.go:397] "Adding apiserver pod source" Apr 17 14:07:06.544192 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.544184 2578 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 17 14:07:06.545287 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.545273 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 14:07:06.545336 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.545292 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 14:07:06.548205 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.548190 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 14:07:06.550353 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.550338 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 14:07:06.551765 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.551751 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 14:07:06.551846 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.551770 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 14:07:06.551846 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.551779 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 14:07:06.551846 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.551787 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 14:07:06.551846 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.551796 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 14:07:06.551846 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.551804 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 14:07:06.551846 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.551813 2578 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 17 14:07:06.551846 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.551822 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 14:07:06.551846 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.551832 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 14:07:06.551846 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.551841 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 14:07:06.552110 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.551862 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 14:07:06.552110 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.551876 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 14:07:06.552914 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.552902 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 14:07:06.552968 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.552917 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 14:07:06.556023 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:06.555993 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 14:07:06.556103 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.556019 2578 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-205.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 14:07:06.556103 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:06.556001 2578 reflector.go:200] "Failed 
to watch" err="failed to list *v1.Node: nodes \"ip-10-0-140-205.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 14:07:06.556827 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.556814 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 14:07:06.556875 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.556854 2578 server.go:1295] "Started kubelet" Apr 17 14:07:06.557013 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.556988 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 14:07:06.557099 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.557027 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 14:07:06.557153 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.557141 2578 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 14:07:06.557863 ip-10-0-140-205 systemd[1]: Started Kubernetes Kubelet. 
Apr 17 14:07:06.558836 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.558723 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 14:07:06.559697 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.559684 2578 server.go:317] "Adding debug handlers to kubelet server" Apr 17 14:07:06.564863 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.564832 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 14:07:06.565504 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.565472 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 14:07:06.565984 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:06.564760 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-205.ec2.internal.18a72a10138d63c6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-205.ec2.internal,UID:ip-10-0-140-205.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-140-205.ec2.internal,},FirstTimestamp:2026-04-17 14:07:06.556826566 +0000 UTC m=+0.455159851,LastTimestamp:2026-04-17 14:07:06.556826566 +0000 UTC m=+0.455159851,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-205.ec2.internal,}" Apr 17 14:07:06.566834 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:06.566811 2578 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 14:07:06.566926 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.566888 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 14:07:06.566926 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.566906 2578 factory.go:55] Registering systemd factory Apr 17 14:07:06.566926 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.566917 2578 factory.go:223] Registration of the systemd container factory successfully Apr 17 14:07:06.567070 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.566988 2578 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 14:07:06.567070 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.567006 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 14:07:06.567070 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.567011 2578 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 14:07:06.567196 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:06.567077 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-205.ec2.internal\" not found" Apr 17 14:07:06.567196 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.567177 2578 reconstruct.go:97] "Volume reconstruction finished" Apr 17 14:07:06.567196 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.567184 2578 reconciler.go:26] "Reconciler: start to sync state" Apr 17 14:07:06.567472 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.567457 2578 factory.go:153] Registering CRI-O factory Apr 17 14:07:06.567546 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.567517 2578 factory.go:223] Registration of the crio container factory successfully Apr 17 
14:07:06.567546 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.567536 2578 factory.go:103] Registering Raw factory Apr 17 14:07:06.567640 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.567549 2578 manager.go:1196] Started watching for new ooms in manager Apr 17 14:07:06.567943 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.567930 2578 manager.go:319] Starting recovery of all containers Apr 17 14:07:06.568396 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:06.568356 2578 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-140-205.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 17 14:07:06.572209 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.572051 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-bczfc" Apr 17 14:07:06.578061 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.578038 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-bczfc" Apr 17 14:07:06.578236 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:06.578208 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 14:07:06.579858 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.579841 2578 manager.go:324] Recovery completed Apr 17 14:07:06.583875 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.583860 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 14:07:06.586932 ip-10-0-140-205 kubenswrapper[2578]: 
I0417 14:07:06.586918 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-205.ec2.internal" event="NodeHasSufficientMemory" Apr 17 14:07:06.587003 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.586944 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-205.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 14:07:06.587003 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.586957 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-205.ec2.internal" event="NodeHasSufficientPID" Apr 17 14:07:06.587462 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.587409 2578 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 14:07:06.587462 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.587422 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 14:07:06.587462 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.587453 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 17 14:07:06.589788 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.589776 2578 policy_none.go:49] "None policy: Start" Apr 17 14:07:06.589822 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.589792 2578 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 14:07:06.589822 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.589802 2578 state_mem.go:35] "Initializing new in-memory state store" Apr 17 14:07:06.640994 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.640975 2578 manager.go:341] "Starting Device Plugin manager" Apr 17 14:07:06.652383 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:06.641023 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 14:07:06.652383 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.641039 2578 server.go:85] "Starting device plugin registration server" Apr 17 14:07:06.652383 ip-10-0-140-205 kubenswrapper[2578]: 
I0417 14:07:06.641257 2578 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 14:07:06.652383 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.641268 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 14:07:06.652383 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.641365 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 14:07:06.652383 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.641452 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 14:07:06.652383 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.641462 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 14:07:06.652383 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:06.641894 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 14:07:06.652383 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:06.641922 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-140-205.ec2.internal\" not found" Apr 17 14:07:06.706989 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.706940 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 14:07:06.708062 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.708047 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 14:07:06.708137 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.708072 2578 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 14:07:06.708137 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.708089 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 17 14:07:06.708137 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.708095 2578 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 14:07:06.708137 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:06.708128 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 14:07:06.712188 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.712172 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:07:06.742107 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.742089 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 14:07:06.743019 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.743006 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-205.ec2.internal" event="NodeHasSufficientMemory" Apr 17 14:07:06.743082 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.743034 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-205.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 14:07:06.743082 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.743046 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-205.ec2.internal" event="NodeHasSufficientPID" Apr 17 14:07:06.743082 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.743075 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-205.ec2.internal" Apr 17 14:07:06.752066 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.752053 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-140-205.ec2.internal" Apr 17 14:07:06.752119 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:06.752074 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-140-205.ec2.internal\": node \"ip-10-0-140-205.ec2.internal\" not found" Apr 17 
14:07:06.768767 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:06.768743 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-205.ec2.internal\" not found" Apr 17 14:07:06.808590 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.808572 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-205.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-140-205.ec2.internal"] Apr 17 14:07:06.808660 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.808631 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 14:07:06.809318 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.809303 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-205.ec2.internal" event="NodeHasSufficientMemory" Apr 17 14:07:06.809389 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.809334 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-205.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 14:07:06.809389 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.809348 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-205.ec2.internal" event="NodeHasSufficientPID" Apr 17 14:07:06.810746 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.810731 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 14:07:06.810892 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.810879 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-205.ec2.internal" Apr 17 14:07:06.810942 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.810908 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 14:07:06.811639 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.811624 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-205.ec2.internal" event="NodeHasSufficientMemory" Apr 17 14:07:06.811707 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.811653 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-205.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 14:07:06.811707 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.811669 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-205.ec2.internal" event="NodeHasSufficientPID" Apr 17 14:07:06.811707 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.811679 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-205.ec2.internal" event="NodeHasSufficientMemory" Apr 17 14:07:06.811707 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.811696 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-205.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 14:07:06.811707 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.811706 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-205.ec2.internal" event="NodeHasSufficientPID" Apr 17 14:07:06.812701 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.812687 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-205.ec2.internal" Apr 17 14:07:06.812755 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.812711 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 14:07:06.813325 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.813309 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-205.ec2.internal" event="NodeHasSufficientMemory" Apr 17 14:07:06.813396 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.813335 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-205.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 14:07:06.813396 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.813344 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-205.ec2.internal" event="NodeHasSufficientPID" Apr 17 14:07:06.841037 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:06.841017 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-205.ec2.internal\" not found" node="ip-10-0-140-205.ec2.internal" Apr 17 14:07:06.845558 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:06.845545 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-205.ec2.internal\" not found" node="ip-10-0-140-205.ec2.internal" Apr 17 14:07:06.868248 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.868229 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/af0e8e30f34475421e9e054e83fe8eba-config\") pod \"kube-apiserver-proxy-ip-10-0-140-205.ec2.internal\" (UID: \"af0e8e30f34475421e9e054e83fe8eba\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-205.ec2.internal" Apr 17 14:07:06.868312 ip-10-0-140-205 kubenswrapper[2578]: I0417 
14:07:06.868255 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8aa172302d742d7d41f0d782f67d392d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-205.ec2.internal\" (UID: \"8aa172302d742d7d41f0d782f67d392d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-205.ec2.internal" Apr 17 14:07:06.868312 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.868273 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8aa172302d742d7d41f0d782f67d392d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-205.ec2.internal\" (UID: \"8aa172302d742d7d41f0d782f67d392d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-205.ec2.internal" Apr 17 14:07:06.869639 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:06.869622 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-205.ec2.internal\" not found" Apr 17 14:07:06.968838 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.968782 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8aa172302d742d7d41f0d782f67d392d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-205.ec2.internal\" (UID: \"8aa172302d742d7d41f0d782f67d392d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-205.ec2.internal" Apr 17 14:07:06.968838 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.968811 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8aa172302d742d7d41f0d782f67d392d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-205.ec2.internal\" (UID: \"8aa172302d742d7d41f0d782f67d392d\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-205.ec2.internal"
Apr 17 14:07:06.968838 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.968826 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/af0e8e30f34475421e9e054e83fe8eba-config\") pod \"kube-apiserver-proxy-ip-10-0-140-205.ec2.internal\" (UID: \"af0e8e30f34475421e9e054e83fe8eba\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-205.ec2.internal"
Apr 17 14:07:06.968974 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.968860 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/af0e8e30f34475421e9e054e83fe8eba-config\") pod \"kube-apiserver-proxy-ip-10-0-140-205.ec2.internal\" (UID: \"af0e8e30f34475421e9e054e83fe8eba\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-205.ec2.internal"
Apr 17 14:07:06.968974 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.968873 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8aa172302d742d7d41f0d782f67d392d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-205.ec2.internal\" (UID: \"8aa172302d742d7d41f0d782f67d392d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-205.ec2.internal"
Apr 17 14:07:06.968974 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:06.968873 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8aa172302d742d7d41f0d782f67d392d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-205.ec2.internal\" (UID: \"8aa172302d742d7d41f0d782f67d392d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-205.ec2.internal"
Apr 17 14:07:06.969826 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:06.969813 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-205.ec2.internal\" not found"
Apr 17 14:07:07.070518 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:07.070483 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-205.ec2.internal\" not found"
Apr 17 14:07:07.143709 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:07.143674 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-205.ec2.internal"
Apr 17 14:07:07.147145 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:07.147124 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-205.ec2.internal"
Apr 17 14:07:07.171235 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:07.171214 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-205.ec2.internal\" not found"
Apr 17 14:07:07.271730 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:07.271636 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-205.ec2.internal\" not found"
Apr 17 14:07:07.372128 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:07.372097 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-205.ec2.internal\" not found"
Apr 17 14:07:07.471497 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:07.471472 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 14:07:07.472025 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:07.471602 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 14:07:07.472506 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:07.472491 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-205.ec2.internal\" not found"
Apr 17 14:07:07.565805 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:07.565727 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 14:07:07.573412 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:07.573397 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-205.ec2.internal\" not found"
Apr 17 14:07:07.575712 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:07.575697 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 14:07:07.580385 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:07.580365 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 14:02:06 +0000 UTC" deadline="2027-10-19 15:12:47.604676235 +0000 UTC"
Apr 17 14:07:07.580442 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:07.580385 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13201h5m40.024293335s"
Apr 17 14:07:07.593703 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:07.593690 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 14:07:07.601240 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:07.601219 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-7776c"
Apr 17 14:07:07.606923 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:07.606881 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-7776c"
Apr 17 14:07:07.612600 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:07.612585 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 14:07:07.666960 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:07.666933 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-205.ec2.internal"
Apr 17 14:07:07.680070 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:07.680046 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 14:07:07.681886 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:07.681866 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-205.ec2.internal"
Apr 17 14:07:07.688098 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:07.688082 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 14:07:07.735881 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:07.735856 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 14:07:07.782241 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:07.782201 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf0e8e30f34475421e9e054e83fe8eba.slice/crio-a921ab5b35fbbdcd0e73c8bf10bb7f735b1d0f480c4a4b34dc65f9d875161586 WatchSource:0}: Error finding container a921ab5b35fbbdcd0e73c8bf10bb7f735b1d0f480c4a4b34dc65f9d875161586: Status 404 returned error can't find the container with id a921ab5b35fbbdcd0e73c8bf10bb7f735b1d0f480c4a4b34dc65f9d875161586
Apr 17 14:07:07.786666 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:07.786652 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 14:07:07.831606 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:07.831571 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8aa172302d742d7d41f0d782f67d392d.slice/crio-4e476d65dc6b9a7667f5208965924d0f777fb8b1e0211898730044fd8886df25 WatchSource:0}: Error finding container 4e476d65dc6b9a7667f5208965924d0f777fb8b1e0211898730044fd8886df25: Status 404 returned error can't find the container with id 4e476d65dc6b9a7667f5208965924d0f777fb8b1e0211898730044fd8886df25
Apr 17 14:07:08.545238 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.545204 2578 apiserver.go:52] "Watching apiserver"
Apr 17 14:07:08.552583 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.552559 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 14:07:08.552986 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.552960 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-c28zr","openshift-ovn-kubernetes/ovnkube-node-5h9m4","kube-system/konnectivity-agent-4xvtj","kube-system/kube-apiserver-proxy-ip-10-0-140-205.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzk4x","openshift-cluster-node-tuning-operator/tuned-dxxnk","openshift-image-registry/node-ca-4wkj7","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-205.ec2.internal","openshift-multus/multus-additional-cni-plugins-6gr7b","openshift-multus/network-metrics-daemon-7nttt","openshift-dns/node-resolver-vnnct","openshift-multus/multus-wfn9n","openshift-network-diagnostics/network-check-target-spfv9"]
Apr 17 14:07:08.554641 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.554622 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7nttt"
Apr 17 14:07:08.554785 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:08.554711 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7nttt" podUID="86e32c2b-962f-40a8-9171-5be908eeee49"
Apr 17 14:07:08.556167 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.556146 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4"
Apr 17 14:07:08.557749 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.557727 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-4xvtj"
Apr 17 14:07:08.558803 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.558742 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 17 14:07:08.558803 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.558763 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 14:07:08.559218 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.559169 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 17 14:07:08.559501 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.559382 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 17 14:07:08.560558 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.560118 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 14:07:08.560558 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.560267 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-h5nks\""
Apr 17 14:07:08.560558 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.560319 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 14:07:08.560558 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.560354 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-hm9xv\""
Apr 17 14:07:08.560558 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.560446 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 14:07:08.560558 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.560507 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 14:07:08.561746 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.561467 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-dxxnk"
Apr 17 14:07:08.563511 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.563486 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 14:07:08.563712 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.563641 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6gr7b"
Apr 17 14:07:08.563712 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.563674 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-vjnwm\""
Apr 17 14:07:08.563841 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.563708 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 14:07:08.566019 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.565481 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4wkj7"
Apr 17 14:07:08.566019 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.565624 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-c28zr"
Apr 17 14:07:08.566792 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.566310 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 14:07:08.566792 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.566369 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 14:07:08.566792 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.566474 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-wtrxn\""
Apr 17 14:07:08.566792 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.566522 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 14:07:08.566792 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.566627 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 14:07:08.566792 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.566719 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 14:07:08.567590 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.567570 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wfn9n"
Apr 17 14:07:08.567794 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.567779 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 14:07:08.568101 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.568087 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 14:07:08.569284 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.569150 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 14:07:08.569500 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.569449 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 14:07:08.570751 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.569652 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 14:07:08.570751 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.570150 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 14:07:08.570751 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.570249 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-96h9n\""
Apr 17 14:07:08.570751 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.570709 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 14:07:08.570974 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.570949 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-qqrsd\""
Apr 17 14:07:08.571696 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.571677 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-gj4w6\""
Apr 17 14:07:08.572794 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.572776 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spfv9"
Apr 17 14:07:08.572993 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:08.572970 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spfv9" podUID="4a41c39c-5f5e-4ad7-8c82-dac16f59266d"
Apr 17 14:07:08.577708 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.577686 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/01f1152c-e1c7-4d93-ad8f-877078fb7271-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6gr7b\" (UID: \"01f1152c-e1c7-4d93-ad8f-877078fb7271\") " pod="openshift-multus/multus-additional-cni-plugins-6gr7b"
Apr 17 14:07:08.577800 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.577723 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9c6w\" (UniqueName: \"kubernetes.io/projected/53b66d5d-5930-40dd-8dcf-45ee6f0c9cf4-kube-api-access-q9c6w\") pod \"iptables-alerter-c28zr\" (UID: \"53b66d5d-5930-40dd-8dcf-45ee6f0c9cf4\") " pod="openshift-network-operator/iptables-alerter-c28zr"
Apr 17 14:07:08.577800 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.577750 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-host-cni-netd\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4"
Apr 17 14:07:08.577906 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.577852 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/83bdba87-08d8-4d12-9786-7c1ee404890a-ovnkube-config\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4"
Apr 17 14:07:08.577906 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.577895 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-node-log\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4"
Apr 17 14:07:08.578010 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.577919 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fd130f1a-180c-45df-b438-589d809223cc-host\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk"
Apr 17 14:07:08.578010 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.577945 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/01f1152c-e1c7-4d93-ad8f-877078fb7271-os-release\") pod \"multus-additional-cni-plugins-6gr7b\" (UID: \"01f1152c-e1c7-4d93-ad8f-877078fb7271\") " pod="openshift-multus/multus-additional-cni-plugins-6gr7b"
Apr 17 14:07:08.578010 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.577971 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-multus-daemon-config\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n"
Apr 17 14:07:08.578010 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.577994 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-multus-cni-dir\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n"
Apr 17 14:07:08.578201 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.578036 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-host-run-k8s-cni-cncf-io\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n"
Apr 17 14:07:08.578201 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.578063 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-etc-openvswitch\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4"
Apr 17 14:07:08.578201 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.578097 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5tt5\" (UniqueName: \"kubernetes.io/projected/83bdba87-08d8-4d12-9786-7c1ee404890a-kube-api-access-c5tt5\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4"
Apr 17 14:07:08.578201 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.578121 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-host-var-lib-kubelet\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n"
Apr 17 14:07:08.578201 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.578143 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fd130f1a-180c-45df-b438-589d809223cc-sys\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk"
Apr 17 14:07:08.578201 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.578166 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd130f1a-180c-45df-b438-589d809223cc-var-lib-kubelet\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk"
Apr 17 14:07:08.578496 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.578197 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-system-cni-dir\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n"
Apr 17 14:07:08.578496 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.578228 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-host-var-lib-cni-bin\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n"
Apr 17 14:07:08.578496 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.578252 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvpgl\" (UniqueName: \"kubernetes.io/projected/4a41c39c-5f5e-4ad7-8c82-dac16f59266d-kube-api-access-kvpgl\") pod \"network-check-target-spfv9\" (UID: \"4a41c39c-5f5e-4ad7-8c82-dac16f59266d\") " pod="openshift-network-diagnostics/network-check-target-spfv9"
Apr 17 14:07:08.578496 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.578310 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4"
Apr 17 14:07:08.578496 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.578333 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-host-kubelet\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4"
Apr 17 14:07:08.578496 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.578356 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-run-systemd\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4"
Apr 17 14:07:08.578496 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.578380 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-log-socket\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4"
Apr 17 14:07:08.578496 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.578403 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/01f1152c-e1c7-4d93-ad8f-877078fb7271-cnibin\") pod \"multus-additional-cni-plugins-6gr7b\" (UID: \"01f1152c-e1c7-4d93-ad8f-877078fb7271\") " pod="openshift-multus/multus-additional-cni-plugins-6gr7b"
Apr 17 14:07:08.578496 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.578440 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-cnibin\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n"
Apr 17 14:07:08.578496 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.578446 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzk4x"
Apr 17 14:07:08.578496 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.578457 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-vnnct"
Apr 17 14:07:08.578496 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.578466 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-cni-binary-copy\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n"
Apr 17 14:07:08.578496 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.578502 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-host-run-netns\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4"
Apr 17 14:07:08.579093 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.578526 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/83bdba87-08d8-4d12-9786-7c1ee404890a-ovnkube-script-lib\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4"
Apr 17 14:07:08.579093 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.578566 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd130f1a-180c-45df-b438-589d809223cc-etc-kubernetes\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk"
Apr 17 14:07:08.579093 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.578606 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-etc-kubernetes\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n"
Apr 17 14:07:08.579093 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.578637 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7g82\" (UniqueName: \"kubernetes.io/projected/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-kube-api-access-t7g82\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n"
Apr 17 14:07:08.579093 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.578667 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-var-lib-openvswitch\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4"
Apr 17 14:07:08.579093 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.578707 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/53b66d5d-5930-40dd-8dcf-45ee6f0c9cf4-iptables-alerter-script\") pod \"iptables-alerter-c28zr\" (UID: \"53b66d5d-5930-40dd-8dcf-45ee6f0c9cf4\") " pod="openshift-network-operator/iptables-alerter-c28zr"
Apr 17 14:07:08.579093 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.578732 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-multus-socket-dir-parent\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n"
Apr 17 14:07:08.579093 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.578757 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fd130f1a-180c-45df-b438-589d809223cc-etc-systemd\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk"
Apr 17 14:07:08.579093 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.578781 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fd130f1a-180c-45df-b438-589d809223cc-etc-tuned\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk"
Apr 17 14:07:08.579093 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.578803 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fb36c782-6ab6-4f4d-91bc-6792c5debe41-serviceca\") pod \"node-ca-4wkj7\" (UID: \"fb36c782-6ab6-4f4d-91bc-6792c5debe41\") " pod="openshift-image-registry/node-ca-4wkj7"
Apr 17 14:07:08.579093 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.578825 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-host-var-lib-cni-multus\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n"
Apr 17 14:07:08.579093 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.578863 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-run-ovn\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4"
Apr 17 14:07:08.579093 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.578885 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fd130f1a-180c-45df-b438-589d809223cc-etc-sysctl-conf\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk"
Apr 17 14:07:08.579093 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.578906 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fb36c782-6ab6-4f4d-91bc-6792c5debe41-host\") pod \"node-ca-4wkj7\" (UID: \"fb36c782-6ab6-4f4d-91bc-6792c5debe41\") " pod="openshift-image-registry/node-ca-4wkj7"
Apr 17 14:07:08.579093 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.579009 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-systemd-units\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4"
Apr 17 14:07:08.579093 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.579033 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-host-slash\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4"
Apr 17 14:07:08.579093 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.579062 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/83bdba87-08d8-4d12-9786-7c1ee404890a-ovn-node-metrics-cert\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4"
Apr 17 14:07:08.579822 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.579091 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fd130f1a-180c-45df-b438-589d809223cc-etc-modprobe-d\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk"
Apr 17 14:07:08.579822 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.579129 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fd130f1a-180c-45df-b438-589d809223cc-tmp\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk"
Apr 17 14:07:08.579822 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.579176 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z2v9\" (UniqueName: \"kubernetes.io/projected/01f1152c-e1c7-4d93-ad8f-877078fb7271-kube-api-access-2z2v9\") pod \"multus-additional-cni-plugins-6gr7b\" (UID: \"01f1152c-e1c7-4d93-ad8f-877078fb7271\") " pod="openshift-multus/multus-additional-cni-plugins-6gr7b"
Apr 17 14:07:08.579822 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.579209 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-multus-conf-dir\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n"
Apr 17 14:07:08.579822 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.579286 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e32c2b-962f-40a8-9171-5be908eeee49-metrics-certs\") pod \"network-metrics-daemon-7nttt\" (UID: \"86e32c2b-962f-40a8-9171-5be908eeee49\") " pod="openshift-multus/network-metrics-daemon-7nttt"
Apr 17 14:07:08.579822 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.579309 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-os-release\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n"
Apr 17 14:07:08.580266 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.580240 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-host-run-ovn-kubernetes\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4"
Apr 17 14:07:08.580381 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.580304 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName:
\"kubernetes.io/host-path/fd130f1a-180c-45df-b438-589d809223cc-etc-sysctl-d\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" Apr 17 14:07:08.580462 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.580402 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lznj7\" (UniqueName: \"kubernetes.io/projected/fd130f1a-180c-45df-b438-589d809223cc-kube-api-access-lznj7\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" Apr 17 14:07:08.580520 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.580463 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/01f1152c-e1c7-4d93-ad8f-877078fb7271-cni-binary-copy\") pod \"multus-additional-cni-plugins-6gr7b\" (UID: \"01f1152c-e1c7-4d93-ad8f-877078fb7271\") " pod="openshift-multus/multus-additional-cni-plugins-6gr7b" Apr 17 14:07:08.580520 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.580486 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fd130f1a-180c-45df-b438-589d809223cc-etc-sysconfig\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" Apr 17 14:07:08.580621 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.580550 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fd130f1a-180c-45df-b438-589d809223cc-run\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" Apr 17 14:07:08.580621 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.580572 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/76ab9c94-30da-41a0-954f-373b613d462f-konnectivity-ca\") pod \"konnectivity-agent-4xvtj\" (UID: \"76ab9c94-30da-41a0-954f-373b613d462f\") " pod="kube-system/konnectivity-agent-4xvtj" Apr 17 14:07:08.580621 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.580604 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-run-openvswitch\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.580740 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.580628 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/01f1152c-e1c7-4d93-ad8f-877078fb7271-system-cni-dir\") pod \"multus-additional-cni-plugins-6gr7b\" (UID: \"01f1152c-e1c7-4d93-ad8f-877078fb7271\") " pod="openshift-multus/multus-additional-cni-plugins-6gr7b" Apr 17 14:07:08.580740 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.580651 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-host-run-multus-certs\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.580740 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.580678 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glm5v\" (UniqueName: \"kubernetes.io/projected/86e32c2b-962f-40a8-9171-5be908eeee49-kube-api-access-glm5v\") pod \"network-metrics-daemon-7nttt\" (UID: 
\"86e32c2b-962f-40a8-9171-5be908eeee49\") " pod="openshift-multus/network-metrics-daemon-7nttt" Apr 17 14:07:08.580740 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.580703 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-host-cni-bin\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.580740 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.580727 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fd130f1a-180c-45df-b438-589d809223cc-lib-modules\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" Apr 17 14:07:08.580973 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.580752 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-7ccvg\"" Apr 17 14:07:08.580973 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.580754 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/01f1152c-e1c7-4d93-ad8f-877078fb7271-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6gr7b\" (UID: \"01f1152c-e1c7-4d93-ad8f-877078fb7271\") " pod="openshift-multus/multus-additional-cni-plugins-6gr7b" Apr 17 14:07:08.580973 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.580828 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/01f1152c-e1c7-4d93-ad8f-877078fb7271-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6gr7b\" (UID: 
\"01f1152c-e1c7-4d93-ad8f-877078fb7271\") " pod="openshift-multus/multus-additional-cni-plugins-6gr7b" Apr 17 14:07:08.580973 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.580858 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/83bdba87-08d8-4d12-9786-7c1ee404890a-env-overrides\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.580973 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.580885 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-host-run-netns\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.580973 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.580908 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-hostroot\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.580973 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.580936 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/76ab9c94-30da-41a0-954f-373b613d462f-agent-certs\") pod \"konnectivity-agent-4xvtj\" (UID: \"76ab9c94-30da-41a0-954f-373b613d462f\") " pod="kube-system/konnectivity-agent-4xvtj" Apr 17 14:07:08.580973 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.580964 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxw2n\" (UniqueName: 
\"kubernetes.io/projected/fb36c782-6ab6-4f4d-91bc-6792c5debe41-kube-api-access-gxw2n\") pod \"node-ca-4wkj7\" (UID: \"fb36c782-6ab6-4f4d-91bc-6792c5debe41\") " pod="openshift-image-registry/node-ca-4wkj7" Apr 17 14:07:08.581324 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.580993 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/53b66d5d-5930-40dd-8dcf-45ee6f0c9cf4-host-slash\") pod \"iptables-alerter-c28zr\" (UID: \"53b66d5d-5930-40dd-8dcf-45ee6f0c9cf4\") " pod="openshift-network-operator/iptables-alerter-c28zr" Apr 17 14:07:08.581324 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.581007 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 14:07:08.581324 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.581017 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 14:07:08.581324 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.581025 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 14:07:08.581324 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.581129 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-dss5c\"" Apr 17 14:07:08.581324 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.581252 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 14:07:08.581594 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.581335 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 14:07:08.608981 ip-10-0-140-205 
kubenswrapper[2578]: I0417 14:07:08.608918 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 14:02:07 +0000 UTC" deadline="2027-11-08 00:25:44.337005777 +0000 UTC" Apr 17 14:07:08.608981 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.608946 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13666h18m35.72806328s" Apr 17 14:07:08.668319 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.668293 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 14:07:08.682227 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.682196 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/83bdba87-08d8-4d12-9786-7c1ee404890a-ovnkube-script-lib\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.682367 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.682239 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd130f1a-180c-45df-b438-589d809223cc-etc-kubernetes\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" Apr 17 14:07:08.682367 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.682316 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd130f1a-180c-45df-b438-589d809223cc-etc-kubernetes\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" Apr 17 14:07:08.682367 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.682336 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-etc-kubernetes\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.682553 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.682374 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t7g82\" (UniqueName: \"kubernetes.io/projected/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-kube-api-access-t7g82\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.682553 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.682414 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-var-lib-openvswitch\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.682553 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.682456 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-etc-kubernetes\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.682553 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.682463 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/53b66d5d-5930-40dd-8dcf-45ee6f0c9cf4-iptables-alerter-script\") pod \"iptables-alerter-c28zr\" (UID: \"53b66d5d-5930-40dd-8dcf-45ee6f0c9cf4\") " pod="openshift-network-operator/iptables-alerter-c28zr" Apr 17 14:07:08.682553 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.682497 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-multus-socket-dir-parent\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.682553 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.682524 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fd130f1a-180c-45df-b438-589d809223cc-etc-systemd\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" Apr 17 14:07:08.682553 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.682549 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fd130f1a-180c-45df-b438-589d809223cc-etc-tuned\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" Apr 17 14:07:08.682874 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.682578 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fb36c782-6ab6-4f4d-91bc-6792c5debe41-serviceca\") pod \"node-ca-4wkj7\" (UID: \"fb36c782-6ab6-4f4d-91bc-6792c5debe41\") " pod="openshift-image-registry/node-ca-4wkj7" Apr 17 14:07:08.682874 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.682606 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-host-var-lib-cni-multus\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.682874 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.682633 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-run-ovn\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.682874 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.682659 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fd130f1a-180c-45df-b438-589d809223cc-etc-sysctl-conf\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" Apr 17 14:07:08.682874 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.682704 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-var-lib-openvswitch\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.682874 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.682742 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fb36c782-6ab6-4f4d-91bc-6792c5debe41-host\") pod \"node-ca-4wkj7\" (UID: \"fb36c782-6ab6-4f4d-91bc-6792c5debe41\") " pod="openshift-image-registry/node-ca-4wkj7" Apr 17 14:07:08.682874 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.682769 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-systemd-units\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.682874 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.682793 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-host-slash\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.682874 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.682793 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fd130f1a-180c-45df-b438-589d809223cc-etc-sysctl-conf\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" Apr 17 14:07:08.682874 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.682843 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fb36c782-6ab6-4f4d-91bc-6792c5debe41-host\") pod \"node-ca-4wkj7\" (UID: \"fb36c782-6ab6-4f4d-91bc-6792c5debe41\") " pod="openshift-image-registry/node-ca-4wkj7" Apr 17 14:07:08.683372 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.682886 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-host-slash\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.683372 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.682913 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-multus-socket-dir-parent\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.683372 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.682922 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/83bdba87-08d8-4d12-9786-7c1ee404890a-ovnkube-script-lib\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.683372 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.682929 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-host-var-lib-cni-multus\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.683372 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.682951 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/53b66d5d-5930-40dd-8dcf-45ee6f0c9cf4-iptables-alerter-script\") pod \"iptables-alerter-c28zr\" (UID: \"53b66d5d-5930-40dd-8dcf-45ee6f0c9cf4\") " pod="openshift-network-operator/iptables-alerter-c28zr" Apr 17 14:07:08.683372 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.682969 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-systemd-units\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.683372 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.682977 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-run-ovn\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.683372 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.682994 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/83bdba87-08d8-4d12-9786-7c1ee404890a-ovn-node-metrics-cert\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.683372 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.683026 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fd130f1a-180c-45df-b438-589d809223cc-etc-systemd\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" Apr 17 14:07:08.683372 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.683021 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fd130f1a-180c-45df-b438-589d809223cc-etc-modprobe-d\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" Apr 17 14:07:08.683372 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.683048 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 14:07:08.683372 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.683071 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fd130f1a-180c-45df-b438-589d809223cc-tmp\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" Apr 17 14:07:08.683372 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.683098 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2z2v9\" (UniqueName: \"kubernetes.io/projected/01f1152c-e1c7-4d93-ad8f-877078fb7271-kube-api-access-2z2v9\") pod \"multus-additional-cni-plugins-6gr7b\" (UID: \"01f1152c-e1c7-4d93-ad8f-877078fb7271\") " pod="openshift-multus/multus-additional-cni-plugins-6gr7b" Apr 17 14:07:08.683372 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.683139 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fd130f1a-180c-45df-b438-589d809223cc-etc-modprobe-d\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" Apr 17 14:07:08.686633 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.683193 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fb36c782-6ab6-4f4d-91bc-6792c5debe41-serviceca\") pod \"node-ca-4wkj7\" (UID: \"fb36c782-6ab6-4f4d-91bc-6792c5debe41\") " pod="openshift-image-registry/node-ca-4wkj7" Apr 17 14:07:08.686633 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.684040 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/8411d38d-222c-4fc3-9890-84fb21b47535-tmp-dir\") pod \"node-resolver-vnnct\" (UID: \"8411d38d-222c-4fc3-9890-84fb21b47535\") " pod="openshift-dns/node-resolver-vnnct" Apr 17 14:07:08.686633 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.684093 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-multus-conf-dir\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.686633 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.684135 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e32c2b-962f-40a8-9171-5be908eeee49-metrics-certs\") pod \"network-metrics-daemon-7nttt\" (UID: \"86e32c2b-962f-40a8-9171-5be908eeee49\") " pod="openshift-multus/network-metrics-daemon-7nttt" Apr 17 14:07:08.686633 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.684170 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-os-release\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.686633 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.684219 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-host-run-ovn-kubernetes\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.686633 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.684254 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/fd130f1a-180c-45df-b438-589d809223cc-etc-sysctl-d\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" Apr 17 14:07:08.686633 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.684279 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lznj7\" (UniqueName: \"kubernetes.io/projected/fd130f1a-180c-45df-b438-589d809223cc-kube-api-access-lznj7\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" Apr 17 14:07:08.686633 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.684319 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/01f1152c-e1c7-4d93-ad8f-877078fb7271-cni-binary-copy\") pod \"multus-additional-cni-plugins-6gr7b\" (UID: \"01f1152c-e1c7-4d93-ad8f-877078fb7271\") " pod="openshift-multus/multus-additional-cni-plugins-6gr7b" Apr 17 14:07:08.686633 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.684355 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d93d961b-181a-4922-b8e6-299ecf0ab597-registration-dir\") pod \"aws-ebs-csi-driver-node-vzk4x\" (UID: \"d93d961b-181a-4922-b8e6-299ecf0ab597\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzk4x" Apr 17 14:07:08.686633 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.684400 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fd130f1a-180c-45df-b438-589d809223cc-etc-sysconfig\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" Apr 17 14:07:08.686633 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.684403 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-host-run-ovn-kubernetes\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.686633 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.684508 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fd130f1a-180c-45df-b438-589d809223cc-etc-sysconfig\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" Apr 17 14:07:08.686633 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.684509 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fd130f1a-180c-45df-b438-589d809223cc-run\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" Apr 17 14:07:08.686633 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.684564 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/76ab9c94-30da-41a0-954f-373b613d462f-konnectivity-ca\") pod \"konnectivity-agent-4xvtj\" (UID: \"76ab9c94-30da-41a0-954f-373b613d462f\") " pod="kube-system/konnectivity-agent-4xvtj" Apr 17 14:07:08.686633 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.684572 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fd130f1a-180c-45df-b438-589d809223cc-run\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" Apr 17 14:07:08.686633 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.684609 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-run-openvswitch\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.687372 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.684649 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/01f1152c-e1c7-4d93-ad8f-877078fb7271-system-cni-dir\") pod \"multus-additional-cni-plugins-6gr7b\" (UID: \"01f1152c-e1c7-4d93-ad8f-877078fb7271\") " pod="openshift-multus/multus-additional-cni-plugins-6gr7b" Apr 17 14:07:08.687372 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.684680 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d93d961b-181a-4922-b8e6-299ecf0ab597-device-dir\") pod \"aws-ebs-csi-driver-node-vzk4x\" (UID: \"d93d961b-181a-4922-b8e6-299ecf0ab597\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzk4x" Apr 17 14:07:08.687372 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.684730 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fd130f1a-180c-45df-b438-589d809223cc-etc-sysctl-d\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" Apr 17 14:07:08.687372 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.684765 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-os-release\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.687372 ip-10-0-140-205 kubenswrapper[2578]: I0417 
14:07:08.684803 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-host-run-multus-certs\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.687372 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.684843 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-glm5v\" (UniqueName: \"kubernetes.io/projected/86e32c2b-962f-40a8-9171-5be908eeee49-kube-api-access-glm5v\") pod \"network-metrics-daemon-7nttt\" (UID: \"86e32c2b-962f-40a8-9171-5be908eeee49\") " pod="openshift-multus/network-metrics-daemon-7nttt" Apr 17 14:07:08.687372 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.684850 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-multus-conf-dir\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.687372 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.684877 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-host-cni-bin\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.687372 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.684934 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fd130f1a-180c-45df-b438-589d809223cc-lib-modules\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" Apr 17 14:07:08.687372 ip-10-0-140-205 kubenswrapper[2578]: E0417 
14:07:08.684968 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:07:08.687372 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.684971 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/01f1152c-e1c7-4d93-ad8f-877078fb7271-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6gr7b\" (UID: \"01f1152c-e1c7-4d93-ad8f-877078fb7271\") " pod="openshift-multus/multus-additional-cni-plugins-6gr7b" Apr 17 14:07:08.687372 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.685007 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/01f1152c-e1c7-4d93-ad8f-877078fb7271-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6gr7b\" (UID: \"01f1152c-e1c7-4d93-ad8f-877078fb7271\") " pod="openshift-multus/multus-additional-cni-plugins-6gr7b" Apr 17 14:07:08.687372 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:08.685068 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e32c2b-962f-40a8-9171-5be908eeee49-metrics-certs podName:86e32c2b-962f-40a8-9171-5be908eeee49 nodeName:}" failed. No retries permitted until 2026-04-17 14:07:09.185020064 +0000 UTC m=+3.083353357 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86e32c2b-962f-40a8-9171-5be908eeee49-metrics-certs") pod "network-metrics-daemon-7nttt" (UID: "86e32c2b-962f-40a8-9171-5be908eeee49") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:07:08.687372 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.685092 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d93d961b-181a-4922-b8e6-299ecf0ab597-socket-dir\") pod \"aws-ebs-csi-driver-node-vzk4x\" (UID: \"d93d961b-181a-4922-b8e6-299ecf0ab597\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzk4x" Apr 17 14:07:08.687372 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.685194 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/83bdba87-08d8-4d12-9786-7c1ee404890a-env-overrides\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.687372 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.685242 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-host-run-netns\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.687372 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.685268 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/01f1152c-e1c7-4d93-ad8f-877078fb7271-cni-binary-copy\") pod \"multus-additional-cni-plugins-6gr7b\" (UID: \"01f1152c-e1c7-4d93-ad8f-877078fb7271\") " pod="openshift-multus/multus-additional-cni-plugins-6gr7b" Apr 17 14:07:08.688098 ip-10-0-140-205 
kubenswrapper[2578]: I0417 14:07:08.685276 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-hostroot\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.688098 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.685358 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-hostroot\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.688098 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.685355 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/76ab9c94-30da-41a0-954f-373b613d462f-konnectivity-ca\") pod \"konnectivity-agent-4xvtj\" (UID: \"76ab9c94-30da-41a0-954f-373b613d462f\") " pod="kube-system/konnectivity-agent-4xvtj" Apr 17 14:07:08.688098 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.685390 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/76ab9c94-30da-41a0-954f-373b613d462f-agent-certs\") pod \"konnectivity-agent-4xvtj\" (UID: \"76ab9c94-30da-41a0-954f-373b613d462f\") " pod="kube-system/konnectivity-agent-4xvtj" Apr 17 14:07:08.688098 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.685466 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxw2n\" (UniqueName: \"kubernetes.io/projected/fb36c782-6ab6-4f4d-91bc-6792c5debe41-kube-api-access-gxw2n\") pod \"node-ca-4wkj7\" (UID: \"fb36c782-6ab6-4f4d-91bc-6792c5debe41\") " pod="openshift-image-registry/node-ca-4wkj7" Apr 17 14:07:08.688098 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.685516 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-host-run-multus-certs\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.688098 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.685587 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-run-openvswitch\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.688098 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.685622 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/53b66d5d-5930-40dd-8dcf-45ee6f0c9cf4-host-slash\") pod \"iptables-alerter-c28zr\" (UID: \"53b66d5d-5930-40dd-8dcf-45ee6f0c9cf4\") " pod="openshift-network-operator/iptables-alerter-c28zr" Apr 17 14:07:08.688098 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.685657 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/01f1152c-e1c7-4d93-ad8f-877078fb7271-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6gr7b\" (UID: \"01f1152c-e1c7-4d93-ad8f-877078fb7271\") " pod="openshift-multus/multus-additional-cni-plugins-6gr7b" Apr 17 14:07:08.688098 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.685690 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q9c6w\" (UniqueName: \"kubernetes.io/projected/53b66d5d-5930-40dd-8dcf-45ee6f0c9cf4-kube-api-access-q9c6w\") pod \"iptables-alerter-c28zr\" (UID: \"53b66d5d-5930-40dd-8dcf-45ee6f0c9cf4\") " pod="openshift-network-operator/iptables-alerter-c28zr" Apr 17 14:07:08.688098 
ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.685718 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-host-cni-netd\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.688098 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.685749 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/01f1152c-e1c7-4d93-ad8f-877078fb7271-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6gr7b\" (UID: \"01f1152c-e1c7-4d93-ad8f-877078fb7271\") " pod="openshift-multus/multus-additional-cni-plugins-6gr7b" Apr 17 14:07:08.688098 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.685750 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/83bdba87-08d8-4d12-9786-7c1ee404890a-ovnkube-config\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.688098 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.685847 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d93d961b-181a-4922-b8e6-299ecf0ab597-sys-fs\") pod \"aws-ebs-csi-driver-node-vzk4x\" (UID: \"d93d961b-181a-4922-b8e6-299ecf0ab597\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzk4x" Apr 17 14:07:08.688098 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.685900 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8411d38d-222c-4fc3-9890-84fb21b47535-hosts-file\") pod \"node-resolver-vnnct\" (UID: 
\"8411d38d-222c-4fc3-9890-84fb21b47535\") " pod="openshift-dns/node-resolver-vnnct" Apr 17 14:07:08.688098 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.685938 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-node-log\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.688098 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.686604 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fd130f1a-180c-45df-b438-589d809223cc-host\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" Apr 17 14:07:08.688765 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.686643 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/01f1152c-e1c7-4d93-ad8f-877078fb7271-os-release\") pod \"multus-additional-cni-plugins-6gr7b\" (UID: \"01f1152c-e1c7-4d93-ad8f-877078fb7271\") " pod="openshift-multus/multus-additional-cni-plugins-6gr7b" Apr 17 14:07:08.688765 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.686687 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8vtr\" (UniqueName: \"kubernetes.io/projected/d93d961b-181a-4922-b8e6-299ecf0ab597-kube-api-access-p8vtr\") pod \"aws-ebs-csi-driver-node-vzk4x\" (UID: \"d93d961b-181a-4922-b8e6-299ecf0ab597\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzk4x" Apr 17 14:07:08.688765 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.686715 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-multus-daemon-config\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.688765 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.686765 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/53b66d5d-5930-40dd-8dcf-45ee6f0c9cf4-host-slash\") pod \"iptables-alerter-c28zr\" (UID: \"53b66d5d-5930-40dd-8dcf-45ee6f0c9cf4\") " pod="openshift-network-operator/iptables-alerter-c28zr" Apr 17 14:07:08.688765 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.686957 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-multus-cni-dir\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.688765 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.687002 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-host-run-k8s-cni-cncf-io\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.688765 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.687036 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-etc-openvswitch\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.688765 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.687083 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c5tt5\" (UniqueName: 
\"kubernetes.io/projected/83bdba87-08d8-4d12-9786-7c1ee404890a-kube-api-access-c5tt5\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.688765 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.687255 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d93d961b-181a-4922-b8e6-299ecf0ab597-etc-selinux\") pod \"aws-ebs-csi-driver-node-vzk4x\" (UID: \"d93d961b-181a-4922-b8e6-299ecf0ab597\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzk4x" Apr 17 14:07:08.688765 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.687504 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-multus-daemon-config\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.688765 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.688328 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-host-var-lib-kubelet\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.688765 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.688341 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-host-run-netns\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.688765 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.688368 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/fd130f1a-180c-45df-b438-589d809223cc-sys\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" Apr 17 14:07:08.688765 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.688388 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-host-cni-bin\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.688765 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.688401 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd130f1a-180c-45df-b438-589d809223cc-var-lib-kubelet\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" Apr 17 14:07:08.688765 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.688442 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-node-log\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.688765 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.688450 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d93d961b-181a-4922-b8e6-299ecf0ab597-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vzk4x\" (UID: \"d93d961b-181a-4922-b8e6-299ecf0ab597\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzk4x" Apr 17 14:07:08.689498 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.688498 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-system-cni-dir\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.689498 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.688572 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-host-var-lib-cni-bin\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.689498 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.688605 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvpgl\" (UniqueName: \"kubernetes.io/projected/4a41c39c-5f5e-4ad7-8c82-dac16f59266d-kube-api-access-kvpgl\") pod \"network-check-target-spfv9\" (UID: \"4a41c39c-5f5e-4ad7-8c82-dac16f59266d\") " pod="openshift-network-diagnostics/network-check-target-spfv9" Apr 17 14:07:08.689498 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.688641 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.689498 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.688672 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-host-kubelet\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.689498 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.688701 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-run-systemd\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.689498 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.688785 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-log-socket\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.689498 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.688818 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/01f1152c-e1c7-4d93-ad8f-877078fb7271-cnibin\") pod \"multus-additional-cni-plugins-6gr7b\" (UID: \"01f1152c-e1c7-4d93-ad8f-877078fb7271\") " pod="openshift-multus/multus-additional-cni-plugins-6gr7b" Apr 17 14:07:08.689498 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.688853 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9nbh\" (UniqueName: \"kubernetes.io/projected/8411d38d-222c-4fc3-9890-84fb21b47535-kube-api-access-f9nbh\") pod \"node-resolver-vnnct\" (UID: \"8411d38d-222c-4fc3-9890-84fb21b47535\") " pod="openshift-dns/node-resolver-vnnct" Apr 17 14:07:08.689498 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.688888 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-cnibin\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.689498 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.688936 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/83bdba87-08d8-4d12-9786-7c1ee404890a-env-overrides\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.689498 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.688919 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-cni-binary-copy\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.689498 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.689031 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-host-run-netns\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.690036 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.689715 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-run-systemd\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.690036 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.689743 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/01f1152c-e1c7-4d93-ad8f-877078fb7271-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6gr7b\" (UID: \"01f1152c-e1c7-4d93-ad8f-877078fb7271\") " pod="openshift-multus/multus-additional-cni-plugins-6gr7b" Apr 17 14:07:08.690036 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.689801 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fd130f1a-180c-45df-b438-589d809223cc-host\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" Apr 17 14:07:08.690036 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.689890 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-host-run-netns\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.690036 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.689941 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.690036 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.689983 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-host-kubelet\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.690036 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.689993 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/01f1152c-e1c7-4d93-ad8f-877078fb7271-os-release\") pod \"multus-additional-cni-plugins-6gr7b\" (UID: \"01f1152c-e1c7-4d93-ad8f-877078fb7271\") " pod="openshift-multus/multus-additional-cni-plugins-6gr7b" Apr 17 14:07:08.690036 ip-10-0-140-205 kubenswrapper[2578]: I0417 
14:07:08.690032 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-host-var-lib-kubelet\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.690359 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.690074 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-host-cni-netd\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.690359 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.690154 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-cnibin\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.690359 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.690238 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-log-socket\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.690359 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.690323 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fd130f1a-180c-45df-b438-589d809223cc-sys\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" Apr 17 14:07:08.690359 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.690352 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-host-run-k8s-cni-cncf-io\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.690605 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.690358 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-multus-cni-dir\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.690605 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.690376 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/01f1152c-e1c7-4d93-ad8f-877078fb7271-cnibin\") pod \"multus-additional-cni-plugins-6gr7b\" (UID: \"01f1152c-e1c7-4d93-ad8f-877078fb7271\") " pod="openshift-multus/multus-additional-cni-plugins-6gr7b" Apr 17 14:07:08.690605 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.690412 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/83bdba87-08d8-4d12-9786-7c1ee404890a-etc-openvswitch\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.690605 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.690448 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/01f1152c-e1c7-4d93-ad8f-877078fb7271-system-cni-dir\") pod \"multus-additional-cni-plugins-6gr7b\" (UID: \"01f1152c-e1c7-4d93-ad8f-877078fb7271\") " pod="openshift-multus/multus-additional-cni-plugins-6gr7b" Apr 17 14:07:08.690605 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.690476 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fd130f1a-180c-45df-b438-589d809223cc-lib-modules\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" Apr 17 14:07:08.690605 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.690077 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/01f1152c-e1c7-4d93-ad8f-877078fb7271-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6gr7b\" (UID: \"01f1152c-e1c7-4d93-ad8f-877078fb7271\") " pod="openshift-multus/multus-additional-cni-plugins-6gr7b" Apr 17 14:07:08.690886 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.690682 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-system-cni-dir\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.690886 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.690708 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/83bdba87-08d8-4d12-9786-7c1ee404890a-ovnkube-config\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.690886 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.690739 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-host-var-lib-cni-bin\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.691043 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.691000 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" 
(UniqueName: \"kubernetes.io/secret/76ab9c94-30da-41a0-954f-373b613d462f-agent-certs\") pod \"konnectivity-agent-4xvtj\" (UID: \"76ab9c94-30da-41a0-954f-373b613d462f\") " pod="kube-system/konnectivity-agent-4xvtj" Apr 17 14:07:08.691218 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.691104 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-cni-binary-copy\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.691218 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.691172 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd130f1a-180c-45df-b438-589d809223cc-var-lib-kubelet\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" Apr 17 14:07:08.691974 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.691951 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fd130f1a-180c-45df-b438-589d809223cc-etc-tuned\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" Apr 17 14:07:08.693017 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.692991 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/83bdba87-08d8-4d12-9786-7c1ee404890a-ovn-node-metrics-cert\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.693569 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.693516 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7g82\" (UniqueName: 
\"kubernetes.io/projected/4d8f653c-fe7e-4b15-8b7c-eaa897b23b78-kube-api-access-t7g82\") pod \"multus-wfn9n\" (UID: \"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78\") " pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.693728 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.693688 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fd130f1a-180c-45df-b438-589d809223cc-tmp\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" Apr 17 14:07:08.694449 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.694413 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z2v9\" (UniqueName: \"kubernetes.io/projected/01f1152c-e1c7-4d93-ad8f-877078fb7271-kube-api-access-2z2v9\") pod \"multus-additional-cni-plugins-6gr7b\" (UID: \"01f1152c-e1c7-4d93-ad8f-877078fb7271\") " pod="openshift-multus/multus-additional-cni-plugins-6gr7b" Apr 17 14:07:08.694651 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.694630 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lznj7\" (UniqueName: \"kubernetes.io/projected/fd130f1a-180c-45df-b438-589d809223cc-kube-api-access-lznj7\") pod \"tuned-dxxnk\" (UID: \"fd130f1a-180c-45df-b438-589d809223cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" Apr 17 14:07:08.695930 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:08.695908 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:07:08.696024 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:08.695938 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:07:08.696024 ip-10-0-140-205 kubenswrapper[2578]: E0417 
14:07:08.695951 2578 projected.go:194] Error preparing data for projected volume kube-api-access-kvpgl for pod openshift-network-diagnostics/network-check-target-spfv9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:07:08.696024 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:08.696004 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a41c39c-5f5e-4ad7-8c82-dac16f59266d-kube-api-access-kvpgl podName:4a41c39c-5f5e-4ad7-8c82-dac16f59266d nodeName:}" failed. No retries permitted until 2026-04-17 14:07:09.195986882 +0000 UTC m=+3.094320159 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-kvpgl" (UniqueName: "kubernetes.io/projected/4a41c39c-5f5e-4ad7-8c82-dac16f59266d-kube-api-access-kvpgl") pod "network-check-target-spfv9" (UID: "4a41c39c-5f5e-4ad7-8c82-dac16f59266d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:07:08.697753 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.697733 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxw2n\" (UniqueName: \"kubernetes.io/projected/fb36c782-6ab6-4f4d-91bc-6792c5debe41-kube-api-access-gxw2n\") pod \"node-ca-4wkj7\" (UID: \"fb36c782-6ab6-4f4d-91bc-6792c5debe41\") " pod="openshift-image-registry/node-ca-4wkj7" Apr 17 14:07:08.698555 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.698443 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9c6w\" (UniqueName: \"kubernetes.io/projected/53b66d5d-5930-40dd-8dcf-45ee6f0c9cf4-kube-api-access-q9c6w\") pod \"iptables-alerter-c28zr\" (UID: \"53b66d5d-5930-40dd-8dcf-45ee6f0c9cf4\") " pod="openshift-network-operator/iptables-alerter-c28zr" Apr 17 14:07:08.698555 ip-10-0-140-205 
kubenswrapper[2578]: I0417 14:07:08.698532 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-glm5v\" (UniqueName: \"kubernetes.io/projected/86e32c2b-962f-40a8-9171-5be908eeee49-kube-api-access-glm5v\") pod \"network-metrics-daemon-7nttt\" (UID: \"86e32c2b-962f-40a8-9171-5be908eeee49\") " pod="openshift-multus/network-metrics-daemon-7nttt" Apr 17 14:07:08.699629 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.699608 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5tt5\" (UniqueName: \"kubernetes.io/projected/83bdba87-08d8-4d12-9786-7c1ee404890a-kube-api-access-c5tt5\") pod \"ovnkube-node-5h9m4\" (UID: \"83bdba87-08d8-4d12-9786-7c1ee404890a\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.711864 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.711827 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-205.ec2.internal" event={"ID":"8aa172302d742d7d41f0d782f67d392d","Type":"ContainerStarted","Data":"4e476d65dc6b9a7667f5208965924d0f777fb8b1e0211898730044fd8886df25"} Apr 17 14:07:08.712808 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.712784 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-205.ec2.internal" event={"ID":"af0e8e30f34475421e9e054e83fe8eba","Type":"ContainerStarted","Data":"a921ab5b35fbbdcd0e73c8bf10bb7f735b1d0f480c4a4b34dc65f9d875161586"} Apr 17 14:07:08.790970 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.790932 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d93d961b-181a-4922-b8e6-299ecf0ab597-sys-fs\") pod \"aws-ebs-csi-driver-node-vzk4x\" (UID: \"d93d961b-181a-4922-b8e6-299ecf0ab597\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzk4x" Apr 17 14:07:08.790970 ip-10-0-140-205 kubenswrapper[2578]: I0417 
14:07:08.790975 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8411d38d-222c-4fc3-9890-84fb21b47535-hosts-file\") pod \"node-resolver-vnnct\" (UID: \"8411d38d-222c-4fc3-9890-84fb21b47535\") " pod="openshift-dns/node-resolver-vnnct" Apr 17 14:07:08.791176 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.790999 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p8vtr\" (UniqueName: \"kubernetes.io/projected/d93d961b-181a-4922-b8e6-299ecf0ab597-kube-api-access-p8vtr\") pod \"aws-ebs-csi-driver-node-vzk4x\" (UID: \"d93d961b-181a-4922-b8e6-299ecf0ab597\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzk4x" Apr 17 14:07:08.791176 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.791066 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d93d961b-181a-4922-b8e6-299ecf0ab597-sys-fs\") pod \"aws-ebs-csi-driver-node-vzk4x\" (UID: \"d93d961b-181a-4922-b8e6-299ecf0ab597\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzk4x" Apr 17 14:07:08.791176 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.791116 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d93d961b-181a-4922-b8e6-299ecf0ab597-etc-selinux\") pod \"aws-ebs-csi-driver-node-vzk4x\" (UID: \"d93d961b-181a-4922-b8e6-299ecf0ab597\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzk4x" Apr 17 14:07:08.791176 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.791144 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8411d38d-222c-4fc3-9890-84fb21b47535-hosts-file\") pod \"node-resolver-vnnct\" (UID: \"8411d38d-222c-4fc3-9890-84fb21b47535\") " pod="openshift-dns/node-resolver-vnnct" Apr 17 14:07:08.791176 
ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.791162 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d93d961b-181a-4922-b8e6-299ecf0ab597-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vzk4x\" (UID: \"d93d961b-181a-4922-b8e6-299ecf0ab597\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzk4x" Apr 17 14:07:08.791553 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.791212 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f9nbh\" (UniqueName: \"kubernetes.io/projected/8411d38d-222c-4fc3-9890-84fb21b47535-kube-api-access-f9nbh\") pod \"node-resolver-vnnct\" (UID: \"8411d38d-222c-4fc3-9890-84fb21b47535\") " pod="openshift-dns/node-resolver-vnnct" Apr 17 14:07:08.791553 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.791252 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d93d961b-181a-4922-b8e6-299ecf0ab597-etc-selinux\") pod \"aws-ebs-csi-driver-node-vzk4x\" (UID: \"d93d961b-181a-4922-b8e6-299ecf0ab597\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzk4x" Apr 17 14:07:08.791553 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.791261 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8411d38d-222c-4fc3-9890-84fb21b47535-tmp-dir\") pod \"node-resolver-vnnct\" (UID: \"8411d38d-222c-4fc3-9890-84fb21b47535\") " pod="openshift-dns/node-resolver-vnnct" Apr 17 14:07:08.791553 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.791287 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d93d961b-181a-4922-b8e6-299ecf0ab597-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vzk4x\" (UID: \"d93d961b-181a-4922-b8e6-299ecf0ab597\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzk4x" Apr 17 14:07:08.791553 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.791303 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d93d961b-181a-4922-b8e6-299ecf0ab597-registration-dir\") pod \"aws-ebs-csi-driver-node-vzk4x\" (UID: \"d93d961b-181a-4922-b8e6-299ecf0ab597\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzk4x" Apr 17 14:07:08.791553 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.791333 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d93d961b-181a-4922-b8e6-299ecf0ab597-device-dir\") pod \"aws-ebs-csi-driver-node-vzk4x\" (UID: \"d93d961b-181a-4922-b8e6-299ecf0ab597\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzk4x" Apr 17 14:07:08.791553 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.791389 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d93d961b-181a-4922-b8e6-299ecf0ab597-socket-dir\") pod \"aws-ebs-csi-driver-node-vzk4x\" (UID: \"d93d961b-181a-4922-b8e6-299ecf0ab597\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzk4x" Apr 17 14:07:08.791553 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.791406 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d93d961b-181a-4922-b8e6-299ecf0ab597-registration-dir\") pod \"aws-ebs-csi-driver-node-vzk4x\" (UID: \"d93d961b-181a-4922-b8e6-299ecf0ab597\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzk4x" Apr 17 14:07:08.791553 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.791485 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/d93d961b-181a-4922-b8e6-299ecf0ab597-device-dir\") pod \"aws-ebs-csi-driver-node-vzk4x\" (UID: \"d93d961b-181a-4922-b8e6-299ecf0ab597\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzk4x" Apr 17 14:07:08.791825 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.791556 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d93d961b-181a-4922-b8e6-299ecf0ab597-socket-dir\") pod \"aws-ebs-csi-driver-node-vzk4x\" (UID: \"d93d961b-181a-4922-b8e6-299ecf0ab597\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzk4x" Apr 17 14:07:08.791825 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.791607 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8411d38d-222c-4fc3-9890-84fb21b47535-tmp-dir\") pod \"node-resolver-vnnct\" (UID: \"8411d38d-222c-4fc3-9890-84fb21b47535\") " pod="openshift-dns/node-resolver-vnnct" Apr 17 14:07:08.798803 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.798753 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9nbh\" (UniqueName: \"kubernetes.io/projected/8411d38d-222c-4fc3-9890-84fb21b47535-kube-api-access-f9nbh\") pod \"node-resolver-vnnct\" (UID: \"8411d38d-222c-4fc3-9890-84fb21b47535\") " pod="openshift-dns/node-resolver-vnnct" Apr 17 14:07:08.798915 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.798833 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8vtr\" (UniqueName: \"kubernetes.io/projected/d93d961b-181a-4922-b8e6-299ecf0ab597-kube-api-access-p8vtr\") pod \"aws-ebs-csi-driver-node-vzk4x\" (UID: \"d93d961b-181a-4922-b8e6-299ecf0ab597\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzk4x" Apr 17 14:07:08.816782 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.816762 2578 reflector.go:430] "Caches populated" 
type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 14:07:08.870328 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.870301 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:08.878178 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.878157 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-4xvtj" Apr 17 14:07:08.888922 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.888898 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" Apr 17 14:07:08.893628 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.893608 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6gr7b" Apr 17 14:07:08.901416 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.901390 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4wkj7" Apr 17 14:07:08.909875 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.909849 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-c28zr" Apr 17 14:07:08.918499 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.918481 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wfn9n" Apr 17 14:07:08.926037 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.926018 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzk4x" Apr 17 14:07:08.931560 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:08.931539 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-vnnct" Apr 17 14:07:09.194329 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:09.194246 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e32c2b-962f-40a8-9171-5be908eeee49-metrics-certs\") pod \"network-metrics-daemon-7nttt\" (UID: \"86e32c2b-962f-40a8-9171-5be908eeee49\") " pod="openshift-multus/network-metrics-daemon-7nttt" Apr 17 14:07:09.194465 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:09.194413 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:07:09.194510 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:09.194492 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e32c2b-962f-40a8-9171-5be908eeee49-metrics-certs podName:86e32c2b-962f-40a8-9171-5be908eeee49 nodeName:}" failed. No retries permitted until 2026-04-17 14:07:10.194472847 +0000 UTC m=+4.092806119 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86e32c2b-962f-40a8-9171-5be908eeee49-metrics-certs") pod "network-metrics-daemon-7nttt" (UID: "86e32c2b-962f-40a8-9171-5be908eeee49") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:07:09.294916 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:09.294875 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvpgl\" (UniqueName: \"kubernetes.io/projected/4a41c39c-5f5e-4ad7-8c82-dac16f59266d-kube-api-access-kvpgl\") pod \"network-check-target-spfv9\" (UID: \"4a41c39c-5f5e-4ad7-8c82-dac16f59266d\") " pod="openshift-network-diagnostics/network-check-target-spfv9" Apr 17 14:07:09.295102 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:09.295021 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:07:09.295102 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:09.295037 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:07:09.295102 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:09.295047 2578 projected.go:194] Error preparing data for projected volume kube-api-access-kvpgl for pod openshift-network-diagnostics/network-check-target-spfv9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:07:09.295230 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:09.295117 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a41c39c-5f5e-4ad7-8c82-dac16f59266d-kube-api-access-kvpgl podName:4a41c39c-5f5e-4ad7-8c82-dac16f59266d nodeName:}" failed. 
No retries permitted until 2026-04-17 14:07:10.295097878 +0000 UTC m=+4.193431151 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-kvpgl" (UniqueName: "kubernetes.io/projected/4a41c39c-5f5e-4ad7-8c82-dac16f59266d-kube-api-access-kvpgl") pod "network-check-target-spfv9" (UID: "4a41c39c-5f5e-4ad7-8c82-dac16f59266d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:07:09.522929 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:09.522855 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8411d38d_222c_4fc3_9890_84fb21b47535.slice/crio-da398fcbefc4c589801e6c6e0ba85c230983cd4d95adce64f0a28d21f12f509a WatchSource:0}: Error finding container da398fcbefc4c589801e6c6e0ba85c230983cd4d95adce64f0a28d21f12f509a: Status 404 returned error can't find the container with id da398fcbefc4c589801e6c6e0ba85c230983cd4d95adce64f0a28d21f12f509a Apr 17 14:07:09.524248 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:09.524227 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01f1152c_e1c7_4d93_ad8f_877078fb7271.slice/crio-637d88b99aaa97a0f38ff1731d052c9c79d5a67832c63a8944fa057d61cde2d9 WatchSource:0}: Error finding container 637d88b99aaa97a0f38ff1731d052c9c79d5a67832c63a8944fa057d61cde2d9: Status 404 returned error can't find the container with id 637d88b99aaa97a0f38ff1731d052c9c79d5a67832c63a8944fa057d61cde2d9 Apr 17 14:07:09.527680 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:09.527656 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76ab9c94_30da_41a0_954f_373b613d462f.slice/crio-d323283a7a6ad00c1c14680c69d5029839a74387356f6c203a2632a1750c4a9a WatchSource:0}: Error finding container 
d323283a7a6ad00c1c14680c69d5029839a74387356f6c203a2632a1750c4a9a: Status 404 returned error can't find the container with id d323283a7a6ad00c1c14680c69d5029839a74387356f6c203a2632a1750c4a9a Apr 17 14:07:09.528524 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:09.528502 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd130f1a_180c_45df_b438_589d809223cc.slice/crio-4c6b472768e5bd4ac6ae99289a849ad402784349e8173384d77d3805bf34ace0 WatchSource:0}: Error finding container 4c6b472768e5bd4ac6ae99289a849ad402784349e8173384d77d3805bf34ace0: Status 404 returned error can't find the container with id 4c6b472768e5bd4ac6ae99289a849ad402784349e8173384d77d3805bf34ace0 Apr 17 14:07:09.529249 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:09.529229 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd93d961b_181a_4922_b8e6_299ecf0ab597.slice/crio-778bc4b4f13e8bca585048a3e12c52cee447a68e2177432741eb11767e0e1493 WatchSource:0}: Error finding container 778bc4b4f13e8bca585048a3e12c52cee447a68e2177432741eb11767e0e1493: Status 404 returned error can't find the container with id 778bc4b4f13e8bca585048a3e12c52cee447a68e2177432741eb11767e0e1493 Apr 17 14:07:09.530064 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:09.530011 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb36c782_6ab6_4f4d_91bc_6792c5debe41.slice/crio-ba3fe979e7b14b6c4d621e096e8c228929cd96e55454ffe8367b177aa40d1705 WatchSource:0}: Error finding container ba3fe979e7b14b6c4d621e096e8c228929cd96e55454ffe8367b177aa40d1705: Status 404 returned error can't find the container with id ba3fe979e7b14b6c4d621e096e8c228929cd96e55454ffe8367b177aa40d1705 Apr 17 14:07:09.532402 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:09.532027 2578 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83bdba87_08d8_4d12_9786_7c1ee404890a.slice/crio-d2fdb48219e38ab270d11b1c1cc33c2340581db5e8eb7a7fd1d0eeebd801d805 WatchSource:0}: Error finding container d2fdb48219e38ab270d11b1c1cc33c2340581db5e8eb7a7fd1d0eeebd801d805: Status 404 returned error can't find the container with id d2fdb48219e38ab270d11b1c1cc33c2340581db5e8eb7a7fd1d0eeebd801d805 Apr 17 14:07:09.533688 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:09.533591 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d8f653c_fe7e_4b15_8b7c_eaa897b23b78.slice/crio-3e3699145e008b4a53a6d71c4ab9c39c81cc306ff5616ea688e85754ebc5795f WatchSource:0}: Error finding container 3e3699145e008b4a53a6d71c4ab9c39c81cc306ff5616ea688e85754ebc5795f: Status 404 returned error can't find the container with id 3e3699145e008b4a53a6d71c4ab9c39c81cc306ff5616ea688e85754ebc5795f Apr 17 14:07:09.535759 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:09.535302 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53b66d5d_5930_40dd_8dcf_45ee6f0c9cf4.slice/crio-f18a136f1b1eba4caed6ea1f88dc6cd3a27a5f46ecaf21cb8c4dd283152cf750 WatchSource:0}: Error finding container f18a136f1b1eba4caed6ea1f88dc6cd3a27a5f46ecaf21cb8c4dd283152cf750: Status 404 returned error can't find the container with id f18a136f1b1eba4caed6ea1f88dc6cd3a27a5f46ecaf21cb8c4dd283152cf750 Apr 17 14:07:09.610237 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:09.610086 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 14:02:07 +0000 UTC" deadline="2027-10-14 15:58:23.734765517 +0000 UTC" Apr 17 14:07:09.610237 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:09.610232 2578 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="13081h51m14.124536686s" Apr 17 14:07:09.715656 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:09.715621 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" event={"ID":"83bdba87-08d8-4d12-9786-7c1ee404890a","Type":"ContainerStarted","Data":"d2fdb48219e38ab270d11b1c1cc33c2340581db5e8eb7a7fd1d0eeebd801d805"} Apr 17 14:07:09.716751 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:09.716727 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" event={"ID":"fd130f1a-180c-45df-b438-589d809223cc","Type":"ContainerStarted","Data":"4c6b472768e5bd4ac6ae99289a849ad402784349e8173384d77d3805bf34ace0"} Apr 17 14:07:09.717538 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:09.717521 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-4xvtj" event={"ID":"76ab9c94-30da-41a0-954f-373b613d462f","Type":"ContainerStarted","Data":"d323283a7a6ad00c1c14680c69d5029839a74387356f6c203a2632a1750c4a9a"} Apr 17 14:07:09.718902 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:09.718884 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-205.ec2.internal" event={"ID":"af0e8e30f34475421e9e054e83fe8eba","Type":"ContainerStarted","Data":"0619db47532d6f0c105bf9fe6bd4d88b506d92cb059ec8c238dc3507cdf3fbf4"} Apr 17 14:07:09.719919 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:09.719900 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wfn9n" event={"ID":"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78","Type":"ContainerStarted","Data":"3e3699145e008b4a53a6d71c4ab9c39c81cc306ff5616ea688e85754ebc5795f"} Apr 17 14:07:09.720699 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:09.720680 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4wkj7" 
event={"ID":"fb36c782-6ab6-4f4d-91bc-6792c5debe41","Type":"ContainerStarted","Data":"ba3fe979e7b14b6c4d621e096e8c228929cd96e55454ffe8367b177aa40d1705"} Apr 17 14:07:09.721558 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:09.721536 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzk4x" event={"ID":"d93d961b-181a-4922-b8e6-299ecf0ab597","Type":"ContainerStarted","Data":"778bc4b4f13e8bca585048a3e12c52cee447a68e2177432741eb11767e0e1493"} Apr 17 14:07:09.722472 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:09.722453 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6gr7b" event={"ID":"01f1152c-e1c7-4d93-ad8f-877078fb7271","Type":"ContainerStarted","Data":"637d88b99aaa97a0f38ff1731d052c9c79d5a67832c63a8944fa057d61cde2d9"} Apr 17 14:07:09.723414 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:09.723397 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-vnnct" event={"ID":"8411d38d-222c-4fc3-9890-84fb21b47535","Type":"ContainerStarted","Data":"da398fcbefc4c589801e6c6e0ba85c230983cd4d95adce64f0a28d21f12f509a"} Apr 17 14:07:09.724298 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:09.724273 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-c28zr" event={"ID":"53b66d5d-5930-40dd-8dcf-45ee6f0c9cf4","Type":"ContainerStarted","Data":"f18a136f1b1eba4caed6ea1f88dc6cd3a27a5f46ecaf21cb8c4dd283152cf750"} Apr 17 14:07:09.730911 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:09.730865 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-205.ec2.internal" podStartSLOduration=2.730854769 podStartE2EDuration="2.730854769s" podCreationTimestamp="2026-04-17 14:07:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-17 14:07:09.730577345 +0000 UTC m=+3.628910639" watchObservedRunningTime="2026-04-17 14:07:09.730854769 +0000 UTC m=+3.629188064" Apr 17 14:07:10.202314 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:10.202277 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e32c2b-962f-40a8-9171-5be908eeee49-metrics-certs\") pod \"network-metrics-daemon-7nttt\" (UID: \"86e32c2b-962f-40a8-9171-5be908eeee49\") " pod="openshift-multus/network-metrics-daemon-7nttt" Apr 17 14:07:10.202530 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:10.202456 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:07:10.202530 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:10.202530 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e32c2b-962f-40a8-9171-5be908eeee49-metrics-certs podName:86e32c2b-962f-40a8-9171-5be908eeee49 nodeName:}" failed. No retries permitted until 2026-04-17 14:07:12.202510378 +0000 UTC m=+6.100843669 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86e32c2b-962f-40a8-9171-5be908eeee49-metrics-certs") pod "network-metrics-daemon-7nttt" (UID: "86e32c2b-962f-40a8-9171-5be908eeee49") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:07:10.303303 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:10.303235 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvpgl\" (UniqueName: \"kubernetes.io/projected/4a41c39c-5f5e-4ad7-8c82-dac16f59266d-kube-api-access-kvpgl\") pod \"network-check-target-spfv9\" (UID: \"4a41c39c-5f5e-4ad7-8c82-dac16f59266d\") " pod="openshift-network-diagnostics/network-check-target-spfv9" Apr 17 14:07:10.303550 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:10.303446 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:07:10.303550 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:10.303465 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:07:10.303550 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:10.303479 2578 projected.go:194] Error preparing data for projected volume kube-api-access-kvpgl for pod openshift-network-diagnostics/network-check-target-spfv9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:07:10.303550 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:10.303538 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a41c39c-5f5e-4ad7-8c82-dac16f59266d-kube-api-access-kvpgl podName:4a41c39c-5f5e-4ad7-8c82-dac16f59266d nodeName:}" failed. 
No retries permitted until 2026-04-17 14:07:12.303521258 +0000 UTC m=+6.201854544 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-kvpgl" (UniqueName: "kubernetes.io/projected/4a41c39c-5f5e-4ad7-8c82-dac16f59266d-kube-api-access-kvpgl") pod "network-check-target-spfv9" (UID: "4a41c39c-5f5e-4ad7-8c82-dac16f59266d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:07:10.709698 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:10.709656 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7nttt" Apr 17 14:07:10.710117 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:10.709810 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7nttt" podUID="86e32c2b-962f-40a8-9171-5be908eeee49" Apr 17 14:07:10.710560 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:10.710533 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spfv9" Apr 17 14:07:10.718037 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:10.710636 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-spfv9" podUID="4a41c39c-5f5e-4ad7-8c82-dac16f59266d" Apr 17 14:07:10.740004 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:10.739892 2578 generic.go:358] "Generic (PLEG): container finished" podID="8aa172302d742d7d41f0d782f67d392d" containerID="2c3890a4b0e463284e508c38013a47699c2c9e5b559c66efae2f340e889b0c2b" exitCode=0 Apr 17 14:07:10.740004 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:10.739974 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-205.ec2.internal" event={"ID":"8aa172302d742d7d41f0d782f67d392d","Type":"ContainerDied","Data":"2c3890a4b0e463284e508c38013a47699c2c9e5b559c66efae2f340e889b0c2b"} Apr 17 14:07:11.745939 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:11.745900 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-205.ec2.internal" event={"ID":"8aa172302d742d7d41f0d782f67d392d","Type":"ContainerStarted","Data":"e321a15f78eb6e37bc09c552fadab701f813fc3f60fc1892eebcb9cab1ecb496"} Apr 17 14:07:11.759261 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:11.759216 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-205.ec2.internal" podStartSLOduration=4.759197452 podStartE2EDuration="4.759197452s" podCreationTimestamp="2026-04-17 14:07:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:07:11.758249656 +0000 UTC m=+5.656582951" watchObservedRunningTime="2026-04-17 14:07:11.759197452 +0000 UTC m=+5.657530751" Apr 17 14:07:12.219478 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:12.219361 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/86e32c2b-962f-40a8-9171-5be908eeee49-metrics-certs\") pod \"network-metrics-daemon-7nttt\" (UID: \"86e32c2b-962f-40a8-9171-5be908eeee49\") " pod="openshift-multus/network-metrics-daemon-7nttt" Apr 17 14:07:12.219642 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:12.219535 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:07:12.219642 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:12.219619 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e32c2b-962f-40a8-9171-5be908eeee49-metrics-certs podName:86e32c2b-962f-40a8-9171-5be908eeee49 nodeName:}" failed. No retries permitted until 2026-04-17 14:07:16.219595757 +0000 UTC m=+10.117929051 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86e32c2b-962f-40a8-9171-5be908eeee49-metrics-certs") pod "network-metrics-daemon-7nttt" (UID: "86e32c2b-962f-40a8-9171-5be908eeee49") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:07:12.320645 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:12.320606 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvpgl\" (UniqueName: \"kubernetes.io/projected/4a41c39c-5f5e-4ad7-8c82-dac16f59266d-kube-api-access-kvpgl\") pod \"network-check-target-spfv9\" (UID: \"4a41c39c-5f5e-4ad7-8c82-dac16f59266d\") " pod="openshift-network-diagnostics/network-check-target-spfv9" Apr 17 14:07:12.320809 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:12.320741 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:07:12.320809 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:12.320770 2578 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:07:12.320809 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:12.320784 2578 projected.go:194] Error preparing data for projected volume kube-api-access-kvpgl for pod openshift-network-diagnostics/network-check-target-spfv9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:07:12.320969 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:12.320848 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a41c39c-5f5e-4ad7-8c82-dac16f59266d-kube-api-access-kvpgl podName:4a41c39c-5f5e-4ad7-8c82-dac16f59266d nodeName:}" failed. No retries permitted until 2026-04-17 14:07:16.320831348 +0000 UTC m=+10.219164621 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-kvpgl" (UniqueName: "kubernetes.io/projected/4a41c39c-5f5e-4ad7-8c82-dac16f59266d-kube-api-access-kvpgl") pod "network-check-target-spfv9" (UID: "4a41c39c-5f5e-4ad7-8c82-dac16f59266d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:07:12.709235 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:12.709153 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spfv9" Apr 17 14:07:12.709235 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:12.709195 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7nttt" Apr 17 14:07:12.709455 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:12.709289 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spfv9" podUID="4a41c39c-5f5e-4ad7-8c82-dac16f59266d" Apr 17 14:07:12.709455 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:12.709414 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7nttt" podUID="86e32c2b-962f-40a8-9171-5be908eeee49" Apr 17 14:07:14.708327 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:14.708292 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7nttt" Apr 17 14:07:14.708792 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:14.708335 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spfv9" Apr 17 14:07:14.708792 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:14.708460 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7nttt" podUID="86e32c2b-962f-40a8-9171-5be908eeee49" Apr 17 14:07:14.708792 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:14.708521 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spfv9" podUID="4a41c39c-5f5e-4ad7-8c82-dac16f59266d" Apr 17 14:07:15.753027 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:15.752989 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4wkj7" event={"ID":"fb36c782-6ab6-4f4d-91bc-6792c5debe41","Type":"ContainerStarted","Data":"4410476a1b52d133495f0d6423159e9b6afcf6847e4396b02d2ad347cde60f7f"} Apr 17 14:07:15.754475 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:15.754443 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzk4x" event={"ID":"d93d961b-181a-4922-b8e6-299ecf0ab597","Type":"ContainerStarted","Data":"f5c5e3f572bfb6a945013ccfdb4c382df027b53c86472376a15b89c730f3608d"} Apr 17 14:07:15.755809 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:15.755779 2578 generic.go:358] "Generic (PLEG): container finished" podID="01f1152c-e1c7-4d93-ad8f-877078fb7271" containerID="b95125894b49e7ba3f8ead8343c785fb8ed4020b8107d7dab51e905f5d491868" exitCode=0 Apr 17 14:07:15.755930 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:15.755854 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6gr7b" event={"ID":"01f1152c-e1c7-4d93-ad8f-877078fb7271","Type":"ContainerDied","Data":"b95125894b49e7ba3f8ead8343c785fb8ed4020b8107d7dab51e905f5d491868"} Apr 17 14:07:15.757411 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:15.757385 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-vnnct" event={"ID":"8411d38d-222c-4fc3-9890-84fb21b47535","Type":"ContainerStarted","Data":"b8b04287454929ebbffed9797d4a260846cb7a9ae51a6475180926248be661ea"} Apr 17 14:07:15.780488 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:15.780450 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4wkj7" podStartSLOduration=4.179020535 podStartE2EDuration="9.780438265s" podCreationTimestamp="2026-04-17 14:07:06 +0000 UTC" firstStartedPulling="2026-04-17 14:07:09.53490202 +0000 UTC m=+3.433235306" lastFinishedPulling="2026-04-17 14:07:15.136319749 +0000 UTC m=+9.034653036" observedRunningTime="2026-04-17 14:07:15.764540298 +0000 UTC m=+9.662873591" watchObservedRunningTime="2026-04-17 14:07:15.780438265 +0000 UTC m=+9.678771554" Apr 17 14:07:15.794079 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:15.794037 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-vnnct" podStartSLOduration=3.180734231 podStartE2EDuration="8.794027596s" podCreationTimestamp="2026-04-17 14:07:07 +0000 UTC" firstStartedPulling="2026-04-17 14:07:09.524963032 +0000 UTC m=+3.423296315" lastFinishedPulling="2026-04-17 14:07:15.138256391 +0000 UTC m=+9.036589680" observedRunningTime="2026-04-17 14:07:15.793828106 +0000 UTC m=+9.692161400" watchObservedRunningTime="2026-04-17 14:07:15.794027596 +0000 UTC m=+9.692360891" Apr 17 14:07:16.245364 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:16.245331 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e32c2b-962f-40a8-9171-5be908eeee49-metrics-certs\") pod \"network-metrics-daemon-7nttt\" (UID: \"86e32c2b-962f-40a8-9171-5be908eeee49\") " pod="openshift-multus/network-metrics-daemon-7nttt" Apr 17 14:07:16.245571 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:16.245500 2578 
secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:07:16.245636 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:16.245575 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e32c2b-962f-40a8-9171-5be908eeee49-metrics-certs podName:86e32c2b-962f-40a8-9171-5be908eeee49 nodeName:}" failed. No retries permitted until 2026-04-17 14:07:24.245554437 +0000 UTC m=+18.143887725 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86e32c2b-962f-40a8-9171-5be908eeee49-metrics-certs") pod "network-metrics-daemon-7nttt" (UID: "86e32c2b-962f-40a8-9171-5be908eeee49") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:07:16.345804 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:16.345769 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvpgl\" (UniqueName: \"kubernetes.io/projected/4a41c39c-5f5e-4ad7-8c82-dac16f59266d-kube-api-access-kvpgl\") pod \"network-check-target-spfv9\" (UID: \"4a41c39c-5f5e-4ad7-8c82-dac16f59266d\") " pod="openshift-network-diagnostics/network-check-target-spfv9" Apr 17 14:07:16.345976 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:16.345924 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:07:16.345976 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:16.345949 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:07:16.345976 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:16.345965 2578 projected.go:194] Error preparing data for projected volume kube-api-access-kvpgl for pod 
openshift-network-diagnostics/network-check-target-spfv9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:07:16.346146 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:16.346032 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a41c39c-5f5e-4ad7-8c82-dac16f59266d-kube-api-access-kvpgl podName:4a41c39c-5f5e-4ad7-8c82-dac16f59266d nodeName:}" failed. No retries permitted until 2026-04-17 14:07:24.346012064 +0000 UTC m=+18.244345351 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-kvpgl" (UniqueName: "kubernetes.io/projected/4a41c39c-5f5e-4ad7-8c82-dac16f59266d-kube-api-access-kvpgl") pod "network-check-target-spfv9" (UID: "4a41c39c-5f5e-4ad7-8c82-dac16f59266d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:07:16.538038 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:16.537815 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 14:07:16.645843 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:16.645726 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T14:07:16.538018431Z","UUID":"f6ef1245-7719-4c7b-8063-39181c2432b6","Handler":null,"Name":"","Endpoint":""} Apr 17 14:07:16.648167 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:16.648126 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 14:07:16.648167 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:16.648165 2578 csi_plugin.go:119] 
kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 14:07:16.709894 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:16.709866 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7nttt" Apr 17 14:07:16.710032 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:16.709979 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spfv9" Apr 17 14:07:16.710087 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:16.710017 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7nttt" podUID="86e32c2b-962f-40a8-9171-5be908eeee49" Apr 17 14:07:16.710087 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:16.710056 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-spfv9" podUID="4a41c39c-5f5e-4ad7-8c82-dac16f59266d" Apr 17 14:07:16.761222 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:16.761137 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzk4x" event={"ID":"d93d961b-181a-4922-b8e6-299ecf0ab597","Type":"ContainerStarted","Data":"d813a1484ca1fad29a24ccd63cc7438525241cce1d185a58c5d9fb3379059ad6"} Apr 17 14:07:16.762633 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:16.762564 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-c28zr" event={"ID":"53b66d5d-5930-40dd-8dcf-45ee6f0c9cf4","Type":"ContainerStarted","Data":"8e0980d4c2a28968bc55cc1ff1a859272bcde988b43ea9e98ca9360d2a245bfa"} Apr 17 14:07:16.774349 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:16.774292 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-c28zr" podStartSLOduration=5.165174643 podStartE2EDuration="10.774278808s" podCreationTimestamp="2026-04-17 14:07:06 +0000 UTC" firstStartedPulling="2026-04-17 14:07:09.536626557 +0000 UTC m=+3.434959846" lastFinishedPulling="2026-04-17 14:07:15.145730728 +0000 UTC m=+9.044064011" observedRunningTime="2026-04-17 14:07:16.774005143 +0000 UTC m=+10.672338441" watchObservedRunningTime="2026-04-17 14:07:16.774278808 +0000 UTC m=+10.672612104" Apr 17 14:07:17.768053 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:17.767958 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzk4x" event={"ID":"d93d961b-181a-4922-b8e6-299ecf0ab597","Type":"ContainerStarted","Data":"9825438bb577da9a680b0bd373b5cd1f56a639c30e397b957af9c625a8401472"} Apr 17 14:07:18.709229 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:18.708608 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spfv9" Apr 17 14:07:18.709229 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:18.708725 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spfv9" podUID="4a41c39c-5f5e-4ad7-8c82-dac16f59266d" Apr 17 14:07:18.709229 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:18.709086 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7nttt" Apr 17 14:07:18.709229 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:18.709182 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7nttt" podUID="86e32c2b-962f-40a8-9171-5be908eeee49" Apr 17 14:07:19.730981 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:19.730932 2578 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8f2123db9953f98da0e43c266e6a8a070bf221533b995f87b7e358cc7498ca6d: pinging container registry quay.io: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8f2123db9953f98da0e43c266e6a8a070bf221533b995f87b7e358cc7498ca6d" Apr 17 14:07:19.731620 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:19.731184 2578 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:tuned,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8f2123db9953f98da0e43c266e6a8a070bf221533b995f87b7e358cc7498ca6d,Command:[/usr/bin/cluster-node-tuning-operator ocp-tuned --in-cluster -v=0],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OCP_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:RESYNC_PERIOD,Value:600,ValueFrom:nil,},EnvVar{Name:RELEASE_VERSION,Value:4.20.19,ValueFrom:nil,},EnvVar{Name:CLUSTER_NODE_TUNED_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8f2123db9953f98da0e43c266e6a8a070bf221533b995f87b7e358cc7498ca6d,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-modprobe-d,ReadOnly:false,MountPath:/etc/modprobe.d,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-sysconfig,ReadOnly:false,MountPath:/etc/sysconfig,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-sysctl-d,ReadOnly:true,MountPath:/etc/sysctl.d,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-sysctl-conf,ReadOnly:true,MountPath:/etc/sysctl.conf,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-systemd,ReadOnly:false,MountPath:/etc/systemd,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-tuned,ReadOnly:false,MountPath:/etc/tuned,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run,ReadOnly:false,MountPath:/run,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:sys,ReadOnly:false,MountPath:/sys,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tmp,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:lib-modules,ReadOnly:true,MountPath:/lib/modules,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-lib-kubelet,ReadOnly:true,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lznj7,ReadOnly:true,MountPath:/var/run/secrets/kubern
etes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tuned-dxxnk_openshift-cluster-node-tuning-operator(fd130f1a-180c-45df-b438-589d809223cc): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8f2123db9953f98da0e43c266e6a8a070bf221533b995f87b7e358cc7498ca6d: pinging container registry quay.io: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" logger="UnhandledError" Apr 17 14:07:19.732397 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:19.732364 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tuned\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8f2123db9953f98da0e43c266e6a8a070bf221533b995f87b7e358cc7498ca6d: pinging container registry quay.io: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" podUID="fd130f1a-180c-45df-b438-589d809223cc" Apr 17 14:07:19.772088 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:19.772050 2578 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"tuned\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8f2123db9953f98da0e43c266e6a8a070bf221533b995f87b7e358cc7498ca6d\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8f2123db9953f98da0e43c266e6a8a070bf221533b995f87b7e358cc7498ca6d: pinging container registry quay.io: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" podUID="fd130f1a-180c-45df-b438-589d809223cc" Apr 17 14:07:19.784999 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:19.784948 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vzk4x" podStartSLOduration=4.866947553 podStartE2EDuration="12.784931371s" podCreationTimestamp="2026-04-17 14:07:07 +0000 UTC" firstStartedPulling="2026-04-17 14:07:09.532311353 +0000 UTC m=+3.430644639" lastFinishedPulling="2026-04-17 14:07:17.45029517 +0000 UTC m=+11.348628457" observedRunningTime="2026-04-17 14:07:17.781990117 +0000 UTC m=+11.680323433" watchObservedRunningTime="2026-04-17 14:07:19.784931371 +0000 UTC m=+13.683264667" Apr 17 14:07:19.919705 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:19.919617 2578 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: copying system image from manifest list: parsing image configuration: fetching blob: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2f9b333f47df911dd05a9a10cb8ea988e97d3174490b348b8e46a161d9581d64" Apr 17 14:07:19.919868 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:19.919825 2578 
kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:konnectivity-agent,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2f9b333f47df911dd05a9a10cb8ea988e97d3174490b348b8e46a161d9581d64,Command:[/usr/bin/proxy-agent],Args:[--logtostderr=true --ca-cert /etc/konnectivity/ca/ca.crt --agent-cert /etc/konnectivity/agent/tls.crt --agent-key /etc/konnectivity/agent/tls.key --proxy-server-host konnectivity-server-clusters-5c9c73bc-9f92-4c37-b1f5--e13dad96.apps.kflux-prd-es01.1ion.p1.openshiftapps.com --proxy-server-port 443 --health-server-port 2041 --agent-identifiers=default-route=true --keepalive-time 30s --probe-interval 5s --sync-interval 5s --sync-interval-cap 30s --v 3],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:HTTP_PROXY,Value:,ValueFrom:nil,},EnvVar{Name:HTTPS_PROXY,Value:,ValueFrom:nil,},EnvVar{Name:NO_PROXY,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{40 -3} {} 40m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:agent-certs,ReadOnly:false,MountPath:/etc/konnectivity/agent,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:konnectivity-ca,ReadOnly:false,MountPath:/etc/konnectivity/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:healthz,Port:{0 2041 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:readyz,Port:{0 2041 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:1,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:healthz,Port:{0 2041 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:60,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod konnectivity-agent-4xvtj_kube-system(76ab9c94-30da-41a0-954f-373b613d462f): ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: parsing image configuration: fetching blob: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" logger="UnhandledError" Apr 17 14:07:19.921010 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:19.920972 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"konnectivity-agent\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: copying system image from manifest list: parsing image configuration: fetching blob: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="kube-system/konnectivity-agent-4xvtj" 
podUID="76ab9c94-30da-41a0-954f-373b613d462f" Apr 17 14:07:20.708761 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:20.708729 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spfv9" Apr 17 14:07:20.708963 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:20.708729 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7nttt" Apr 17 14:07:20.708963 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:20.708861 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spfv9" podUID="4a41c39c-5f5e-4ad7-8c82-dac16f59266d" Apr 17 14:07:20.708963 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:20.708929 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7nttt" podUID="86e32c2b-962f-40a8-9171-5be908eeee49" Apr 17 14:07:20.773528 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:20.773489 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"konnectivity-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2f9b333f47df911dd05a9a10cb8ea988e97d3174490b348b8e46a161d9581d64\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: parsing image configuration: fetching blob: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="kube-system/konnectivity-agent-4xvtj" podUID="76ab9c94-30da-41a0-954f-373b613d462f" Apr 17 14:07:22.709224 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:22.709136 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7nttt" Apr 17 14:07:22.709224 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:22.709160 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spfv9" Apr 17 14:07:22.709879 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:22.709275 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7nttt" podUID="86e32c2b-962f-40a8-9171-5be908eeee49" Apr 17 14:07:22.709879 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:22.709396 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spfv9" podUID="4a41c39c-5f5e-4ad7-8c82-dac16f59266d" Apr 17 14:07:24.308334 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:24.308297 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e32c2b-962f-40a8-9171-5be908eeee49-metrics-certs\") pod \"network-metrics-daemon-7nttt\" (UID: \"86e32c2b-962f-40a8-9171-5be908eeee49\") " pod="openshift-multus/network-metrics-daemon-7nttt" Apr 17 14:07:24.308767 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:24.308473 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:07:24.308767 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:24.308547 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e32c2b-962f-40a8-9171-5be908eeee49-metrics-certs podName:86e32c2b-962f-40a8-9171-5be908eeee49 nodeName:}" failed. No retries permitted until 2026-04-17 14:07:40.308525884 +0000 UTC m=+34.206859159 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86e32c2b-962f-40a8-9171-5be908eeee49-metrics-certs") pod "network-metrics-daemon-7nttt" (UID: "86e32c2b-962f-40a8-9171-5be908eeee49") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:07:24.409640 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:24.409599 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvpgl\" (UniqueName: \"kubernetes.io/projected/4a41c39c-5f5e-4ad7-8c82-dac16f59266d-kube-api-access-kvpgl\") pod \"network-check-target-spfv9\" (UID: \"4a41c39c-5f5e-4ad7-8c82-dac16f59266d\") " pod="openshift-network-diagnostics/network-check-target-spfv9" Apr 17 14:07:24.409780 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:24.409760 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:07:24.409780 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:24.409779 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:07:24.409864 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:24.409791 2578 projected.go:194] Error preparing data for projected volume kube-api-access-kvpgl for pod openshift-network-diagnostics/network-check-target-spfv9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:07:24.409864 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:24.409845 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a41c39c-5f5e-4ad7-8c82-dac16f59266d-kube-api-access-kvpgl podName:4a41c39c-5f5e-4ad7-8c82-dac16f59266d nodeName:}" failed. 
No retries permitted until 2026-04-17 14:07:40.409827903 +0000 UTC m=+34.308161180 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-kvpgl" (UniqueName: "kubernetes.io/projected/4a41c39c-5f5e-4ad7-8c82-dac16f59266d-kube-api-access-kvpgl") pod "network-check-target-spfv9" (UID: "4a41c39c-5f5e-4ad7-8c82-dac16f59266d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:07:24.708792 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:24.708705 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spfv9" Apr 17 14:07:24.708947 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:24.708831 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spfv9" podUID="4a41c39c-5f5e-4ad7-8c82-dac16f59266d" Apr 17 14:07:24.708947 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:24.708897 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7nttt" Apr 17 14:07:24.709036 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:24.709014 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7nttt" podUID="86e32c2b-962f-40a8-9171-5be908eeee49" Apr 17 14:07:26.709990 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:26.709952 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spfv9" Apr 17 14:07:26.710497 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:26.710064 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spfv9" podUID="4a41c39c-5f5e-4ad7-8c82-dac16f59266d" Apr 17 14:07:26.710497 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:26.710104 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7nttt" Apr 17 14:07:26.710497 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:26.710233 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7nttt" podUID="86e32c2b-962f-40a8-9171-5be908eeee49" Apr 17 14:07:28.708708 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:28.708674 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7nttt" Apr 17 14:07:28.709101 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:28.708712 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spfv9" Apr 17 14:07:28.709101 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:28.708821 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7nttt" podUID="86e32c2b-962f-40a8-9171-5be908eeee49" Apr 17 14:07:28.709101 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:28.708916 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spfv9" podUID="4a41c39c-5f5e-4ad7-8c82-dac16f59266d" Apr 17 14:07:29.790092 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:29.789753 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" event={"ID":"83bdba87-08d8-4d12-9786-7c1ee404890a","Type":"ContainerStarted","Data":"e9415cef21ed225e5fd9afc9498795c665f50e5701c576789a2d891c296de422"} Apr 17 14:07:29.790647 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:29.790106 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" event={"ID":"83bdba87-08d8-4d12-9786-7c1ee404890a","Type":"ContainerStarted","Data":"ae24c42a76ccd3c4a4b581d0c20d25f00720e7eb8f32711781740e215d413cfd"} Apr 17 14:07:29.790647 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:29.790121 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" 
event={"ID":"83bdba87-08d8-4d12-9786-7c1ee404890a","Type":"ContainerStarted","Data":"b423e5d9fed51177dc341746c32df82fc92a285c2e10f1fcd955040226fe0916"} Apr 17 14:07:29.790647 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:29.790132 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" event={"ID":"83bdba87-08d8-4d12-9786-7c1ee404890a","Type":"ContainerStarted","Data":"c91d0a59aed30bcbe03b04e8e25c86c3fef4844de77097ed7be3b77b2a5c6b8a"} Apr 17 14:07:29.790647 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:29.790144 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" event={"ID":"83bdba87-08d8-4d12-9786-7c1ee404890a","Type":"ContainerStarted","Data":"e42fbe4b23536847c9958e9493796a11d1b51280d2650e5a622874367f068940"} Apr 17 14:07:29.790647 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:29.790155 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" event={"ID":"83bdba87-08d8-4d12-9786-7c1ee404890a","Type":"ContainerStarted","Data":"d0837a26cfac80e8a5a14538efcba42786dce442d17871bca8bf946056bbaea8"} Apr 17 14:07:29.790897 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:29.790874 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wfn9n" event={"ID":"4d8f653c-fe7e-4b15-8b7c-eaa897b23b78","Type":"ContainerStarted","Data":"4f3d3868433dacfe7efb4ddd5aea2884b45c17dee7561303eed6ccae6f26f32b"} Apr 17 14:07:29.792314 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:29.792292 2578 generic.go:358] "Generic (PLEG): container finished" podID="01f1152c-e1c7-4d93-ad8f-877078fb7271" containerID="6a1ad8a44f44d4a1357e45e6f42c7021ca2750ba1522a3803deaebc1e1e7f3fb" exitCode=0 Apr 17 14:07:29.792413 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:29.792323 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6gr7b" 
event={"ID":"01f1152c-e1c7-4d93-ad8f-877078fb7271","Type":"ContainerDied","Data":"6a1ad8a44f44d4a1357e45e6f42c7021ca2750ba1522a3803deaebc1e1e7f3fb"} Apr 17 14:07:29.817974 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:29.817938 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-wfn9n" podStartSLOduration=4.133822757 podStartE2EDuration="23.817928382s" podCreationTimestamp="2026-04-17 14:07:06 +0000 UTC" firstStartedPulling="2026-04-17 14:07:09.536349759 +0000 UTC m=+3.434683034" lastFinishedPulling="2026-04-17 14:07:29.220455372 +0000 UTC m=+23.118788659" observedRunningTime="2026-04-17 14:07:29.817690624 +0000 UTC m=+23.716023918" watchObservedRunningTime="2026-04-17 14:07:29.817928382 +0000 UTC m=+23.716261827" Apr 17 14:07:30.709296 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:30.709270 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7nttt" Apr 17 14:07:30.709394 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:30.709270 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spfv9" Apr 17 14:07:30.709458 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:30.709405 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7nttt" podUID="86e32c2b-962f-40a8-9171-5be908eeee49" Apr 17 14:07:30.709458 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:30.709450 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spfv9" podUID="4a41c39c-5f5e-4ad7-8c82-dac16f59266d" Apr 17 14:07:30.795992 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:30.795965 2578 generic.go:358] "Generic (PLEG): container finished" podID="01f1152c-e1c7-4d93-ad8f-877078fb7271" containerID="0e34a69850de64815bbc092aca411d4de8ad4f5d215ddb6add7be53b7c2ac4d6" exitCode=0 Apr 17 14:07:30.796410 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:30.796045 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6gr7b" event={"ID":"01f1152c-e1c7-4d93-ad8f-877078fb7271","Type":"ContainerDied","Data":"0e34a69850de64815bbc092aca411d4de8ad4f5d215ddb6add7be53b7c2ac4d6"} Apr 17 14:07:31.424503 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:31.424473 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-bdklc"] Apr 17 14:07:31.427280 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:31.427254 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bdklc" Apr 17 14:07:31.427411 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:31.427325 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-bdklc" podUID="561daaa7-247f-4820-939b-375dd242bbf3" Apr 17 14:07:31.563126 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:31.563101 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/561daaa7-247f-4820-939b-375dd242bbf3-dbus\") pod \"global-pull-secret-syncer-bdklc\" (UID: \"561daaa7-247f-4820-939b-375dd242bbf3\") " pod="kube-system/global-pull-secret-syncer-bdklc" Apr 17 14:07:31.563245 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:31.563138 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/561daaa7-247f-4820-939b-375dd242bbf3-kubelet-config\") pod \"global-pull-secret-syncer-bdklc\" (UID: \"561daaa7-247f-4820-939b-375dd242bbf3\") " pod="kube-system/global-pull-secret-syncer-bdklc" Apr 17 14:07:31.563245 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:31.563190 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/561daaa7-247f-4820-939b-375dd242bbf3-original-pull-secret\") pod \"global-pull-secret-syncer-bdklc\" (UID: \"561daaa7-247f-4820-939b-375dd242bbf3\") " pod="kube-system/global-pull-secret-syncer-bdklc" Apr 17 14:07:31.663686 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:31.663665 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/561daaa7-247f-4820-939b-375dd242bbf3-dbus\") pod \"global-pull-secret-syncer-bdklc\" (UID: \"561daaa7-247f-4820-939b-375dd242bbf3\") " pod="kube-system/global-pull-secret-syncer-bdklc" Apr 17 14:07:31.663757 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:31.663695 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/561daaa7-247f-4820-939b-375dd242bbf3-kubelet-config\") pod \"global-pull-secret-syncer-bdklc\" (UID: \"561daaa7-247f-4820-939b-375dd242bbf3\") " pod="kube-system/global-pull-secret-syncer-bdklc" Apr 17 14:07:31.663757 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:31.663732 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/561daaa7-247f-4820-939b-375dd242bbf3-original-pull-secret\") pod \"global-pull-secret-syncer-bdklc\" (UID: \"561daaa7-247f-4820-939b-375dd242bbf3\") " pod="kube-system/global-pull-secret-syncer-bdklc" Apr 17 14:07:31.663819 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:31.663788 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/561daaa7-247f-4820-939b-375dd242bbf3-kubelet-config\") pod \"global-pull-secret-syncer-bdklc\" (UID: \"561daaa7-247f-4820-939b-375dd242bbf3\") " pod="kube-system/global-pull-secret-syncer-bdklc" Apr 17 14:07:31.663860 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:31.663848 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 14:07:31.663905 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:31.663897 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/561daaa7-247f-4820-939b-375dd242bbf3-original-pull-secret podName:561daaa7-247f-4820-939b-375dd242bbf3 nodeName:}" failed. No retries permitted until 2026-04-17 14:07:32.163881057 +0000 UTC m=+26.062214330 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/561daaa7-247f-4820-939b-375dd242bbf3-original-pull-secret") pod "global-pull-secret-syncer-bdklc" (UID: "561daaa7-247f-4820-939b-375dd242bbf3") : object "kube-system"/"original-pull-secret" not registered Apr 17 14:07:31.663969 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:31.663947 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/561daaa7-247f-4820-939b-375dd242bbf3-dbus\") pod \"global-pull-secret-syncer-bdklc\" (UID: \"561daaa7-247f-4820-939b-375dd242bbf3\") " pod="kube-system/global-pull-secret-syncer-bdklc" Apr 17 14:07:31.800616 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:31.800589 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" event={"ID":"83bdba87-08d8-4d12-9786-7c1ee404890a","Type":"ContainerStarted","Data":"e34b9c468232004c44b24ccf90f402e285fa5d00abf08bb0ca63c0b5c5957a82"} Apr 17 14:07:31.802550 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:31.802528 2578 generic.go:358] "Generic (PLEG): container finished" podID="01f1152c-e1c7-4d93-ad8f-877078fb7271" containerID="800332e88d745b684be2fdeca1d7bc09919f151e66e56ad8afbeba5a4d7c4f83" exitCode=0 Apr 17 14:07:31.802654 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:31.802573 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6gr7b" event={"ID":"01f1152c-e1c7-4d93-ad8f-877078fb7271","Type":"ContainerDied","Data":"800332e88d745b684be2fdeca1d7bc09919f151e66e56ad8afbeba5a4d7c4f83"} Apr 17 14:07:32.169253 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:32.169176 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/561daaa7-247f-4820-939b-375dd242bbf3-original-pull-secret\") pod \"global-pull-secret-syncer-bdklc\" (UID: 
\"561daaa7-247f-4820-939b-375dd242bbf3\") " pod="kube-system/global-pull-secret-syncer-bdklc" Apr 17 14:07:32.169379 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:32.169287 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 14:07:32.169379 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:32.169331 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/561daaa7-247f-4820-939b-375dd242bbf3-original-pull-secret podName:561daaa7-247f-4820-939b-375dd242bbf3 nodeName:}" failed. No retries permitted until 2026-04-17 14:07:33.16931962 +0000 UTC m=+27.067652893 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/561daaa7-247f-4820-939b-375dd242bbf3-original-pull-secret") pod "global-pull-secret-syncer-bdklc" (UID: "561daaa7-247f-4820-939b-375dd242bbf3") : object "kube-system"/"original-pull-secret" not registered Apr 17 14:07:32.709006 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:32.708977 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spfv9" Apr 17 14:07:32.709191 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:32.709089 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7nttt" Apr 17 14:07:32.709191 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:32.709097 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-spfv9" podUID="4a41c39c-5f5e-4ad7-8c82-dac16f59266d" Apr 17 14:07:32.709325 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:32.709198 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7nttt" podUID="86e32c2b-962f-40a8-9171-5be908eeee49" Apr 17 14:07:32.709325 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:32.709245 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bdklc" Apr 17 14:07:32.709399 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:32.709322 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-bdklc" podUID="561daaa7-247f-4820-939b-375dd242bbf3" Apr 17 14:07:33.177317 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:33.177237 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/561daaa7-247f-4820-939b-375dd242bbf3-original-pull-secret\") pod \"global-pull-secret-syncer-bdklc\" (UID: \"561daaa7-247f-4820-939b-375dd242bbf3\") " pod="kube-system/global-pull-secret-syncer-bdklc" Apr 17 14:07:33.177713 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:33.177392 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 14:07:33.177713 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:33.177483 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/561daaa7-247f-4820-939b-375dd242bbf3-original-pull-secret podName:561daaa7-247f-4820-939b-375dd242bbf3 nodeName:}" failed. No retries permitted until 2026-04-17 14:07:35.177459016 +0000 UTC m=+29.075792302 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/561daaa7-247f-4820-939b-375dd242bbf3-original-pull-secret") pod "global-pull-secret-syncer-bdklc" (UID: "561daaa7-247f-4820-939b-375dd242bbf3") : object "kube-system"/"original-pull-secret" not registered Apr 17 14:07:34.708673 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:34.708505 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bdklc" Apr 17 14:07:34.708673 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:34.708536 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spfv9" Apr 17 14:07:34.708673 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:34.708633 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bdklc" podUID="561daaa7-247f-4820-939b-375dd242bbf3" Apr 17 14:07:34.709577 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:34.708745 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spfv9" podUID="4a41c39c-5f5e-4ad7-8c82-dac16f59266d" Apr 17 14:07:34.709577 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:34.708793 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7nttt" Apr 17 14:07:34.709577 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:34.708861 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7nttt" podUID="86e32c2b-962f-40a8-9171-5be908eeee49" Apr 17 14:07:34.812886 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:34.812853 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" event={"ID":"83bdba87-08d8-4d12-9786-7c1ee404890a","Type":"ContainerStarted","Data":"6238baf39fdf714ea95d5a9d1fb7f1b6e40c85ffd6bc0ce8cfa698222bdee586"} Apr 17 14:07:34.813637 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:34.813483 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:34.813637 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:34.813591 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:34.832693 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:34.832498 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:34.833952 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:34.833920 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:34.838718 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:34.838680 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" podStartSLOduration=9.133041923 podStartE2EDuration="28.838666117s" podCreationTimestamp="2026-04-17 14:07:06 +0000 UTC" firstStartedPulling="2026-04-17 14:07:09.535936982 +0000 UTC m=+3.434270254" lastFinishedPulling="2026-04-17 14:07:29.241561176 +0000 UTC m=+23.139894448" observedRunningTime="2026-04-17 14:07:34.836839623 +0000 UTC m=+28.735172929" watchObservedRunningTime="2026-04-17 14:07:34.838666117 +0000 UTC m=+28.736999414" Apr 17 14:07:35.194541 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:35.194472 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/561daaa7-247f-4820-939b-375dd242bbf3-original-pull-secret\") pod \"global-pull-secret-syncer-bdklc\" (UID: \"561daaa7-247f-4820-939b-375dd242bbf3\") " pod="kube-system/global-pull-secret-syncer-bdklc" Apr 17 14:07:35.194677 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:35.194604 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 14:07:35.194677 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:35.194668 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/561daaa7-247f-4820-939b-375dd242bbf3-original-pull-secret podName:561daaa7-247f-4820-939b-375dd242bbf3 nodeName:}" failed. No retries permitted until 2026-04-17 14:07:39.194648977 +0000 UTC m=+33.092982258 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/561daaa7-247f-4820-939b-375dd242bbf3-original-pull-secret") pod "global-pull-secret-syncer-bdklc" (UID: "561daaa7-247f-4820-939b-375dd242bbf3") : object "kube-system"/"original-pull-secret" not registered Apr 17 14:07:35.815167 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:35.815129 2578 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 14:07:36.033742 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:36.033447 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-bdklc"] Apr 17 14:07:36.033742 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:36.033567 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-bdklc" Apr 17 14:07:36.033742 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:36.033672 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bdklc" podUID="561daaa7-247f-4820-939b-375dd242bbf3" Apr 17 14:07:36.040454 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:36.039872 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-spfv9"] Apr 17 14:07:36.040454 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:36.040023 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spfv9" Apr 17 14:07:36.040454 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:36.040186 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spfv9" podUID="4a41c39c-5f5e-4ad7-8c82-dac16f59266d" Apr 17 14:07:36.040454 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:36.040399 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7nttt"] Apr 17 14:07:36.040741 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:36.040563 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7nttt" Apr 17 14:07:36.041820 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:36.040829 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7nttt" podUID="86e32c2b-962f-40a8-9171-5be908eeee49" Apr 17 14:07:36.817552 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:36.817523 2578 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 14:07:37.228871 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:37.228799 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4" Apr 17 14:07:37.709333 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:37.709295 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spfv9" Apr 17 14:07:37.709333 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:37.709322 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bdklc" Apr 17 14:07:37.709575 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:37.709409 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spfv9" podUID="4a41c39c-5f5e-4ad7-8c82-dac16f59266d" Apr 17 14:07:37.709575 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:37.709476 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7nttt" Apr 17 14:07:37.709685 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:37.709584 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7nttt" podUID="86e32c2b-962f-40a8-9171-5be908eeee49" Apr 17 14:07:37.709685 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:37.709652 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bdklc" podUID="561daaa7-247f-4820-939b-375dd242bbf3" Apr 17 14:07:39.228350 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:39.228316 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/561daaa7-247f-4820-939b-375dd242bbf3-original-pull-secret\") pod \"global-pull-secret-syncer-bdklc\" (UID: \"561daaa7-247f-4820-939b-375dd242bbf3\") " pod="kube-system/global-pull-secret-syncer-bdklc" Apr 17 14:07:39.228717 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:39.228448 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 14:07:39.228717 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:39.228494 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/561daaa7-247f-4820-939b-375dd242bbf3-original-pull-secret podName:561daaa7-247f-4820-939b-375dd242bbf3 nodeName:}" failed. 
No retries permitted until 2026-04-17 14:07:47.228481725 +0000 UTC m=+41.126814997 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/561daaa7-247f-4820-939b-375dd242bbf3-original-pull-secret") pod "global-pull-secret-syncer-bdklc" (UID: "561daaa7-247f-4820-939b-375dd242bbf3") : object "kube-system"/"original-pull-secret" not registered Apr 17 14:07:39.709045 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:39.709018 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bdklc" Apr 17 14:07:39.709045 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:39.709040 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spfv9" Apr 17 14:07:39.709214 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:39.709040 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7nttt" Apr 17 14:07:39.709214 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:39.709107 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bdklc" podUID="561daaa7-247f-4820-939b-375dd242bbf3" Apr 17 14:07:39.709214 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:39.709168 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-spfv9" podUID="4a41c39c-5f5e-4ad7-8c82-dac16f59266d" Apr 17 14:07:39.709307 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:39.709254 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7nttt" podUID="86e32c2b-962f-40a8-9171-5be908eeee49" Apr 17 14:07:40.336920 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:40.336763 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e32c2b-962f-40a8-9171-5be908eeee49-metrics-certs\") pod \"network-metrics-daemon-7nttt\" (UID: \"86e32c2b-962f-40a8-9171-5be908eeee49\") " pod="openshift-multus/network-metrics-daemon-7nttt" Apr 17 14:07:40.337357 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:40.336864 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:07:40.337357 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:40.337011 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e32c2b-962f-40a8-9171-5be908eeee49-metrics-certs podName:86e32c2b-962f-40a8-9171-5be908eeee49 nodeName:}" failed. No retries permitted until 2026-04-17 14:08:12.336996633 +0000 UTC m=+66.235329905 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86e32c2b-962f-40a8-9171-5be908eeee49-metrics-certs") pod "network-metrics-daemon-7nttt" (UID: "86e32c2b-962f-40a8-9171-5be908eeee49") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 14:07:40.437760 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:40.437735 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvpgl\" (UniqueName: \"kubernetes.io/projected/4a41c39c-5f5e-4ad7-8c82-dac16f59266d-kube-api-access-kvpgl\") pod \"network-check-target-spfv9\" (UID: \"4a41c39c-5f5e-4ad7-8c82-dac16f59266d\") " pod="openshift-network-diagnostics/network-check-target-spfv9" Apr 17 14:07:40.437869 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:40.437844 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 14:07:40.437869 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:40.437859 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 14:07:40.437869 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:40.437868 2578 projected.go:194] Error preparing data for projected volume kube-api-access-kvpgl for pod openshift-network-diagnostics/network-check-target-spfv9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:07:40.437969 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:40.437912 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a41c39c-5f5e-4ad7-8c82-dac16f59266d-kube-api-access-kvpgl podName:4a41c39c-5f5e-4ad7-8c82-dac16f59266d nodeName:}" failed. 
No retries permitted until 2026-04-17 14:08:12.43790089 +0000 UTC m=+66.336234162 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-kvpgl" (UniqueName: "kubernetes.io/projected/4a41c39c-5f5e-4ad7-8c82-dac16f59266d-kube-api-access-kvpgl") pod "network-check-target-spfv9" (UID: "4a41c39c-5f5e-4ad7-8c82-dac16f59266d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 14:07:40.826334 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:40.826304 2578 generic.go:358] "Generic (PLEG): container finished" podID="01f1152c-e1c7-4d93-ad8f-877078fb7271" containerID="885c388e2d7d57b5ea87b9c7754f853ab7a6564db40e0d2957f6d7ff11336008" exitCode=0 Apr 17 14:07:40.826555 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:40.826368 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6gr7b" event={"ID":"01f1152c-e1c7-4d93-ad8f-877078fb7271","Type":"ContainerDied","Data":"885c388e2d7d57b5ea87b9c7754f853ab7a6564db40e0d2957f6d7ff11336008"} Apr 17 14:07:40.827746 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:40.827717 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" event={"ID":"fd130f1a-180c-45df-b438-589d809223cc","Type":"ContainerStarted","Data":"8c7abd4ef6671b826327618386f775845d91105c0cd68eb0b1cac827b3be6717"} Apr 17 14:07:40.829092 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:40.829065 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-4xvtj" event={"ID":"76ab9c94-30da-41a0-954f-373b613d462f","Type":"ContainerStarted","Data":"d93ec896df2d9abfac17c94e3e0697da61279cda5563fe041d75ab933359727f"} Apr 17 14:07:40.882830 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:40.882773 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/konnectivity-agent-4xvtj" podStartSLOduration=4.281505623 podStartE2EDuration="34.882756744s" podCreationTimestamp="2026-04-17 14:07:06 +0000 UTC" firstStartedPulling="2026-04-17 14:07:09.529062946 +0000 UTC m=+3.427396217" lastFinishedPulling="2026-04-17 14:07:40.130314053 +0000 UTC m=+34.028647338" observedRunningTime="2026-04-17 14:07:40.867619208 +0000 UTC m=+34.765952501" watchObservedRunningTime="2026-04-17 14:07:40.882756744 +0000 UTC m=+34.781090040" Apr 17 14:07:40.883033 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:40.882996 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-dxxnk" podStartSLOduration=4.312390437 podStartE2EDuration="34.88298642s" podCreationTimestamp="2026-04-17 14:07:06 +0000 UTC" firstStartedPulling="2026-04-17 14:07:09.530897746 +0000 UTC m=+3.429231032" lastFinishedPulling="2026-04-17 14:07:40.101493743 +0000 UTC m=+33.999827015" observedRunningTime="2026-04-17 14:07:40.882835036 +0000 UTC m=+34.781168331" watchObservedRunningTime="2026-04-17 14:07:40.88298642 +0000 UTC m=+34.781319716" Apr 17 14:07:41.708455 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:41.708406 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spfv9" Apr 17 14:07:41.708930 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:41.708406 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bdklc" Apr 17 14:07:41.708930 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:41.708541 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7nttt" Apr 17 14:07:41.708930 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:41.708538 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-spfv9" podUID="4a41c39c-5f5e-4ad7-8c82-dac16f59266d" Apr 17 14:07:41.708930 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:41.708612 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bdklc" podUID="561daaa7-247f-4820-939b-375dd242bbf3" Apr 17 14:07:41.708930 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:41.708670 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7nttt" podUID="86e32c2b-962f-40a8-9171-5be908eeee49" Apr 17 14:07:41.833442 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:41.833393 2578 generic.go:358] "Generic (PLEG): container finished" podID="01f1152c-e1c7-4d93-ad8f-877078fb7271" containerID="80703484950b3a4273a32c7b27966985bfbb4a07eb350891a21f2b033db660e2" exitCode=0 Apr 17 14:07:41.833442 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:41.833446 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6gr7b" event={"ID":"01f1152c-e1c7-4d93-ad8f-877078fb7271","Type":"ContainerDied","Data":"80703484950b3a4273a32c7b27966985bfbb4a07eb350891a21f2b033db660e2"} Apr 17 14:07:41.929278 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:41.929205 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-205.ec2.internal" event="NodeReady" Apr 17 14:07:41.929405 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:41.929357 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 14:07:41.961707 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:41.961676 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-698df7f869-z6gnt"] Apr 17 14:07:41.986921 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:41.986895 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6d59c9f569-ksngn"] Apr 17 14:07:41.987073 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:41.987053 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-698df7f869-z6gnt" Apr 17 14:07:41.990212 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:41.990188 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-rq8wc\"" Apr 17 14:07:41.990845 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:41.990396 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 17 14:07:41.990845 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:41.990573 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 17 14:07:41.990845 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:41.990658 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 17 14:07:41.990845 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:41.990703 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 17 14:07:42.007382 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.007363 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86cd74555b-z4hvk"] Apr 17 14:07:42.007513 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.007497 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" Apr 17 14:07:42.010165 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.010147 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 14:07:42.010294 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.010202 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 14:07:42.010412 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.010398 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-cs8cq\"" Apr 17 14:07:42.010489 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.010423 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 14:07:42.017810 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.017779 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 14:07:42.028553 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.028526 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-tcvd8"] Apr 17 14:07:42.028657 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.028630 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86cd74555b-z4hvk" Apr 17 14:07:42.031257 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.031239 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 17 14:07:42.031416 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.031306 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 17 14:07:42.031667 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.031639 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 17 14:07:42.031763 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.031640 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 17 14:07:42.044800 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.044781 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-8998f4874-qrftg"] Apr 17 14:07:42.044927 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.044914 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-tcvd8" Apr 17 14:07:42.047174 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.047158 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-dvsh7\"" Apr 17 14:07:42.047257 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.047202 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 17 14:07:42.047257 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.047213 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 17 14:07:42.062642 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.062615 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-698df7f869-z6gnt"] Apr 17 14:07:42.062724 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.062647 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-4cg5l"] Apr 17 14:07:42.062770 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.062748 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8998f4874-qrftg" Apr 17 14:07:42.065098 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.065084 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 17 14:07:42.083612 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.083593 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-tcvd8"] Apr 17 14:07:42.083696 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.083621 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-8998f4874-qrftg"] Apr 17 14:07:42.083696 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.083636 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86cd74555b-z4hvk"] Apr 17 14:07:42.083696 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.083651 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6d59c9f569-ksngn"] Apr 17 14:07:42.083696 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.083665 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-cb9x6"] Apr 17 14:07:42.083815 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.083735 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-4cg5l" Apr 17 14:07:42.086147 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.086132 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 14:07:42.086238 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.086153 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 14:07:42.086238 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.086195 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fnmsv\"" Apr 17 14:07:42.113180 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.113157 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4cg5l"] Apr 17 14:07:42.113180 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.113185 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cb9x6"] Apr 17 14:07:42.113320 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.113288 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cb9x6" Apr 17 14:07:42.116035 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.116016 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 14:07:42.116035 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.116041 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-nsnr8\"" Apr 17 14:07:42.116214 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.116070 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 14:07:42.116214 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.116140 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 14:07:42.149810 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.149783 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d1c063ea-7517-4b61-9ad5-045a966d6633-trusted-ca\") pod \"image-registry-6d59c9f569-ksngn\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") " pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" Apr 17 14:07:42.149810 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.149809 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-bound-sa-token\") pod \"image-registry-6d59c9f569-ksngn\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") " pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" Apr 17 14:07:42.149941 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.149831 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-j4r9l\" (UniqueName: \"kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-kube-api-access-j4r9l\") pod \"image-registry-6d59c9f569-ksngn\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") " pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" Apr 17 14:07:42.149941 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.149857 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/7a43d970-3485-47bf-8573-7539f592977a-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-86cd74555b-z4hvk\" (UID: \"7a43d970-3485-47bf-8573-7539f592977a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86cd74555b-z4hvk" Apr 17 14:07:42.149941 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.149902 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/dfc1e894-82f8-4269-9f75-b83666e2e572-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-698df7f869-z6gnt\" (UID: \"dfc1e894-82f8-4269-9f75-b83666e2e572\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-698df7f869-z6gnt" Apr 17 14:07:42.150032 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.149945 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2j2s\" (UniqueName: \"kubernetes.io/projected/7a43d970-3485-47bf-8573-7539f592977a-kube-api-access-z2j2s\") pod \"cluster-proxy-proxy-agent-86cd74555b-z4hvk\" (UID: \"7a43d970-3485-47bf-8573-7539f592977a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86cd74555b-z4hvk" Apr 17 14:07:42.150032 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.149976 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" 
(UniqueName: \"kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-registry-tls\") pod \"image-registry-6d59c9f569-ksngn\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") " pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" Apr 17 14:07:42.150032 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.150006 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d1c063ea-7517-4b61-9ad5-045a966d6633-image-registry-private-configuration\") pod \"image-registry-6d59c9f569-ksngn\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") " pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" Apr 17 14:07:42.150032 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.150023 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d1c063ea-7517-4b61-9ad5-045a966d6633-installation-pull-secrets\") pod \"image-registry-6d59c9f569-ksngn\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") " pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" Apr 17 14:07:42.150182 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.150042 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/7a43d970-3485-47bf-8573-7539f592977a-ca\") pod \"cluster-proxy-proxy-agent-86cd74555b-z4hvk\" (UID: \"7a43d970-3485-47bf-8573-7539f592977a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86cd74555b-z4hvk" Apr 17 14:07:42.150182 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.150062 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/7a43d970-3485-47bf-8573-7539f592977a-hub\") pod \"cluster-proxy-proxy-agent-86cd74555b-z4hvk\" (UID: 
\"7a43d970-3485-47bf-8573-7539f592977a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86cd74555b-z4hvk" Apr 17 14:07:42.150182 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.150106 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/deebcc5e-3050-4882-a4ed-99aab7e0f695-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-tcvd8\" (UID: \"deebcc5e-3050-4882-a4ed-99aab7e0f695\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tcvd8" Apr 17 14:07:42.150182 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.150148 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/7a43d970-3485-47bf-8573-7539f592977a-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-86cd74555b-z4hvk\" (UID: \"7a43d970-3485-47bf-8573-7539f592977a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86cd74555b-z4hvk" Apr 17 14:07:42.150324 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.150185 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/7a43d970-3485-47bf-8573-7539f592977a-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-86cd74555b-z4hvk\" (UID: \"7a43d970-3485-47bf-8573-7539f592977a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86cd74555b-z4hvk" Apr 17 14:07:42.150324 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.150214 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7003ffeb-03c7-4dc4-8b72-73c896410c2c-tmp\") pod \"klusterlet-addon-workmgr-8998f4874-qrftg\" (UID: \"7003ffeb-03c7-4dc4-8b72-73c896410c2c\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8998f4874-qrftg" Apr 17 14:07:42.150324 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.150234 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d1c063ea-7517-4b61-9ad5-045a966d6633-ca-trust-extracted\") pod \"image-registry-6d59c9f569-ksngn\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") " pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" Apr 17 14:07:42.150324 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.150248 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8mcp\" (UniqueName: \"kubernetes.io/projected/7003ffeb-03c7-4dc4-8b72-73c896410c2c-kube-api-access-b8mcp\") pod \"klusterlet-addon-workmgr-8998f4874-qrftg\" (UID: \"7003ffeb-03c7-4dc4-8b72-73c896410c2c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8998f4874-qrftg" Apr 17 14:07:42.150324 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.150270 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv7n9\" (UniqueName: \"kubernetes.io/projected/dfc1e894-82f8-4269-9f75-b83666e2e572-kube-api-access-sv7n9\") pod \"managed-serviceaccount-addon-agent-698df7f869-z6gnt\" (UID: \"dfc1e894-82f8-4269-9f75-b83666e2e572\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-698df7f869-z6gnt" Apr 17 14:07:42.150324 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.150311 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d1c063ea-7517-4b61-9ad5-045a966d6633-registry-certificates\") pod \"image-registry-6d59c9f569-ksngn\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") " 
pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" Apr 17 14:07:42.150535 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.150336 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/7003ffeb-03c7-4dc4-8b72-73c896410c2c-klusterlet-config\") pod \"klusterlet-addon-workmgr-8998f4874-qrftg\" (UID: \"7003ffeb-03c7-4dc4-8b72-73c896410c2c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8998f4874-qrftg" Apr 17 14:07:42.150535 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.150365 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/deebcc5e-3050-4882-a4ed-99aab7e0f695-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-tcvd8\" (UID: \"deebcc5e-3050-4882-a4ed-99aab7e0f695\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tcvd8" Apr 17 14:07:42.251619 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.251589 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sv7n9\" (UniqueName: \"kubernetes.io/projected/dfc1e894-82f8-4269-9f75-b83666e2e572-kube-api-access-sv7n9\") pod \"managed-serviceaccount-addon-agent-698df7f869-z6gnt\" (UID: \"dfc1e894-82f8-4269-9f75-b83666e2e572\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-698df7f869-z6gnt" Apr 17 14:07:42.251794 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.251626 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/453f3789-5df6-45c5-bf09-cad968b4e751-tmp-dir\") pod \"dns-default-4cg5l\" (UID: \"453f3789-5df6-45c5-bf09-cad968b4e751\") " pod="openshift-dns/dns-default-4cg5l" Apr 17 14:07:42.251794 ip-10-0-140-205 kubenswrapper[2578]: I0417 
14:07:42.251653 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d1c063ea-7517-4b61-9ad5-045a966d6633-registry-certificates\") pod \"image-registry-6d59c9f569-ksngn\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") " pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" Apr 17 14:07:42.251794 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.251708 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/7003ffeb-03c7-4dc4-8b72-73c896410c2c-klusterlet-config\") pod \"klusterlet-addon-workmgr-8998f4874-qrftg\" (UID: \"7003ffeb-03c7-4dc4-8b72-73c896410c2c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8998f4874-qrftg" Apr 17 14:07:42.251794 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.251732 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/deebcc5e-3050-4882-a4ed-99aab7e0f695-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-tcvd8\" (UID: \"deebcc5e-3050-4882-a4ed-99aab7e0f695\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tcvd8" Apr 17 14:07:42.252004 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:42.251802 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 14:07:42.252004 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.251842 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrq9m\" (UniqueName: \"kubernetes.io/projected/453f3789-5df6-45c5-bf09-cad968b4e751-kube-api-access-lrq9m\") pod \"dns-default-4cg5l\" (UID: \"453f3789-5df6-45c5-bf09-cad968b4e751\") " pod="openshift-dns/dns-default-4cg5l" Apr 17 14:07:42.252004 
ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:42.251856 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deebcc5e-3050-4882-a4ed-99aab7e0f695-networking-console-plugin-cert podName:deebcc5e-3050-4882-a4ed-99aab7e0f695 nodeName:}" failed. No retries permitted until 2026-04-17 14:07:42.751839032 +0000 UTC m=+36.650172310 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/deebcc5e-3050-4882-a4ed-99aab7e0f695-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-tcvd8" (UID: "deebcc5e-3050-4882-a4ed-99aab7e0f695") : secret "networking-console-plugin-cert" not found Apr 17 14:07:42.252004 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.251884 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/453f3789-5df6-45c5-bf09-cad968b4e751-config-volume\") pod \"dns-default-4cg5l\" (UID: \"453f3789-5df6-45c5-bf09-cad968b4e751\") " pod="openshift-dns/dns-default-4cg5l" Apr 17 14:07:42.252004 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.251914 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2j2s\" (UniqueName: \"kubernetes.io/projected/7a43d970-3485-47bf-8573-7539f592977a-kube-api-access-z2j2s\") pod \"cluster-proxy-proxy-agent-86cd74555b-z4hvk\" (UID: \"7a43d970-3485-47bf-8573-7539f592977a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86cd74555b-z4hvk" Apr 17 14:07:42.252004 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.251950 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-registry-tls\") pod \"image-registry-6d59c9f569-ksngn\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") " 
pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" Apr 17 14:07:42.252004 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.251979 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d1c063ea-7517-4b61-9ad5-045a966d6633-installation-pull-secrets\") pod \"image-registry-6d59c9f569-ksngn\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") " pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" Apr 17 14:07:42.252004 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.252003 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/7a43d970-3485-47bf-8573-7539f592977a-ca\") pod \"cluster-proxy-proxy-agent-86cd74555b-z4hvk\" (UID: \"7a43d970-3485-47bf-8573-7539f592977a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86cd74555b-z4hvk" Apr 17 14:07:42.252937 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.252030 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8mcp\" (UniqueName: \"kubernetes.io/projected/7003ffeb-03c7-4dc4-8b72-73c896410c2c-kube-api-access-b8mcp\") pod \"klusterlet-addon-workmgr-8998f4874-qrftg\" (UID: \"7003ffeb-03c7-4dc4-8b72-73c896410c2c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8998f4874-qrftg" Apr 17 14:07:42.252937 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:42.252082 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 14:07:42.252937 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:42.252102 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d59c9f569-ksngn: secret "image-registry-tls" not found Apr 17 14:07:42.252937 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.252117 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7vm5\" (UniqueName: \"kubernetes.io/projected/63e2cd45-279f-4653-a7e2-d58678593e5d-kube-api-access-w7vm5\") pod \"ingress-canary-cb9x6\" (UID: \"63e2cd45-279f-4653-a7e2-d58678593e5d\") " pod="openshift-ingress-canary/ingress-canary-cb9x6" Apr 17 14:07:42.252937 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:42.252148 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-registry-tls podName:d1c063ea-7517-4b61-9ad5-045a966d6633 nodeName:}" failed. No retries permitted until 2026-04-17 14:07:42.752131712 +0000 UTC m=+36.650464983 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-registry-tls") pod "image-registry-6d59c9f569-ksngn" (UID: "d1c063ea-7517-4b61-9ad5-045a966d6633") : secret "image-registry-tls" not found Apr 17 14:07:42.252937 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.252202 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/7a43d970-3485-47bf-8573-7539f592977a-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-86cd74555b-z4hvk\" (UID: \"7a43d970-3485-47bf-8573-7539f592977a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86cd74555b-z4hvk" Apr 17 14:07:42.252937 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.252229 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7003ffeb-03c7-4dc4-8b72-73c896410c2c-tmp\") pod \"klusterlet-addon-workmgr-8998f4874-qrftg\" (UID: \"7003ffeb-03c7-4dc4-8b72-73c896410c2c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8998f4874-qrftg" Apr 17 14:07:42.252937 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.252263 
2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/63e2cd45-279f-4653-a7e2-d58678593e5d-cert\") pod \"ingress-canary-cb9x6\" (UID: \"63e2cd45-279f-4653-a7e2-d58678593e5d\") " pod="openshift-ingress-canary/ingress-canary-cb9x6" Apr 17 14:07:42.252937 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.252293 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d1c063ea-7517-4b61-9ad5-045a966d6633-registry-certificates\") pod \"image-registry-6d59c9f569-ksngn\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") " pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" Apr 17 14:07:42.252937 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.252310 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d1c063ea-7517-4b61-9ad5-045a966d6633-trusted-ca\") pod \"image-registry-6d59c9f569-ksngn\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") " pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" Apr 17 14:07:42.252937 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.252334 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4r9l\" (UniqueName: \"kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-kube-api-access-j4r9l\") pod \"image-registry-6d59c9f569-ksngn\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") " pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" Apr 17 14:07:42.252937 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.252359 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/7a43d970-3485-47bf-8573-7539f592977a-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-86cd74555b-z4hvk\" (UID: 
\"7a43d970-3485-47bf-8573-7539f592977a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86cd74555b-z4hvk" Apr 17 14:07:42.252937 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.252386 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-bound-sa-token\") pod \"image-registry-6d59c9f569-ksngn\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") " pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" Apr 17 14:07:42.252937 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.252411 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/dfc1e894-82f8-4269-9f75-b83666e2e572-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-698df7f869-z6gnt\" (UID: \"dfc1e894-82f8-4269-9f75-b83666e2e572\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-698df7f869-z6gnt" Apr 17 14:07:42.252937 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.252470 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/deebcc5e-3050-4882-a4ed-99aab7e0f695-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-tcvd8\" (UID: \"deebcc5e-3050-4882-a4ed-99aab7e0f695\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tcvd8" Apr 17 14:07:42.252937 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.252516 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d1c063ea-7517-4b61-9ad5-045a966d6633-image-registry-private-configuration\") pod \"image-registry-6d59c9f569-ksngn\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") " pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" Apr 17 14:07:42.254025 ip-10-0-140-205 
kubenswrapper[2578]: I0417 14:07:42.252545 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/7a43d970-3485-47bf-8573-7539f592977a-hub\") pod \"cluster-proxy-proxy-agent-86cd74555b-z4hvk\" (UID: \"7a43d970-3485-47bf-8573-7539f592977a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86cd74555b-z4hvk" Apr 17 14:07:42.254025 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.252572 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/7a43d970-3485-47bf-8573-7539f592977a-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-86cd74555b-z4hvk\" (UID: \"7a43d970-3485-47bf-8573-7539f592977a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86cd74555b-z4hvk" Apr 17 14:07:42.254025 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.252602 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d1c063ea-7517-4b61-9ad5-045a966d6633-ca-trust-extracted\") pod \"image-registry-6d59c9f569-ksngn\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") " pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" Apr 17 14:07:42.254025 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.252625 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/453f3789-5df6-45c5-bf09-cad968b4e751-metrics-tls\") pod \"dns-default-4cg5l\" (UID: \"453f3789-5df6-45c5-bf09-cad968b4e751\") " pod="openshift-dns/dns-default-4cg5l" Apr 17 14:07:42.254025 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.252625 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7003ffeb-03c7-4dc4-8b72-73c896410c2c-tmp\") pod \"klusterlet-addon-workmgr-8998f4874-qrftg\" (UID: 
\"7003ffeb-03c7-4dc4-8b72-73c896410c2c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8998f4874-qrftg" Apr 17 14:07:42.254025 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.253103 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d1c063ea-7517-4b61-9ad5-045a966d6633-trusted-ca\") pod \"image-registry-6d59c9f569-ksngn\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") " pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" Apr 17 14:07:42.254025 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.253921 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/7a43d970-3485-47bf-8573-7539f592977a-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-86cd74555b-z4hvk\" (UID: \"7a43d970-3485-47bf-8573-7539f592977a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86cd74555b-z4hvk" Apr 17 14:07:42.255109 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.254982 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d1c063ea-7517-4b61-9ad5-045a966d6633-ca-trust-extracted\") pod \"image-registry-6d59c9f569-ksngn\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") " pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" Apr 17 14:07:42.256533 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.256499 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d1c063ea-7517-4b61-9ad5-045a966d6633-image-registry-private-configuration\") pod \"image-registry-6d59c9f569-ksngn\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") " pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" Apr 17 14:07:42.256644 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.256548 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d1c063ea-7517-4b61-9ad5-045a966d6633-installation-pull-secrets\") pod \"image-registry-6d59c9f569-ksngn\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") " pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" Apr 17 14:07:42.256644 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.256582 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/7a43d970-3485-47bf-8573-7539f592977a-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-86cd74555b-z4hvk\" (UID: \"7a43d970-3485-47bf-8573-7539f592977a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86cd74555b-z4hvk" Apr 17 14:07:42.256644 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.256591 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/7a43d970-3485-47bf-8573-7539f592977a-ca\") pod \"cluster-proxy-proxy-agent-86cd74555b-z4hvk\" (UID: \"7a43d970-3485-47bf-8573-7539f592977a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86cd74555b-z4hvk" Apr 17 14:07:42.256751 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.256696 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/7003ffeb-03c7-4dc4-8b72-73c896410c2c-klusterlet-config\") pod \"klusterlet-addon-workmgr-8998f4874-qrftg\" (UID: \"7003ffeb-03c7-4dc4-8b72-73c896410c2c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8998f4874-qrftg" Apr 17 14:07:42.257085 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.257060 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/7a43d970-3485-47bf-8573-7539f592977a-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-86cd74555b-z4hvk\" (UID: 
\"7a43d970-3485-47bf-8573-7539f592977a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86cd74555b-z4hvk" Apr 17 14:07:42.257787 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.257771 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/7a43d970-3485-47bf-8573-7539f592977a-hub\") pod \"cluster-proxy-proxy-agent-86cd74555b-z4hvk\" (UID: \"7a43d970-3485-47bf-8573-7539f592977a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86cd74555b-z4hvk" Apr 17 14:07:42.260701 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.260670 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/deebcc5e-3050-4882-a4ed-99aab7e0f695-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-tcvd8\" (UID: \"deebcc5e-3050-4882-a4ed-99aab7e0f695\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tcvd8" Apr 17 14:07:42.260820 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.260803 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4r9l\" (UniqueName: \"kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-kube-api-access-j4r9l\") pod \"image-registry-6d59c9f569-ksngn\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") " pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" Apr 17 14:07:42.260932 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.260913 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-bound-sa-token\") pod \"image-registry-6d59c9f569-ksngn\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") " pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" Apr 17 14:07:42.261801 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.261775 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-z2j2s\" (UniqueName: \"kubernetes.io/projected/7a43d970-3485-47bf-8573-7539f592977a-kube-api-access-z2j2s\") pod \"cluster-proxy-proxy-agent-86cd74555b-z4hvk\" (UID: \"7a43d970-3485-47bf-8573-7539f592977a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86cd74555b-z4hvk" Apr 17 14:07:42.261878 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.261803 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8mcp\" (UniqueName: \"kubernetes.io/projected/7003ffeb-03c7-4dc4-8b72-73c896410c2c-kube-api-access-b8mcp\") pod \"klusterlet-addon-workmgr-8998f4874-qrftg\" (UID: \"7003ffeb-03c7-4dc4-8b72-73c896410c2c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8998f4874-qrftg" Apr 17 14:07:42.266735 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.266715 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv7n9\" (UniqueName: \"kubernetes.io/projected/dfc1e894-82f8-4269-9f75-b83666e2e572-kube-api-access-sv7n9\") pod \"managed-serviceaccount-addon-agent-698df7f869-z6gnt\" (UID: \"dfc1e894-82f8-4269-9f75-b83666e2e572\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-698df7f869-z6gnt" Apr 17 14:07:42.268062 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.268043 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/dfc1e894-82f8-4269-9f75-b83666e2e572-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-698df7f869-z6gnt\" (UID: \"dfc1e894-82f8-4269-9f75-b83666e2e572\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-698df7f869-z6gnt" Apr 17 14:07:42.308494 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.308465 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-698df7f869-z6gnt" Apr 17 14:07:42.336403 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.336376 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86cd74555b-z4hvk" Apr 17 14:07:42.353199 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.353175 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/63e2cd45-279f-4653-a7e2-d58678593e5d-cert\") pod \"ingress-canary-cb9x6\" (UID: \"63e2cd45-279f-4653-a7e2-d58678593e5d\") " pod="openshift-ingress-canary/ingress-canary-cb9x6" Apr 17 14:07:42.353312 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.353238 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/453f3789-5df6-45c5-bf09-cad968b4e751-metrics-tls\") pod \"dns-default-4cg5l\" (UID: \"453f3789-5df6-45c5-bf09-cad968b4e751\") " pod="openshift-dns/dns-default-4cg5l" Apr 17 14:07:42.353312 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.353265 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/453f3789-5df6-45c5-bf09-cad968b4e751-tmp-dir\") pod \"dns-default-4cg5l\" (UID: \"453f3789-5df6-45c5-bf09-cad968b4e751\") " pod="openshift-dns/dns-default-4cg5l" Apr 17 14:07:42.353451 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.353316 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrq9m\" (UniqueName: \"kubernetes.io/projected/453f3789-5df6-45c5-bf09-cad968b4e751-kube-api-access-lrq9m\") pod \"dns-default-4cg5l\" (UID: \"453f3789-5df6-45c5-bf09-cad968b4e751\") " pod="openshift-dns/dns-default-4cg5l" Apr 17 14:07:42.353451 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.353345 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/453f3789-5df6-45c5-bf09-cad968b4e751-config-volume\") pod \"dns-default-4cg5l\" (UID: \"453f3789-5df6-45c5-bf09-cad968b4e751\") " pod="openshift-dns/dns-default-4cg5l" Apr 17 14:07:42.353451 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:42.353362 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:07:42.353451 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.353378 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w7vm5\" (UniqueName: \"kubernetes.io/projected/63e2cd45-279f-4653-a7e2-d58678593e5d-kube-api-access-w7vm5\") pod \"ingress-canary-cb9x6\" (UID: \"63e2cd45-279f-4653-a7e2-d58678593e5d\") " pod="openshift-ingress-canary/ingress-canary-cb9x6" Apr 17 14:07:42.353647 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:42.353468 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63e2cd45-279f-4653-a7e2-d58678593e5d-cert podName:63e2cd45-279f-4653-a7e2-d58678593e5d nodeName:}" failed. No retries permitted until 2026-04-17 14:07:42.853445352 +0000 UTC m=+36.751778640 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/63e2cd45-279f-4653-a7e2-d58678593e5d-cert") pod "ingress-canary-cb9x6" (UID: "63e2cd45-279f-4653-a7e2-d58678593e5d") : secret "canary-serving-cert" not found Apr 17 14:07:42.353647 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:42.353619 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:07:42.353789 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:42.353690 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/453f3789-5df6-45c5-bf09-cad968b4e751-metrics-tls podName:453f3789-5df6-45c5-bf09-cad968b4e751 nodeName:}" failed. No retries permitted until 2026-04-17 14:07:42.85367151 +0000 UTC m=+36.752004789 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/453f3789-5df6-45c5-bf09-cad968b4e751-metrics-tls") pod "dns-default-4cg5l" (UID: "453f3789-5df6-45c5-bf09-cad968b4e751") : secret "dns-default-metrics-tls" not found Apr 17 14:07:42.353789 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.353761 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/453f3789-5df6-45c5-bf09-cad968b4e751-tmp-dir\") pod \"dns-default-4cg5l\" (UID: \"453f3789-5df6-45c5-bf09-cad968b4e751\") " pod="openshift-dns/dns-default-4cg5l" Apr 17 14:07:42.354050 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.354026 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/453f3789-5df6-45c5-bf09-cad968b4e751-config-volume\") pod \"dns-default-4cg5l\" (UID: \"453f3789-5df6-45c5-bf09-cad968b4e751\") " pod="openshift-dns/dns-default-4cg5l" Apr 17 14:07:42.361406 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.361386 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-lrq9m\" (UniqueName: \"kubernetes.io/projected/453f3789-5df6-45c5-bf09-cad968b4e751-kube-api-access-lrq9m\") pod \"dns-default-4cg5l\" (UID: \"453f3789-5df6-45c5-bf09-cad968b4e751\") " pod="openshift-dns/dns-default-4cg5l" Apr 17 14:07:42.361506 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.361496 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7vm5\" (UniqueName: \"kubernetes.io/projected/63e2cd45-279f-4653-a7e2-d58678593e5d-kube-api-access-w7vm5\") pod \"ingress-canary-cb9x6\" (UID: \"63e2cd45-279f-4653-a7e2-d58678593e5d\") " pod="openshift-ingress-canary/ingress-canary-cb9x6" Apr 17 14:07:42.371852 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.371514 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8998f4874-qrftg" Apr 17 14:07:42.467464 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.467394 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-698df7f869-z6gnt"] Apr 17 14:07:42.482712 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.482672 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86cd74555b-z4hvk"] Apr 17 14:07:42.485747 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:42.485725 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a43d970_3485_47bf_8573_7539f592977a.slice/crio-46ccac4ca71e25e1ce2ab1fdb416518e71031c3254d0dfe5929b0351a583fa59 WatchSource:0}: Error finding container 46ccac4ca71e25e1ce2ab1fdb416518e71031c3254d0dfe5929b0351a583fa59: Status 404 returned error can't find the container with id 46ccac4ca71e25e1ce2ab1fdb416518e71031c3254d0dfe5929b0351a583fa59 Apr 17 14:07:42.517519 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.517394 2578 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-8998f4874-qrftg"] Apr 17 14:07:42.520510 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:42.520489 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7003ffeb_03c7_4dc4_8b72_73c896410c2c.slice/crio-8a8759da09d9291ac87568733f1fc01ddde4efbb4a6c642f21925de0f679b13e WatchSource:0}: Error finding container 8a8759da09d9291ac87568733f1fc01ddde4efbb4a6c642f21925de0f679b13e: Status 404 returned error can't find the container with id 8a8759da09d9291ac87568733f1fc01ddde4efbb4a6c642f21925de0f679b13e Apr 17 14:07:42.755535 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.755494 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/deebcc5e-3050-4882-a4ed-99aab7e0f695-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-tcvd8\" (UID: \"deebcc5e-3050-4882-a4ed-99aab7e0f695\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tcvd8" Apr 17 14:07:42.756035 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.755542 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-registry-tls\") pod \"image-registry-6d59c9f569-ksngn\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") " pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" Apr 17 14:07:42.756035 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:42.755640 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 14:07:42.756035 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:42.755656 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: 
secret "image-registry-tls" not found Apr 17 14:07:42.756035 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:42.755673 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d59c9f569-ksngn: secret "image-registry-tls" not found Apr 17 14:07:42.756035 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:42.755701 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deebcc5e-3050-4882-a4ed-99aab7e0f695-networking-console-plugin-cert podName:deebcc5e-3050-4882-a4ed-99aab7e0f695 nodeName:}" failed. No retries permitted until 2026-04-17 14:07:43.755685932 +0000 UTC m=+37.654019204 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/deebcc5e-3050-4882-a4ed-99aab7e0f695-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-tcvd8" (UID: "deebcc5e-3050-4882-a4ed-99aab7e0f695") : secret "networking-console-plugin-cert" not found Apr 17 14:07:42.756035 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:42.755732 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-registry-tls podName:d1c063ea-7517-4b61-9ad5-045a966d6633 nodeName:}" failed. No retries permitted until 2026-04-17 14:07:43.755716329 +0000 UTC m=+37.654049601 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-registry-tls") pod "image-registry-6d59c9f569-ksngn" (UID: "d1c063ea-7517-4b61-9ad5-045a966d6633") : secret "image-registry-tls" not found Apr 17 14:07:42.839308 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.839237 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6gr7b" event={"ID":"01f1152c-e1c7-4d93-ad8f-877078fb7271","Type":"ContainerStarted","Data":"e5407eb97d2fa91b7519022d229a5a96b1c33ff5411dd03267cb8680700255f6"} Apr 17 14:07:42.840256 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.840226 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8998f4874-qrftg" event={"ID":"7003ffeb-03c7-4dc4-8b72-73c896410c2c","Type":"ContainerStarted","Data":"8a8759da09d9291ac87568733f1fc01ddde4efbb4a6c642f21925de0f679b13e"} Apr 17 14:07:42.841133 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.841114 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86cd74555b-z4hvk" event={"ID":"7a43d970-3485-47bf-8573-7539f592977a","Type":"ContainerStarted","Data":"46ccac4ca71e25e1ce2ab1fdb416518e71031c3254d0dfe5929b0351a583fa59"} Apr 17 14:07:42.842012 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.841993 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-698df7f869-z6gnt" event={"ID":"dfc1e894-82f8-4269-9f75-b83666e2e572","Type":"ContainerStarted","Data":"501ab52cf475f72056211b6ace9c5a557b3c96c0153086061ed04de88cd86ecb"} Apr 17 14:07:42.856030 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.856003 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/63e2cd45-279f-4653-a7e2-d58678593e5d-cert\") pod \"ingress-canary-cb9x6\" (UID: \"63e2cd45-279f-4653-a7e2-d58678593e5d\") " pod="openshift-ingress-canary/ingress-canary-cb9x6" Apr 17 14:07:42.856122 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.856070 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/453f3789-5df6-45c5-bf09-cad968b4e751-metrics-tls\") pod \"dns-default-4cg5l\" (UID: \"453f3789-5df6-45c5-bf09-cad968b4e751\") " pod="openshift-dns/dns-default-4cg5l" Apr 17 14:07:42.856170 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:42.856146 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:07:42.856170 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:42.856161 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:07:42.856250 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:42.856199 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63e2cd45-279f-4653-a7e2-d58678593e5d-cert podName:63e2cd45-279f-4653-a7e2-d58678593e5d nodeName:}" failed. No retries permitted until 2026-04-17 14:07:43.856186804 +0000 UTC m=+37.754520076 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/63e2cd45-279f-4653-a7e2-d58678593e5d-cert") pod "ingress-canary-cb9x6" (UID: "63e2cd45-279f-4653-a7e2-d58678593e5d") : secret "canary-serving-cert" not found Apr 17 14:07:42.856250 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:42.856214 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/453f3789-5df6-45c5-bf09-cad968b4e751-metrics-tls podName:453f3789-5df6-45c5-bf09-cad968b4e751 nodeName:}" failed. 
No retries permitted until 2026-04-17 14:07:43.856208531 +0000 UTC m=+37.754541802 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/453f3789-5df6-45c5-bf09-cad968b4e751-metrics-tls") pod "dns-default-4cg5l" (UID: "453f3789-5df6-45c5-bf09-cad968b4e751") : secret "dns-default-metrics-tls" not found Apr 17 14:07:42.861040 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.861005 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6gr7b" podStartSLOduration=6.260979681 podStartE2EDuration="36.860993632s" podCreationTimestamp="2026-04-17 14:07:06 +0000 UTC" firstStartedPulling="2026-04-17 14:07:09.526567868 +0000 UTC m=+3.424901143" lastFinishedPulling="2026-04-17 14:07:40.126581818 +0000 UTC m=+34.024915094" observedRunningTime="2026-04-17 14:07:42.859320053 +0000 UTC m=+36.757653346" watchObservedRunningTime="2026-04-17 14:07:42.860993632 +0000 UTC m=+36.759326925" Apr 17 14:07:42.991603 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.991570 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-4xvtj" Apr 17 14:07:42.992124 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:42.992106 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-4xvtj" Apr 17 14:07:43.709886 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:43.708625 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spfv9" Apr 17 14:07:43.709886 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:43.709105 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7nttt" Apr 17 14:07:43.709886 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:43.709618 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-bdklc" Apr 17 14:07:43.712109 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:43.711941 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 14:07:43.712724 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:43.712609 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 14:07:43.714761 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:43.714552 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 14:07:43.714761 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:43.714622 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bfspf\"" Apr 17 14:07:43.714907 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:43.714864 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 14:07:43.715066 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:43.715050 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-x9vkl\"" Apr 17 14:07:43.765822 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:43.765791 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/deebcc5e-3050-4882-a4ed-99aab7e0f695-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-tcvd8\" (UID: \"deebcc5e-3050-4882-a4ed-99aab7e0f695\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tcvd8" Apr 17 14:07:43.766243 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:43.765863 2578 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-registry-tls\") pod \"image-registry-6d59c9f569-ksngn\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") " pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" Apr 17 14:07:43.766243 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:43.766007 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 14:07:43.766243 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:43.766023 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d59c9f569-ksngn: secret "image-registry-tls" not found Apr 17 14:07:43.766243 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:43.766074 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-registry-tls podName:d1c063ea-7517-4b61-9ad5-045a966d6633 nodeName:}" failed. No retries permitted until 2026-04-17 14:07:45.766057508 +0000 UTC m=+39.664390788 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-registry-tls") pod "image-registry-6d59c9f569-ksngn" (UID: "d1c063ea-7517-4b61-9ad5-045a966d6633") : secret "image-registry-tls" not found Apr 17 14:07:43.766475 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:43.766437 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 14:07:43.766540 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:43.766496 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deebcc5e-3050-4882-a4ed-99aab7e0f695-networking-console-plugin-cert podName:deebcc5e-3050-4882-a4ed-99aab7e0f695 nodeName:}" failed. 
No retries permitted until 2026-04-17 14:07:45.766479715 +0000 UTC m=+39.664812992 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/deebcc5e-3050-4882-a4ed-99aab7e0f695-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-tcvd8" (UID: "deebcc5e-3050-4882-a4ed-99aab7e0f695") : secret "networking-console-plugin-cert" not found Apr 17 14:07:43.846491 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:43.846089 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-4xvtj" Apr 17 14:07:43.847018 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:43.846859 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-4xvtj" Apr 17 14:07:43.868281 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:43.867074 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/63e2cd45-279f-4653-a7e2-d58678593e5d-cert\") pod \"ingress-canary-cb9x6\" (UID: \"63e2cd45-279f-4653-a7e2-d58678593e5d\") " pod="openshift-ingress-canary/ingress-canary-cb9x6" Apr 17 14:07:43.868281 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:43.867158 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/453f3789-5df6-45c5-bf09-cad968b4e751-metrics-tls\") pod \"dns-default-4cg5l\" (UID: \"453f3789-5df6-45c5-bf09-cad968b4e751\") " pod="openshift-dns/dns-default-4cg5l" Apr 17 14:07:43.868281 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:43.867821 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 14:07:43.868281 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:43.867878 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/63e2cd45-279f-4653-a7e2-d58678593e5d-cert podName:63e2cd45-279f-4653-a7e2-d58678593e5d nodeName:}" failed. No retries permitted until 2026-04-17 14:07:45.867857034 +0000 UTC m=+39.766190310 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/63e2cd45-279f-4653-a7e2-d58678593e5d-cert") pod "ingress-canary-cb9x6" (UID: "63e2cd45-279f-4653-a7e2-d58678593e5d") : secret "canary-serving-cert" not found Apr 17 14:07:43.868281 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:43.867943 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 14:07:43.868281 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:43.867978 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/453f3789-5df6-45c5-bf09-cad968b4e751-metrics-tls podName:453f3789-5df6-45c5-bf09-cad968b4e751 nodeName:}" failed. No retries permitted until 2026-04-17 14:07:45.867966714 +0000 UTC m=+39.766299986 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/453f3789-5df6-45c5-bf09-cad968b4e751-metrics-tls") pod "dns-default-4cg5l" (UID: "453f3789-5df6-45c5-bf09-cad968b4e751") : secret "dns-default-metrics-tls" not found Apr 17 14:07:45.785951 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:45.785915 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/deebcc5e-3050-4882-a4ed-99aab7e0f695-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-tcvd8\" (UID: \"deebcc5e-3050-4882-a4ed-99aab7e0f695\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tcvd8" Apr 17 14:07:45.786339 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:45.785964 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-registry-tls\") pod \"image-registry-6d59c9f569-ksngn\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") " pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" Apr 17 14:07:45.786339 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:45.786106 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 14:07:45.786339 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:45.786123 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 14:07:45.786339 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:45.786138 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d59c9f569-ksngn: secret "image-registry-tls" not found Apr 17 14:07:45.786339 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:45.786181 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-registry-tls podName:d1c063ea-7517-4b61-9ad5-045a966d6633 nodeName:}" failed. No retries permitted until 2026-04-17 14:07:49.786162896 +0000 UTC m=+43.684496171 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-registry-tls") pod "image-registry-6d59c9f569-ksngn" (UID: "d1c063ea-7517-4b61-9ad5-045a966d6633") : secret "image-registry-tls" not found Apr 17 14:07:45.786339 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:45.786197 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deebcc5e-3050-4882-a4ed-99aab7e0f695-networking-console-plugin-cert podName:deebcc5e-3050-4882-a4ed-99aab7e0f695 nodeName:}" failed. No retries permitted until 2026-04-17 14:07:49.786189143 +0000 UTC m=+43.684522428 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/deebcc5e-3050-4882-a4ed-99aab7e0f695-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-tcvd8" (UID: "deebcc5e-3050-4882-a4ed-99aab7e0f695") : secret "networking-console-plugin-cert" not found Apr 17 14:07:45.887174 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:45.887138 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/63e2cd45-279f-4653-a7e2-d58678593e5d-cert\") pod \"ingress-canary-cb9x6\" (UID: \"63e2cd45-279f-4653-a7e2-d58678593e5d\") " pod="openshift-ingress-canary/ingress-canary-cb9x6" Apr 17 14:07:45.887341 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:45.887220 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/453f3789-5df6-45c5-bf09-cad968b4e751-metrics-tls\") pod \"dns-default-4cg5l\" (UID: \"453f3789-5df6-45c5-bf09-cad968b4e751\") " 
pod="openshift-dns/dns-default-4cg5l"
Apr 17 14:07:45.887341 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:45.887295 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 14:07:45.887483 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:45.887355 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63e2cd45-279f-4653-a7e2-d58678593e5d-cert podName:63e2cd45-279f-4653-a7e2-d58678593e5d nodeName:}" failed. No retries permitted until 2026-04-17 14:07:49.887337512 +0000 UTC m=+43.785670798 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/63e2cd45-279f-4653-a7e2-d58678593e5d-cert") pod "ingress-canary-cb9x6" (UID: "63e2cd45-279f-4653-a7e2-d58678593e5d") : secret "canary-serving-cert" not found
Apr 17 14:07:45.887483 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:45.887371 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 14:07:45.887483 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:45.887404 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/453f3789-5df6-45c5-bf09-cad968b4e751-metrics-tls podName:453f3789-5df6-45c5-bf09-cad968b4e751 nodeName:}" failed. No retries permitted until 2026-04-17 14:07:49.887393881 +0000 UTC m=+43.785727154 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/453f3789-5df6-45c5-bf09-cad968b4e751-metrics-tls") pod "dns-default-4cg5l" (UID: "453f3789-5df6-45c5-bf09-cad968b4e751") : secret "dns-default-metrics-tls" not found
Apr 17 14:07:47.298753 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:47.298715 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/561daaa7-247f-4820-939b-375dd242bbf3-original-pull-secret\") pod \"global-pull-secret-syncer-bdklc\" (UID: \"561daaa7-247f-4820-939b-375dd242bbf3\") " pod="kube-system/global-pull-secret-syncer-bdklc"
Apr 17 14:07:47.302710 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:47.302679 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/561daaa7-247f-4820-939b-375dd242bbf3-original-pull-secret\") pod \"global-pull-secret-syncer-bdklc\" (UID: \"561daaa7-247f-4820-939b-375dd242bbf3\") " pod="kube-system/global-pull-secret-syncer-bdklc"
Apr 17 14:07:47.345006 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:47.344969 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bdklc"
Apr 17 14:07:48.984169 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:48.983487 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-bdklc"]
Apr 17 14:07:48.988523 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:07:48.988493 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod561daaa7_247f_4820_939b_375dd242bbf3.slice/crio-184027fcdc78f0c220004440ba0ba529b82eb7f14b7519007a833d1c5170762b WatchSource:0}: Error finding container 184027fcdc78f0c220004440ba0ba529b82eb7f14b7519007a833d1c5170762b: Status 404 returned error can't find the container with id 184027fcdc78f0c220004440ba0ba529b82eb7f14b7519007a833d1c5170762b
Apr 17 14:07:49.817798 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:49.817750 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/deebcc5e-3050-4882-a4ed-99aab7e0f695-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-tcvd8\" (UID: \"deebcc5e-3050-4882-a4ed-99aab7e0f695\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tcvd8"
Apr 17 14:07:49.817798 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:49.817801 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-registry-tls\") pod \"image-registry-6d59c9f569-ksngn\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") " pod="openshift-image-registry/image-registry-6d59c9f569-ksngn"
Apr 17 14:07:49.818045 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:49.817909 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 14:07:49.818045 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:49.817925 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d59c9f569-ksngn: secret "image-registry-tls" not found
Apr 17 14:07:49.818045 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:49.817980 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-registry-tls podName:d1c063ea-7517-4b61-9ad5-045a966d6633 nodeName:}" failed. No retries permitted until 2026-04-17 14:07:57.817962092 +0000 UTC m=+51.716295379 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-registry-tls") pod "image-registry-6d59c9f569-ksngn" (UID: "d1c063ea-7517-4b61-9ad5-045a966d6633") : secret "image-registry-tls" not found
Apr 17 14:07:49.818045 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:49.817911 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 14:07:49.818045 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:49.818044 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deebcc5e-3050-4882-a4ed-99aab7e0f695-networking-console-plugin-cert podName:deebcc5e-3050-4882-a4ed-99aab7e0f695 nodeName:}" failed. No retries permitted until 2026-04-17 14:07:57.818026633 +0000 UTC m=+51.716359910 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/deebcc5e-3050-4882-a4ed-99aab7e0f695-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-tcvd8" (UID: "deebcc5e-3050-4882-a4ed-99aab7e0f695") : secret "networking-console-plugin-cert" not found
Apr 17 14:07:49.857182 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:49.857146 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8998f4874-qrftg" event={"ID":"7003ffeb-03c7-4dc4-8b72-73c896410c2c","Type":"ContainerStarted","Data":"6c22c3e84d02c3856e0bdd99de5d4fea0ff35325d1f2413414ee89ccd00af8cf"}
Apr 17 14:07:49.857827 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:49.857804 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8998f4874-qrftg"
Apr 17 14:07:49.859444 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:49.859401 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86cd74555b-z4hvk" event={"ID":"7a43d970-3485-47bf-8573-7539f592977a","Type":"ContainerStarted","Data":"1be124f5931258d260e444d2bbba810e358dd9da1dd1c43a77fc6e13385768c4"}
Apr 17 14:07:49.859623 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:49.859598 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8998f4874-qrftg"
Apr 17 14:07:49.861131 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:49.861101 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-698df7f869-z6gnt" event={"ID":"dfc1e894-82f8-4269-9f75-b83666e2e572","Type":"ContainerStarted","Data":"f2c56fcddfa20fd47cca1e1b587b8f7a4caf313a41f234e3c5a60ca84eca7927"}
Apr 17 14:07:49.862906 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:49.862881 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-bdklc" event={"ID":"561daaa7-247f-4820-939b-375dd242bbf3","Type":"ContainerStarted","Data":"184027fcdc78f0c220004440ba0ba529b82eb7f14b7519007a833d1c5170762b"}
Apr 17 14:07:49.873564 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:49.873500 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8998f4874-qrftg" podStartSLOduration=13.509561056 podStartE2EDuration="19.873486327s" podCreationTimestamp="2026-04-17 14:07:30 +0000 UTC" firstStartedPulling="2026-04-17 14:07:42.522049745 +0000 UTC m=+36.420383016" lastFinishedPulling="2026-04-17 14:07:48.885974996 +0000 UTC m=+42.784308287" observedRunningTime="2026-04-17 14:07:49.872908291 +0000 UTC m=+43.771241586" watchObservedRunningTime="2026-04-17 14:07:49.873486327 +0000 UTC m=+43.771819620"
Apr 17 14:07:49.891366 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:49.891323 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-698df7f869-z6gnt" podStartSLOduration=13.498789695 podStartE2EDuration="19.891311765s" podCreationTimestamp="2026-04-17 14:07:30 +0000 UTC" firstStartedPulling="2026-04-17 14:07:42.478070346 +0000 UTC m=+36.376403617" lastFinishedPulling="2026-04-17 14:07:48.870592411 +0000 UTC m=+42.768925687" observedRunningTime="2026-04-17 14:07:49.890531313 +0000 UTC m=+43.788864609" watchObservedRunningTime="2026-04-17 14:07:49.891311765 +0000 UTC m=+43.789645059"
Apr 17 14:07:49.919358 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:49.919328 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/453f3789-5df6-45c5-bf09-cad968b4e751-metrics-tls\") pod \"dns-default-4cg5l\" (UID: \"453f3789-5df6-45c5-bf09-cad968b4e751\") " pod="openshift-dns/dns-default-4cg5l"
Apr 17 14:07:49.919504 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:49.919450 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/63e2cd45-279f-4653-a7e2-d58678593e5d-cert\") pod \"ingress-canary-cb9x6\" (UID: \"63e2cd45-279f-4653-a7e2-d58678593e5d\") " pod="openshift-ingress-canary/ingress-canary-cb9x6"
Apr 17 14:07:49.919576 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:49.919502 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 14:07:49.919576 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:49.919555 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 14:07:49.919576 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:49.919567 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/453f3789-5df6-45c5-bf09-cad968b4e751-metrics-tls podName:453f3789-5df6-45c5-bf09-cad968b4e751 nodeName:}" failed. No retries permitted until 2026-04-17 14:07:57.919546179 +0000 UTC m=+51.817879452 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/453f3789-5df6-45c5-bf09-cad968b4e751-metrics-tls") pod "dns-default-4cg5l" (UID: "453f3789-5df6-45c5-bf09-cad968b4e751") : secret "dns-default-metrics-tls" not found
Apr 17 14:07:49.919727 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:49.919599 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63e2cd45-279f-4653-a7e2-d58678593e5d-cert podName:63e2cd45-279f-4653-a7e2-d58678593e5d nodeName:}" failed. No retries permitted until 2026-04-17 14:07:57.919584266 +0000 UTC m=+51.817917545 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/63e2cd45-279f-4653-a7e2-d58678593e5d-cert") pod "ingress-canary-cb9x6" (UID: "63e2cd45-279f-4653-a7e2-d58678593e5d") : secret "canary-serving-cert" not found
Apr 17 14:07:51.868970 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:51.868939 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86cd74555b-z4hvk" event={"ID":"7a43d970-3485-47bf-8573-7539f592977a","Type":"ContainerStarted","Data":"d08353b3d1151e234ac708c21775bf5233de5220b737c82cc3759f4efc1bab13"}
Apr 17 14:07:52.873310 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:52.873268 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86cd74555b-z4hvk" event={"ID":"7a43d970-3485-47bf-8573-7539f592977a","Type":"ContainerStarted","Data":"204fff99ba42f59061610aaa6ffaf34176bd8df1ea165550a2a84803ce631241"}
Apr 17 14:07:52.891925 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:52.891876 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86cd74555b-z4hvk" podStartSLOduration=13.649351517 podStartE2EDuration="22.891862838s" podCreationTimestamp="2026-04-17 14:07:30 +0000 UTC" firstStartedPulling="2026-04-17 14:07:42.48757737 +0000 UTC m=+36.385910642" lastFinishedPulling="2026-04-17 14:07:51.73008869 +0000 UTC m=+45.628421963" observedRunningTime="2026-04-17 14:07:52.891001214 +0000 UTC m=+46.789334511" watchObservedRunningTime="2026-04-17 14:07:52.891862838 +0000 UTC m=+46.790196132"
Apr 17 14:07:53.876884 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:53.876858 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-bdklc"
event={"ID":"561daaa7-247f-4820-939b-375dd242bbf3","Type":"ContainerStarted","Data":"6ce4ecdb4777710d5fd4c06d9e9655ed5dc063029681e7e319112fbd8b35f7bc"}
Apr 17 14:07:53.890637 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:53.890594 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-bdklc" podStartSLOduration=18.160189735 podStartE2EDuration="22.890580861s" podCreationTimestamp="2026-04-17 14:07:31 +0000 UTC" firstStartedPulling="2026-04-17 14:07:48.990246709 +0000 UTC m=+42.888579985" lastFinishedPulling="2026-04-17 14:07:53.720637825 +0000 UTC m=+47.618971111" observedRunningTime="2026-04-17 14:07:53.890462651 +0000 UTC m=+47.788795973" watchObservedRunningTime="2026-04-17 14:07:53.890580861 +0000 UTC m=+47.788914154"
Apr 17 14:07:57.883966 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:57.883931 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/deebcc5e-3050-4882-a4ed-99aab7e0f695-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-tcvd8\" (UID: \"deebcc5e-3050-4882-a4ed-99aab7e0f695\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tcvd8"
Apr 17 14:07:57.884490 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:57.883978 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-registry-tls\") pod \"image-registry-6d59c9f569-ksngn\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") " pod="openshift-image-registry/image-registry-6d59c9f569-ksngn"
Apr 17 14:07:57.884490 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:57.884088 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 14:07:57.884490 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:57.884149 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deebcc5e-3050-4882-a4ed-99aab7e0f695-networking-console-plugin-cert podName:deebcc5e-3050-4882-a4ed-99aab7e0f695 nodeName:}" failed. No retries permitted until 2026-04-17 14:08:13.884131589 +0000 UTC m=+67.782464862 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/deebcc5e-3050-4882-a4ed-99aab7e0f695-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-tcvd8" (UID: "deebcc5e-3050-4882-a4ed-99aab7e0f695") : secret "networking-console-plugin-cert" not found
Apr 17 14:07:57.884490 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:57.884147 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 14:07:57.884490 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:57.884168 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d59c9f569-ksngn: secret "image-registry-tls" not found
Apr 17 14:07:57.884490 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:57.884218 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-registry-tls podName:d1c063ea-7517-4b61-9ad5-045a966d6633 nodeName:}" failed. No retries permitted until 2026-04-17 14:08:13.884203405 +0000 UTC m=+67.782536678 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-registry-tls") pod "image-registry-6d59c9f569-ksngn" (UID: "d1c063ea-7517-4b61-9ad5-045a966d6633") : secret "image-registry-tls" not found
Apr 17 14:07:57.985006 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:57.984960 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/453f3789-5df6-45c5-bf09-cad968b4e751-metrics-tls\") pod \"dns-default-4cg5l\" (UID: \"453f3789-5df6-45c5-bf09-cad968b4e751\") " pod="openshift-dns/dns-default-4cg5l"
Apr 17 14:07:57.985148 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:07:57.985054 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/63e2cd45-279f-4653-a7e2-d58678593e5d-cert\") pod \"ingress-canary-cb9x6\" (UID: \"63e2cd45-279f-4653-a7e2-d58678593e5d\") " pod="openshift-ingress-canary/ingress-canary-cb9x6"
Apr 17 14:07:57.985148 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:57.985094 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 14:07:57.985219 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:57.985150 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/453f3789-5df6-45c5-bf09-cad968b4e751-metrics-tls podName:453f3789-5df6-45c5-bf09-cad968b4e751 nodeName:}" failed. No retries permitted until 2026-04-17 14:08:13.985136531 +0000 UTC m=+67.883469809 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/453f3789-5df6-45c5-bf09-cad968b4e751-metrics-tls") pod "dns-default-4cg5l" (UID: "453f3789-5df6-45c5-bf09-cad968b4e751") : secret "dns-default-metrics-tls" not found
Apr 17 14:07:57.985219 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:57.985149 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 14:07:57.985219 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:07:57.985175 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63e2cd45-279f-4653-a7e2-d58678593e5d-cert podName:63e2cd45-279f-4653-a7e2-d58678593e5d nodeName:}" failed. No retries permitted until 2026-04-17 14:08:13.98516786 +0000 UTC m=+67.883501131 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/63e2cd45-279f-4653-a7e2-d58678593e5d-cert") pod "ingress-canary-cb9x6" (UID: "63e2cd45-279f-4653-a7e2-d58678593e5d") : secret "canary-serving-cert" not found
Apr 17 14:08:07.835169 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:08:07.835141 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5h9m4"
Apr 17 14:08:12.394093 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:08:12.394058 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e32c2b-962f-40a8-9171-5be908eeee49-metrics-certs\") pod \"network-metrics-daemon-7nttt\" (UID: \"86e32c2b-962f-40a8-9171-5be908eeee49\") " pod="openshift-multus/network-metrics-daemon-7nttt"
Apr 17 14:08:12.396641 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:08:12.396623 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 14:08:12.404533 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:08:12.404517
2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 14:08:12.404584 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:08:12.404567 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e32c2b-962f-40a8-9171-5be908eeee49-metrics-certs podName:86e32c2b-962f-40a8-9171-5be908eeee49 nodeName:}" failed. No retries permitted until 2026-04-17 14:09:16.404552432 +0000 UTC m=+130.302885704 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86e32c2b-962f-40a8-9171-5be908eeee49-metrics-certs") pod "network-metrics-daemon-7nttt" (UID: "86e32c2b-962f-40a8-9171-5be908eeee49") : secret "metrics-daemon-secret" not found
Apr 17 14:08:12.495311 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:08:12.495286 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvpgl\" (UniqueName: \"kubernetes.io/projected/4a41c39c-5f5e-4ad7-8c82-dac16f59266d-kube-api-access-kvpgl\") pod \"network-check-target-spfv9\" (UID: \"4a41c39c-5f5e-4ad7-8c82-dac16f59266d\") " pod="openshift-network-diagnostics/network-check-target-spfv9"
Apr 17 14:08:12.497988 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:08:12.497973 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 14:08:12.508071 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:08:12.508052 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 14:08:12.519183 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:08:12.519165 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvpgl\" (UniqueName: \"kubernetes.io/projected/4a41c39c-5f5e-4ad7-8c82-dac16f59266d-kube-api-access-kvpgl\") pod \"network-check-target-spfv9\" (UID: \"4a41c39c-5f5e-4ad7-8c82-dac16f59266d\") " pod="openshift-network-diagnostics/network-check-target-spfv9"
Apr 17 14:08:12.528238 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:08:12.528221 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-x9vkl\""
Apr 17 14:08:12.536214 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:08:12.536202 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-spfv9"
Apr 17 14:08:12.647511 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:08:12.647440 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-spfv9"]
Apr 17 14:08:12.650478 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:08:12.650454 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a41c39c_5f5e_4ad7_8c82_dac16f59266d.slice/crio-401a15913d00e212406ac27326adf614dde04ff5d5e43b2d50c85afcce079ba2 WatchSource:0}: Error finding container 401a15913d00e212406ac27326adf614dde04ff5d5e43b2d50c85afcce079ba2: Status 404 returned error can't find the container with id 401a15913d00e212406ac27326adf614dde04ff5d5e43b2d50c85afcce079ba2
Apr 17 14:08:12.925409 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:08:12.925318 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-spfv9" event={"ID":"4a41c39c-5f5e-4ad7-8c82-dac16f59266d","Type":"ContainerStarted","Data":"401a15913d00e212406ac27326adf614dde04ff5d5e43b2d50c85afcce079ba2"}
Apr 17 14:08:13.906380 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:08:13.906340 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/deebcc5e-3050-4882-a4ed-99aab7e0f695-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-tcvd8\" (UID: \"deebcc5e-3050-4882-a4ed-99aab7e0f695\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tcvd8"
Apr 17 14:08:13.906833 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:08:13.906392 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-registry-tls\") pod \"image-registry-6d59c9f569-ksngn\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") " pod="openshift-image-registry/image-registry-6d59c9f569-ksngn"
Apr 17 14:08:13.906833 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:08:13.906506 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 14:08:13.906833 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:08:13.906518 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 14:08:13.906833 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:08:13.906532 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d59c9f569-ksngn: secret "image-registry-tls" not found
Apr 17 14:08:13.906833 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:08:13.906584 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deebcc5e-3050-4882-a4ed-99aab7e0f695-networking-console-plugin-cert podName:deebcc5e-3050-4882-a4ed-99aab7e0f695 nodeName:}" failed. No retries permitted until 2026-04-17 14:08:45.906566479 +0000 UTC m=+99.804899754 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/deebcc5e-3050-4882-a4ed-99aab7e0f695-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-tcvd8" (UID: "deebcc5e-3050-4882-a4ed-99aab7e0f695") : secret "networking-console-plugin-cert" not found
Apr 17 14:08:13.906833 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:08:13.906599 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-registry-tls podName:d1c063ea-7517-4b61-9ad5-045a966d6633 nodeName:}" failed. No retries permitted until 2026-04-17 14:08:45.906593008 +0000 UTC m=+99.804926280 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-registry-tls") pod "image-registry-6d59c9f569-ksngn" (UID: "d1c063ea-7517-4b61-9ad5-045a966d6633") : secret "image-registry-tls" not found
Apr 17 14:08:14.006754 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:08:14.006706 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/63e2cd45-279f-4653-a7e2-d58678593e5d-cert\") pod \"ingress-canary-cb9x6\" (UID: \"63e2cd45-279f-4653-a7e2-d58678593e5d\") " pod="openshift-ingress-canary/ingress-canary-cb9x6"
Apr 17 14:08:14.006935 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:08:14.006785 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/453f3789-5df6-45c5-bf09-cad968b4e751-metrics-tls\") pod \"dns-default-4cg5l\" (UID: \"453f3789-5df6-45c5-bf09-cad968b4e751\") " pod="openshift-dns/dns-default-4cg5l"
Apr 17 14:08:14.006935 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:08:14.006858 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 14:08:14.006935
ip-10-0-140-205 kubenswrapper[2578]: E0417 14:08:14.006891 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 14:08:14.006935 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:08:14.006923 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63e2cd45-279f-4653-a7e2-d58678593e5d-cert podName:63e2cd45-279f-4653-a7e2-d58678593e5d nodeName:}" failed. No retries permitted until 2026-04-17 14:08:46.006907425 +0000 UTC m=+99.905240720 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/63e2cd45-279f-4653-a7e2-d58678593e5d-cert") pod "ingress-canary-cb9x6" (UID: "63e2cd45-279f-4653-a7e2-d58678593e5d") : secret "canary-serving-cert" not found
Apr 17 14:08:14.007089 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:08:14.006942 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/453f3789-5df6-45c5-bf09-cad968b4e751-metrics-tls podName:453f3789-5df6-45c5-bf09-cad968b4e751 nodeName:}" failed. No retries permitted until 2026-04-17 14:08:46.006929284 +0000 UTC m=+99.905262556 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/453f3789-5df6-45c5-bf09-cad968b4e751-metrics-tls") pod "dns-default-4cg5l" (UID: "453f3789-5df6-45c5-bf09-cad968b4e751") : secret "dns-default-metrics-tls" not found
Apr 17 14:08:17.939182 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:08:17.939146 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-spfv9" event={"ID":"4a41c39c-5f5e-4ad7-8c82-dac16f59266d","Type":"ContainerStarted","Data":"728f791341239cea2523618b3503ca6378ea5dfd1dc77b4c3f3db1bc32ea0cbf"}
Apr 17 14:08:17.939579 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:08:17.939288 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-spfv9"
Apr 17 14:08:17.958002 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:08:17.957957 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-spfv9" podStartSLOduration=67.230570645 podStartE2EDuration="1m11.957945432s" podCreationTimestamp="2026-04-17 14:07:06 +0000 UTC" firstStartedPulling="2026-04-17 14:08:12.652313464 +0000 UTC m=+66.550646741" lastFinishedPulling="2026-04-17 14:08:17.379688257 +0000 UTC m=+71.278021528" observedRunningTime="2026-04-17 14:08:17.957122425 +0000 UTC m=+71.855455738" watchObservedRunningTime="2026-04-17 14:08:17.957945432 +0000 UTC m=+71.856278726"
Apr 17 14:08:45.941420 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:08:45.941382 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/deebcc5e-3050-4882-a4ed-99aab7e0f695-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-tcvd8\" (UID: \"deebcc5e-3050-4882-a4ed-99aab7e0f695\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tcvd8"
Apr 17 14:08:45.941420 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:08:45.941440 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-registry-tls\") pod \"image-registry-6d59c9f569-ksngn\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") " pod="openshift-image-registry/image-registry-6d59c9f569-ksngn"
Apr 17 14:08:45.941878 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:08:45.941518 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 14:08:45.941878 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:08:45.941554 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 14:08:45.941878 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:08:45.941565 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d59c9f569-ksngn: secret "image-registry-tls" not found
Apr 17 14:08:45.941878 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:08:45.941591 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deebcc5e-3050-4882-a4ed-99aab7e0f695-networking-console-plugin-cert podName:deebcc5e-3050-4882-a4ed-99aab7e0f695 nodeName:}" failed. No retries permitted until 2026-04-17 14:09:49.94157572 +0000 UTC m=+163.839908992 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/deebcc5e-3050-4882-a4ed-99aab7e0f695-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-tcvd8" (UID: "deebcc5e-3050-4882-a4ed-99aab7e0f695") : secret "networking-console-plugin-cert" not found
Apr 17 14:08:45.941878 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:08:45.941610 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-registry-tls podName:d1c063ea-7517-4b61-9ad5-045a966d6633 nodeName:}" failed. No retries permitted until 2026-04-17 14:09:49.941597691 +0000 UTC m=+163.839930964 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-registry-tls") pod "image-registry-6d59c9f569-ksngn" (UID: "d1c063ea-7517-4b61-9ad5-045a966d6633") : secret "image-registry-tls" not found
Apr 17 14:08:46.042672 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:08:46.042644 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/63e2cd45-279f-4653-a7e2-d58678593e5d-cert\") pod \"ingress-canary-cb9x6\" (UID: \"63e2cd45-279f-4653-a7e2-d58678593e5d\") " pod="openshift-ingress-canary/ingress-canary-cb9x6"
Apr 17 14:08:46.042806 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:08:46.042695 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/453f3789-5df6-45c5-bf09-cad968b4e751-metrics-tls\") pod \"dns-default-4cg5l\" (UID: \"453f3789-5df6-45c5-bf09-cad968b4e751\") " pod="openshift-dns/dns-default-4cg5l"
Apr 17 14:08:46.042806 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:08:46.042785 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 14:08:46.042806 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:08:46.042786 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 14:08:46.042904 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:08:46.042830 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/453f3789-5df6-45c5-bf09-cad968b4e751-metrics-tls podName:453f3789-5df6-45c5-bf09-cad968b4e751 nodeName:}" failed. No retries permitted until 2026-04-17 14:09:50.042817997 +0000 UTC m=+163.941151269 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/453f3789-5df6-45c5-bf09-cad968b4e751-metrics-tls") pod "dns-default-4cg5l" (UID: "453f3789-5df6-45c5-bf09-cad968b4e751") : secret "dns-default-metrics-tls" not found
Apr 17 14:08:46.042904 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:08:46.042842 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63e2cd45-279f-4653-a7e2-d58678593e5d-cert podName:63e2cd45-279f-4653-a7e2-d58678593e5d nodeName:}" failed. No retries permitted until 2026-04-17 14:09:50.04283672 +0000 UTC m=+163.941169992 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/63e2cd45-279f-4653-a7e2-d58678593e5d-cert") pod "ingress-canary-cb9x6" (UID: "63e2cd45-279f-4653-a7e2-d58678593e5d") : secret "canary-serving-cert" not found
Apr 17 14:08:48.943955 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:08:48.943925 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-spfv9"
Apr 17 14:09:16.460108 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:16.460071 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e32c2b-962f-40a8-9171-5be908eeee49-metrics-certs\") pod \"network-metrics-daemon-7nttt\" (UID: \"86e32c2b-962f-40a8-9171-5be908eeee49\") " pod="openshift-multus/network-metrics-daemon-7nttt"
Apr 17 14:09:16.460609 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:09:16.460191 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 14:09:16.460609 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:09:16.460250 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86e32c2b-962f-40a8-9171-5be908eeee49-metrics-certs podName:86e32c2b-962f-40a8-9171-5be908eeee49 nodeName:}" failed. No retries permitted until 2026-04-17 14:11:18.460235836 +0000 UTC m=+252.358569109 (durationBeforeRetry 2m2s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86e32c2b-962f-40a8-9171-5be908eeee49-metrics-certs") pod "network-metrics-daemon-7nttt" (UID: "86e32c2b-962f-40a8-9171-5be908eeee49") : secret "metrics-daemon-secret" not found Apr 17 14:09:18.751216 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:18.751188 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-vnnct_8411d38d-222c-4fc3-9890-84fb21b47535/dns-node-resolver/0.log" Apr 17 14:09:19.350705 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:19.350680 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4wkj7_fb36c782-6ab6-4f4d-91bc-6792c5debe41/node-ca/0.log" Apr 17 14:09:45.017916 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:09:45.017874 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" podUID="d1c063ea-7517-4b61-9ad5-045a966d6633" Apr 17 14:09:45.065348 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:09:45.065310 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-tcvd8" podUID="deebcc5e-3050-4882-a4ed-99aab7e0f695" Apr 17 14:09:45.092137 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:09:45.092103 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-4cg5l" podUID="453f3789-5df6-45c5-bf09-cad968b4e751" Apr 17 14:09:45.121298 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:09:45.121255 2578 pod_workers.go:1301] "Error syncing pod, skipping" 
err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-cb9x6" podUID="63e2cd45-279f-4653-a7e2-d58678593e5d" Apr 17 14:09:45.135197 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:45.135161 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" Apr 17 14:09:45.135320 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:45.135233 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4cg5l" Apr 17 14:09:45.135320 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:45.135253 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-tcvd8" Apr 17 14:09:46.738372 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:09:46.738334 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-7nttt" podUID="86e32c2b-962f-40a8-9171-5be908eeee49" Apr 17 14:09:48.495200 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:48.495164 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-d88zh"] Apr 17 14:09:48.498674 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:48.498653 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-d88zh" Apr 17 14:09:48.501924 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:48.501908 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 14:09:48.501924 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:48.501921 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 14:09:48.503294 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:48.503267 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 14:09:48.503408 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:48.503384 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-cl66l\"" Apr 17 14:09:48.503690 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:48.503674 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 14:09:48.520333 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:48.520312 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-d88zh"] Apr 17 14:09:48.598876 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:48.598846 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1659be2e-44e8-44dd-aa82-e6c78b31f5e1-crio-socket\") pod \"insights-runtime-extractor-d88zh\" (UID: \"1659be2e-44e8-44dd-aa82-e6c78b31f5e1\") " pod="openshift-insights/insights-runtime-extractor-d88zh" Apr 17 14:09:48.598876 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:48.598878 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1659be2e-44e8-44dd-aa82-e6c78b31f5e1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-d88zh\" (UID: \"1659be2e-44e8-44dd-aa82-e6c78b31f5e1\") " pod="openshift-insights/insights-runtime-extractor-d88zh" Apr 17 14:09:48.599114 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:48.598910 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1659be2e-44e8-44dd-aa82-e6c78b31f5e1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-d88zh\" (UID: \"1659be2e-44e8-44dd-aa82-e6c78b31f5e1\") " pod="openshift-insights/insights-runtime-extractor-d88zh" Apr 17 14:09:48.599114 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:48.598937 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d267n\" (UniqueName: \"kubernetes.io/projected/1659be2e-44e8-44dd-aa82-e6c78b31f5e1-kube-api-access-d267n\") pod \"insights-runtime-extractor-d88zh\" (UID: \"1659be2e-44e8-44dd-aa82-e6c78b31f5e1\") " pod="openshift-insights/insights-runtime-extractor-d88zh" Apr 17 14:09:48.599114 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:48.599084 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1659be2e-44e8-44dd-aa82-e6c78b31f5e1-data-volume\") pod \"insights-runtime-extractor-d88zh\" (UID: \"1659be2e-44e8-44dd-aa82-e6c78b31f5e1\") " pod="openshift-insights/insights-runtime-extractor-d88zh" Apr 17 14:09:48.699715 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:48.699679 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1659be2e-44e8-44dd-aa82-e6c78b31f5e1-crio-socket\") pod \"insights-runtime-extractor-d88zh\" (UID: \"1659be2e-44e8-44dd-aa82-e6c78b31f5e1\") " 
pod="openshift-insights/insights-runtime-extractor-d88zh" Apr 17 14:09:48.699836 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:48.699729 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1659be2e-44e8-44dd-aa82-e6c78b31f5e1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-d88zh\" (UID: \"1659be2e-44e8-44dd-aa82-e6c78b31f5e1\") " pod="openshift-insights/insights-runtime-extractor-d88zh" Apr 17 14:09:48.699836 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:48.699783 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1659be2e-44e8-44dd-aa82-e6c78b31f5e1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-d88zh\" (UID: \"1659be2e-44e8-44dd-aa82-e6c78b31f5e1\") " pod="openshift-insights/insights-runtime-extractor-d88zh" Apr 17 14:09:48.699836 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:48.699805 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1659be2e-44e8-44dd-aa82-e6c78b31f5e1-crio-socket\") pod \"insights-runtime-extractor-d88zh\" (UID: \"1659be2e-44e8-44dd-aa82-e6c78b31f5e1\") " pod="openshift-insights/insights-runtime-extractor-d88zh" Apr 17 14:09:48.699836 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:48.699814 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d267n\" (UniqueName: \"kubernetes.io/projected/1659be2e-44e8-44dd-aa82-e6c78b31f5e1-kube-api-access-d267n\") pod \"insights-runtime-extractor-d88zh\" (UID: \"1659be2e-44e8-44dd-aa82-e6c78b31f5e1\") " pod="openshift-insights/insights-runtime-extractor-d88zh" Apr 17 14:09:48.699977 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:48.699927 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/1659be2e-44e8-44dd-aa82-e6c78b31f5e1-data-volume\") pod \"insights-runtime-extractor-d88zh\" (UID: \"1659be2e-44e8-44dd-aa82-e6c78b31f5e1\") " pod="openshift-insights/insights-runtime-extractor-d88zh" Apr 17 14:09:48.700320 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:48.700303 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1659be2e-44e8-44dd-aa82-e6c78b31f5e1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-d88zh\" (UID: \"1659be2e-44e8-44dd-aa82-e6c78b31f5e1\") " pod="openshift-insights/insights-runtime-extractor-d88zh" Apr 17 14:09:48.700813 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:48.700795 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1659be2e-44e8-44dd-aa82-e6c78b31f5e1-data-volume\") pod \"insights-runtime-extractor-d88zh\" (UID: \"1659be2e-44e8-44dd-aa82-e6c78b31f5e1\") " pod="openshift-insights/insights-runtime-extractor-d88zh" Apr 17 14:09:48.702252 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:48.702230 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1659be2e-44e8-44dd-aa82-e6c78b31f5e1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-d88zh\" (UID: \"1659be2e-44e8-44dd-aa82-e6c78b31f5e1\") " pod="openshift-insights/insights-runtime-extractor-d88zh" Apr 17 14:09:48.707448 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:48.707416 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d267n\" (UniqueName: \"kubernetes.io/projected/1659be2e-44e8-44dd-aa82-e6c78b31f5e1-kube-api-access-d267n\") pod \"insights-runtime-extractor-d88zh\" (UID: \"1659be2e-44e8-44dd-aa82-e6c78b31f5e1\") " pod="openshift-insights/insights-runtime-extractor-d88zh" Apr 17 14:09:48.807023 ip-10-0-140-205 kubenswrapper[2578]: 
I0417 14:09:48.806955 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-d88zh" Apr 17 14:09:48.915883 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:48.915855 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-d88zh"] Apr 17 14:09:48.918936 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:09:48.918908 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1659be2e_44e8_44dd_aa82_e6c78b31f5e1.slice/crio-3c571f16b3f5bc1d2d0c357a5fd51f69eee2037cad64f64da67eafb9e7c39488 WatchSource:0}: Error finding container 3c571f16b3f5bc1d2d0c357a5fd51f69eee2037cad64f64da67eafb9e7c39488: Status 404 returned error can't find the container with id 3c571f16b3f5bc1d2d0c357a5fd51f69eee2037cad64f64da67eafb9e7c39488 Apr 17 14:09:49.146192 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:49.146164 2578 generic.go:358] "Generic (PLEG): container finished" podID="7003ffeb-03c7-4dc4-8b72-73c896410c2c" containerID="6c22c3e84d02c3856e0bdd99de5d4fea0ff35325d1f2413414ee89ccd00af8cf" exitCode=1 Apr 17 14:09:49.146287 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:49.146236 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8998f4874-qrftg" event={"ID":"7003ffeb-03c7-4dc4-8b72-73c896410c2c","Type":"ContainerDied","Data":"6c22c3e84d02c3856e0bdd99de5d4fea0ff35325d1f2413414ee89ccd00af8cf"} Apr 17 14:09:49.146659 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:49.146621 2578 scope.go:117] "RemoveContainer" containerID="6c22c3e84d02c3856e0bdd99de5d4fea0ff35325d1f2413414ee89ccd00af8cf" Apr 17 14:09:49.147404 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:49.147381 2578 generic.go:358] "Generic (PLEG): container finished" podID="dfc1e894-82f8-4269-9f75-b83666e2e572" 
containerID="f2c56fcddfa20fd47cca1e1b587b8f7a4caf313a41f234e3c5a60ca84eca7927" exitCode=255 Apr 17 14:09:49.147509 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:49.147474 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-698df7f869-z6gnt" event={"ID":"dfc1e894-82f8-4269-9f75-b83666e2e572","Type":"ContainerDied","Data":"f2c56fcddfa20fd47cca1e1b587b8f7a4caf313a41f234e3c5a60ca84eca7927"} Apr 17 14:09:49.147793 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:49.147771 2578 scope.go:117] "RemoveContainer" containerID="f2c56fcddfa20fd47cca1e1b587b8f7a4caf313a41f234e3c5a60ca84eca7927" Apr 17 14:09:49.148664 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:49.148632 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d88zh" event={"ID":"1659be2e-44e8-44dd-aa82-e6c78b31f5e1","Type":"ContainerStarted","Data":"3a2e2280bb9097b54c6c744630f8be6ced68a1ad513403e23d0d13011daaea98"} Apr 17 14:09:49.148664 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:49.148658 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d88zh" event={"ID":"1659be2e-44e8-44dd-aa82-e6c78b31f5e1","Type":"ContainerStarted","Data":"3c571f16b3f5bc1d2d0c357a5fd51f69eee2037cad64f64da67eafb9e7c39488"} Apr 17 14:09:49.858295 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:49.858258 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8998f4874-qrftg" Apr 17 14:09:50.011813 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:50.011771 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/deebcc5e-3050-4882-a4ed-99aab7e0f695-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-tcvd8\" (UID: 
\"deebcc5e-3050-4882-a4ed-99aab7e0f695\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tcvd8" Apr 17 14:09:50.011813 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:50.011818 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-registry-tls\") pod \"image-registry-6d59c9f569-ksngn\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") " pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" Apr 17 14:09:50.014032 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:50.014005 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/deebcc5e-3050-4882-a4ed-99aab7e0f695-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-tcvd8\" (UID: \"deebcc5e-3050-4882-a4ed-99aab7e0f695\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tcvd8" Apr 17 14:09:50.014183 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:50.014161 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-registry-tls\") pod \"image-registry-6d59c9f569-ksngn\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") " pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" Apr 17 14:09:50.112611 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:50.112567 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/63e2cd45-279f-4653-a7e2-d58678593e5d-cert\") pod \"ingress-canary-cb9x6\" (UID: \"63e2cd45-279f-4653-a7e2-d58678593e5d\") " pod="openshift-ingress-canary/ingress-canary-cb9x6" Apr 17 14:09:50.112768 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:50.112645 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/453f3789-5df6-45c5-bf09-cad968b4e751-metrics-tls\") pod \"dns-default-4cg5l\" (UID: \"453f3789-5df6-45c5-bf09-cad968b4e751\") " pod="openshift-dns/dns-default-4cg5l" Apr 17 14:09:50.114905 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:50.114880 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/63e2cd45-279f-4653-a7e2-d58678593e5d-cert\") pod \"ingress-canary-cb9x6\" (UID: \"63e2cd45-279f-4653-a7e2-d58678593e5d\") " pod="openshift-ingress-canary/ingress-canary-cb9x6" Apr 17 14:09:50.115002 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:50.114902 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/453f3789-5df6-45c5-bf09-cad968b4e751-metrics-tls\") pod \"dns-default-4cg5l\" (UID: \"453f3789-5df6-45c5-bf09-cad968b4e751\") " pod="openshift-dns/dns-default-4cg5l" Apr 17 14:09:50.151955 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:50.151923 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d88zh" event={"ID":"1659be2e-44e8-44dd-aa82-e6c78b31f5e1","Type":"ContainerStarted","Data":"3a7631ba85a57aa9bcf3e7387a3900c71de9b2deeffa08b82e5536e301671961"} Apr 17 14:09:50.153399 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:50.153370 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8998f4874-qrftg" event={"ID":"7003ffeb-03c7-4dc4-8b72-73c896410c2c","Type":"ContainerStarted","Data":"6a41226079adbf5ff1e056580b8f041318dc0303270de0c1d0e2a5ad82b39bce"} Apr 17 14:09:50.153687 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:50.153633 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8998f4874-qrftg" Apr 17 14:09:50.154306 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:50.154283 2578 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8998f4874-qrftg" Apr 17 14:09:50.154972 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:50.154954 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-698df7f869-z6gnt" event={"ID":"dfc1e894-82f8-4269-9f75-b83666e2e572","Type":"ContainerStarted","Data":"07275fec589aabf14c90f961145bfc24b6e9caf427c37a051acd153b99009ad5"} Apr 17 14:09:50.238882 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:50.238802 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-dvsh7\"" Apr 17 14:09:50.238980 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:50.238912 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fnmsv\"" Apr 17 14:09:50.238980 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:50.238912 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-cs8cq\"" Apr 17 14:09:50.246978 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:50.246954 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" Apr 17 14:09:50.246978 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:50.246971 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4cg5l" Apr 17 14:09:50.247149 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:50.246960 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-tcvd8" Apr 17 14:09:50.377078 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:50.377050 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4cg5l"] Apr 17 14:09:50.379121 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:09:50.379038 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod453f3789_5df6_45c5_bf09_cad968b4e751.slice/crio-f909331b046086819f626d964c60910b020b6ee5a069d8e7323a5fc857f9b807 WatchSource:0}: Error finding container f909331b046086819f626d964c60910b020b6ee5a069d8e7323a5fc857f9b807: Status 404 returned error can't find the container with id f909331b046086819f626d964c60910b020b6ee5a069d8e7323a5fc857f9b807 Apr 17 14:09:50.392309 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:50.392279 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6d59c9f569-ksngn"] Apr 17 14:09:50.396327 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:09:50.396301 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1c063ea_7517_4b61_9ad5_045a966d6633.slice/crio-4963ba0630d1f45daa18c940693bb4a217ca9c03ab15c1e770ec6434d02d3715 WatchSource:0}: Error finding container 4963ba0630d1f45daa18c940693bb4a217ca9c03ab15c1e770ec6434d02d3715: Status 404 returned error can't find the container with id 4963ba0630d1f45daa18c940693bb4a217ca9c03ab15c1e770ec6434d02d3715 Apr 17 14:09:50.408318 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:50.408295 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-tcvd8"] Apr 17 14:09:50.411153 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:09:50.411132 2578 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddeebcc5e_3050_4882_a4ed_99aab7e0f695.slice/crio-3a5127a8a9fa99bd197d823edf51ce89764cd4ec48f4195cb6a026e8d8400791 WatchSource:0}: Error finding container 3a5127a8a9fa99bd197d823edf51ce89764cd4ec48f4195cb6a026e8d8400791: Status 404 returned error can't find the container with id 3a5127a8a9fa99bd197d823edf51ce89764cd4ec48f4195cb6a026e8d8400791 Apr 17 14:09:51.159523 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:51.159487 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-tcvd8" event={"ID":"deebcc5e-3050-4882-a4ed-99aab7e0f695","Type":"ContainerStarted","Data":"3a5127a8a9fa99bd197d823edf51ce89764cd4ec48f4195cb6a026e8d8400791"} Apr 17 14:09:51.160713 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:51.160675 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4cg5l" event={"ID":"453f3789-5df6-45c5-bf09-cad968b4e751","Type":"ContainerStarted","Data":"f909331b046086819f626d964c60910b020b6ee5a069d8e7323a5fc857f9b807"} Apr 17 14:09:51.162266 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:51.162236 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" event={"ID":"d1c063ea-7517-4b61-9ad5-045a966d6633","Type":"ContainerStarted","Data":"c20a6a40625e69ac87a4dc88c43004f90327649db1fe18c4de515f80a4a618ee"} Apr 17 14:09:51.162403 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:51.162275 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" event={"ID":"d1c063ea-7517-4b61-9ad5-045a966d6633","Type":"ContainerStarted","Data":"4963ba0630d1f45daa18c940693bb4a217ca9c03ab15c1e770ec6434d02d3715"} Apr 17 14:09:51.162605 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:51.162563 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" Apr 17 14:09:51.183334 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:51.183290 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" podStartSLOduration=164.183272075 podStartE2EDuration="2m44.183272075s" podCreationTimestamp="2026-04-17 14:07:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:09:51.181907733 +0000 UTC m=+165.080241027" watchObservedRunningTime="2026-04-17 14:09:51.183272075 +0000 UTC m=+165.081605371" Apr 17 14:09:52.166190 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:52.166147 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-tcvd8" event={"ID":"deebcc5e-3050-4882-a4ed-99aab7e0f695","Type":"ContainerStarted","Data":"58cb5869c7ff3ba1fbaf6f60791513486927073f3d8243c4bb359a5350e2064c"} Apr 17 14:09:52.182916 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:52.182868 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-tcvd8" podStartSLOduration=164.150379896 podStartE2EDuration="2m45.182853903s" podCreationTimestamp="2026-04-17 14:07:07 +0000 UTC" firstStartedPulling="2026-04-17 14:09:50.412720006 +0000 UTC m=+164.311053283" lastFinishedPulling="2026-04-17 14:09:51.445194008 +0000 UTC m=+165.343527290" observedRunningTime="2026-04-17 14:09:52.181958972 +0000 UTC m=+166.080292267" watchObservedRunningTime="2026-04-17 14:09:52.182853903 +0000 UTC m=+166.081187196" Apr 17 14:09:53.169768 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:53.169730 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d88zh" 
event={"ID":"1659be2e-44e8-44dd-aa82-e6c78b31f5e1","Type":"ContainerStarted","Data":"c1fedf5277e974cd8ad78f00304718e3fb2ac17e25c50fcfe3d095d294ef2103"}
Apr 17 14:09:53.171202 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:53.171176 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4cg5l" event={"ID":"453f3789-5df6-45c5-bf09-cad968b4e751","Type":"ContainerStarted","Data":"d1b0c13dc056ef507762d9e378f709dd44b1cfe3e88f27aca319645d01b267b6"}
Apr 17 14:09:53.171202 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:53.171206 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4cg5l" event={"ID":"453f3789-5df6-45c5-bf09-cad968b4e751","Type":"ContainerStarted","Data":"3603540fd41f263aab4d5ae959da68d180eefd9aa7922a0bfbaa37154ce2c45a"}
Apr 17 14:09:53.185481 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:53.185419 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-d88zh" podStartSLOduration=1.578965073 podStartE2EDuration="5.1854048s" podCreationTimestamp="2026-04-17 14:09:48 +0000 UTC" firstStartedPulling="2026-04-17 14:09:48.979710654 +0000 UTC m=+162.878043926" lastFinishedPulling="2026-04-17 14:09:52.58615038 +0000 UTC m=+166.484483653" observedRunningTime="2026-04-17 14:09:53.185333352 +0000 UTC m=+167.083666647" watchObservedRunningTime="2026-04-17 14:09:53.1854048 +0000 UTC m=+167.083738092"
Apr 17 14:09:53.203971 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:53.201699 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-4cg5l" podStartSLOduration=129.99388599 podStartE2EDuration="2m12.201685357s" podCreationTimestamp="2026-04-17 14:07:41 +0000 UTC" firstStartedPulling="2026-04-17 14:09:50.381097581 +0000 UTC m=+164.279430852" lastFinishedPulling="2026-04-17 14:09:52.588896937 +0000 UTC m=+166.487230219" observedRunningTime="2026-04-17 14:09:53.200701329 +0000 UTC m=+167.099034623" watchObservedRunningTime="2026-04-17 14:09:53.201685357 +0000 UTC m=+167.100018652"
Apr 17 14:09:54.174526 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:54.174492 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-4cg5l"
Apr 17 14:09:57.708703 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:57.708625 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cb9x6"
Apr 17 14:09:57.711231 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:57.711213 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-nsnr8\""
Apr 17 14:09:57.719269 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:57.719252 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cb9x6"
Apr 17 14:09:57.811683 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:57.811656 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-wt2hd"]
Apr 17 14:09:57.816709 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:57.816689 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-wt2hd"
Apr 17 14:09:57.819145 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:57.819114 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 14:09:57.819263 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:57.819209 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 14:09:57.819437 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:57.819412 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 14:09:57.820373 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:57.820352 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 14:09:57.820373 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:57.820365 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-t554z\""
Apr 17 14:09:57.820606 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:57.820495 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 14:09:57.820774 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:57.820758 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 14:09:57.831323 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:57.831303 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cb9x6"]
Apr 17 14:09:57.834210 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:09:57.834188 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63e2cd45_279f_4653_a7e2_d58678593e5d.slice/crio-5cdfd018da8c2a5ff9d642c4b8e4b6702b273c8cd55f8498924ac547489e7ae1 WatchSource:0}: Error finding container 5cdfd018da8c2a5ff9d642c4b8e4b6702b273c8cd55f8498924ac547489e7ae1: Status 404 returned error can't find the container with id 5cdfd018da8c2a5ff9d642c4b8e4b6702b273c8cd55f8498924ac547489e7ae1
Apr 17 14:09:57.975824 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:57.975751 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/92e23e6b-9e35-43f9-8199-d37ec0776cd7-node-exporter-textfile\") pod \"node-exporter-wt2hd\" (UID: \"92e23e6b-9e35-43f9-8199-d37ec0776cd7\") " pod="openshift-monitoring/node-exporter-wt2hd"
Apr 17 14:09:57.975824 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:57.975785 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/92e23e6b-9e35-43f9-8199-d37ec0776cd7-sys\") pod \"node-exporter-wt2hd\" (UID: \"92e23e6b-9e35-43f9-8199-d37ec0776cd7\") " pod="openshift-monitoring/node-exporter-wt2hd"
Apr 17 14:09:57.975824 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:57.975804 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/92e23e6b-9e35-43f9-8199-d37ec0776cd7-metrics-client-ca\") pod \"node-exporter-wt2hd\" (UID: \"92e23e6b-9e35-43f9-8199-d37ec0776cd7\") " pod="openshift-monitoring/node-exporter-wt2hd"
Apr 17 14:09:57.975824 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:57.975820 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/92e23e6b-9e35-43f9-8199-d37ec0776cd7-root\") pod \"node-exporter-wt2hd\" (UID: \"92e23e6b-9e35-43f9-8199-d37ec0776cd7\") " pod="openshift-monitoring/node-exporter-wt2hd"
Apr 17 14:09:57.976046 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:57.975840 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/92e23e6b-9e35-43f9-8199-d37ec0776cd7-node-exporter-tls\") pod \"node-exporter-wt2hd\" (UID: \"92e23e6b-9e35-43f9-8199-d37ec0776cd7\") " pod="openshift-monitoring/node-exporter-wt2hd"
Apr 17 14:09:57.976046 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:57.975904 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/92e23e6b-9e35-43f9-8199-d37ec0776cd7-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wt2hd\" (UID: \"92e23e6b-9e35-43f9-8199-d37ec0776cd7\") " pod="openshift-monitoring/node-exporter-wt2hd"
Apr 17 14:09:57.976046 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:57.975931 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/92e23e6b-9e35-43f9-8199-d37ec0776cd7-node-exporter-accelerators-collector-config\") pod \"node-exporter-wt2hd\" (UID: \"92e23e6b-9e35-43f9-8199-d37ec0776cd7\") " pod="openshift-monitoring/node-exporter-wt2hd"
Apr 17 14:09:57.976046 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:57.975954 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz48n\" (UniqueName: \"kubernetes.io/projected/92e23e6b-9e35-43f9-8199-d37ec0776cd7-kube-api-access-cz48n\") pod \"node-exporter-wt2hd\" (UID: \"92e23e6b-9e35-43f9-8199-d37ec0776cd7\") " pod="openshift-monitoring/node-exporter-wt2hd"
Apr 17 14:09:57.976046 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:57.975973 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/92e23e6b-9e35-43f9-8199-d37ec0776cd7-node-exporter-wtmp\") pod \"node-exporter-wt2hd\" (UID: \"92e23e6b-9e35-43f9-8199-d37ec0776cd7\") " pod="openshift-monitoring/node-exporter-wt2hd"
Apr 17 14:09:58.077252 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:58.077219 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/92e23e6b-9e35-43f9-8199-d37ec0776cd7-root\") pod \"node-exporter-wt2hd\" (UID: \"92e23e6b-9e35-43f9-8199-d37ec0776cd7\") " pod="openshift-monitoring/node-exporter-wt2hd"
Apr 17 14:09:58.077372 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:58.077257 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/92e23e6b-9e35-43f9-8199-d37ec0776cd7-node-exporter-tls\") pod \"node-exporter-wt2hd\" (UID: \"92e23e6b-9e35-43f9-8199-d37ec0776cd7\") " pod="openshift-monitoring/node-exporter-wt2hd"
Apr 17 14:09:58.077372 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:58.077314 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/92e23e6b-9e35-43f9-8199-d37ec0776cd7-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wt2hd\" (UID: \"92e23e6b-9e35-43f9-8199-d37ec0776cd7\") " pod="openshift-monitoring/node-exporter-wt2hd"
Apr 17 14:09:58.077372 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:58.077325 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/92e23e6b-9e35-43f9-8199-d37ec0776cd7-root\") pod \"node-exporter-wt2hd\" (UID: \"92e23e6b-9e35-43f9-8199-d37ec0776cd7\") " pod="openshift-monitoring/node-exporter-wt2hd"
Apr 17 14:09:58.077372 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:58.077341 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/92e23e6b-9e35-43f9-8199-d37ec0776cd7-node-exporter-accelerators-collector-config\") pod \"node-exporter-wt2hd\" (UID: \"92e23e6b-9e35-43f9-8199-d37ec0776cd7\") " pod="openshift-monitoring/node-exporter-wt2hd"
Apr 17 14:09:58.077600 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:58.077377 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cz48n\" (UniqueName: \"kubernetes.io/projected/92e23e6b-9e35-43f9-8199-d37ec0776cd7-kube-api-access-cz48n\") pod \"node-exporter-wt2hd\" (UID: \"92e23e6b-9e35-43f9-8199-d37ec0776cd7\") " pod="openshift-monitoring/node-exporter-wt2hd"
Apr 17 14:09:58.077600 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:58.077410 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/92e23e6b-9e35-43f9-8199-d37ec0776cd7-node-exporter-wtmp\") pod \"node-exporter-wt2hd\" (UID: \"92e23e6b-9e35-43f9-8199-d37ec0776cd7\") " pod="openshift-monitoring/node-exporter-wt2hd"
Apr 17 14:09:58.077600 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:58.077465 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/92e23e6b-9e35-43f9-8199-d37ec0776cd7-node-exporter-textfile\") pod \"node-exporter-wt2hd\" (UID: \"92e23e6b-9e35-43f9-8199-d37ec0776cd7\") " pod="openshift-monitoring/node-exporter-wt2hd"
Apr 17 14:09:58.077600 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:58.077500 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/92e23e6b-9e35-43f9-8199-d37ec0776cd7-sys\") pod \"node-exporter-wt2hd\" (UID: \"92e23e6b-9e35-43f9-8199-d37ec0776cd7\") " pod="openshift-monitoring/node-exporter-wt2hd"
Apr 17 14:09:58.077600 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:58.077529 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/92e23e6b-9e35-43f9-8199-d37ec0776cd7-metrics-client-ca\") pod \"node-exporter-wt2hd\" (UID: \"92e23e6b-9e35-43f9-8199-d37ec0776cd7\") " pod="openshift-monitoring/node-exporter-wt2hd"
Apr 17 14:09:58.077600 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:58.077549 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/92e23e6b-9e35-43f9-8199-d37ec0776cd7-sys\") pod \"node-exporter-wt2hd\" (UID: \"92e23e6b-9e35-43f9-8199-d37ec0776cd7\") " pod="openshift-monitoring/node-exporter-wt2hd"
Apr 17 14:09:58.077861 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:58.077606 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/92e23e6b-9e35-43f9-8199-d37ec0776cd7-node-exporter-wtmp\") pod \"node-exporter-wt2hd\" (UID: \"92e23e6b-9e35-43f9-8199-d37ec0776cd7\") " pod="openshift-monitoring/node-exporter-wt2hd"
Apr 17 14:09:58.077938 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:58.077918 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/92e23e6b-9e35-43f9-8199-d37ec0776cd7-node-exporter-textfile\") pod \"node-exporter-wt2hd\" (UID: \"92e23e6b-9e35-43f9-8199-d37ec0776cd7\") " pod="openshift-monitoring/node-exporter-wt2hd"
Apr 17 14:09:58.078184 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:58.078167 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/92e23e6b-9e35-43f9-8199-d37ec0776cd7-metrics-client-ca\") pod \"node-exporter-wt2hd\" (UID: \"92e23e6b-9e35-43f9-8199-d37ec0776cd7\") " pod="openshift-monitoring/node-exporter-wt2hd"
Apr 17 14:09:58.078236 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:58.078214 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/92e23e6b-9e35-43f9-8199-d37ec0776cd7-node-exporter-accelerators-collector-config\") pod \"node-exporter-wt2hd\" (UID: \"92e23e6b-9e35-43f9-8199-d37ec0776cd7\") " pod="openshift-monitoring/node-exporter-wt2hd"
Apr 17 14:09:58.079650 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:58.079631 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/92e23e6b-9e35-43f9-8199-d37ec0776cd7-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wt2hd\" (UID: \"92e23e6b-9e35-43f9-8199-d37ec0776cd7\") " pod="openshift-monitoring/node-exporter-wt2hd"
Apr 17 14:09:58.079735 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:58.079706 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/92e23e6b-9e35-43f9-8199-d37ec0776cd7-node-exporter-tls\") pod \"node-exporter-wt2hd\" (UID: \"92e23e6b-9e35-43f9-8199-d37ec0776cd7\") " pod="openshift-monitoring/node-exporter-wt2hd"
Apr 17 14:09:58.084316 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:58.084295 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz48n\" (UniqueName: \"kubernetes.io/projected/92e23e6b-9e35-43f9-8199-d37ec0776cd7-kube-api-access-cz48n\") pod \"node-exporter-wt2hd\" (UID: \"92e23e6b-9e35-43f9-8199-d37ec0776cd7\") " pod="openshift-monitoring/node-exporter-wt2hd"
Apr 17 14:09:58.126541 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:58.126516 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-wt2hd"
Apr 17 14:09:58.134063 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:09:58.134043 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92e23e6b_9e35_43f9_8199_d37ec0776cd7.slice/crio-e03f689a1dc1838457fe8eb80ffc2837aa48a703b986a23674b2cb4eae2c9a1b WatchSource:0}: Error finding container e03f689a1dc1838457fe8eb80ffc2837aa48a703b986a23674b2cb4eae2c9a1b: Status 404 returned error can't find the container with id e03f689a1dc1838457fe8eb80ffc2837aa48a703b986a23674b2cb4eae2c9a1b
Apr 17 14:09:58.184391 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:58.184368 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wt2hd" event={"ID":"92e23e6b-9e35-43f9-8199-d37ec0776cd7","Type":"ContainerStarted","Data":"e03f689a1dc1838457fe8eb80ffc2837aa48a703b986a23674b2cb4eae2c9a1b"}
Apr 17 14:09:58.185242 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:58.185222 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cb9x6" event={"ID":"63e2cd45-279f-4653-a7e2-d58678593e5d","Type":"ContainerStarted","Data":"5cdfd018da8c2a5ff9d642c4b8e4b6702b273c8cd55f8498924ac547489e7ae1"}
Apr 17 14:09:59.189960 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:09:59.189926 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wt2hd" event={"ID":"92e23e6b-9e35-43f9-8199-d37ec0776cd7","Type":"ContainerStarted","Data":"402b5d78cfdaf3c7f1481a50f69f8aca9ebaae0bfb30aff4d72a733bd51ec741"}
Apr 17 14:10:00.194352 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:00.194319 2578 generic.go:358] "Generic (PLEG): container finished" podID="92e23e6b-9e35-43f9-8199-d37ec0776cd7" containerID="402b5d78cfdaf3c7f1481a50f69f8aca9ebaae0bfb30aff4d72a733bd51ec741" exitCode=0
Apr 17 14:10:00.194840 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:00.194355 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wt2hd" event={"ID":"92e23e6b-9e35-43f9-8199-d37ec0776cd7","Type":"ContainerDied","Data":"402b5d78cfdaf3c7f1481a50f69f8aca9ebaae0bfb30aff4d72a733bd51ec741"}
Apr 17 14:10:01.198539 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:01.198498 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wt2hd" event={"ID":"92e23e6b-9e35-43f9-8199-d37ec0776cd7","Type":"ContainerStarted","Data":"be5eabb29a6f1762663fb797c99d8bc689c67deb0d43dac80318a6e7024a49d0"}
Apr 17 14:10:01.198539 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:01.198540 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wt2hd" event={"ID":"92e23e6b-9e35-43f9-8199-d37ec0776cd7","Type":"ContainerStarted","Data":"07be8cf878a5e57fc12daa25e3f1100ae0c763adfdb0334c72af56bba3b3d5b3"}
Apr 17 14:10:01.199738 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:01.199715 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cb9x6" event={"ID":"63e2cd45-279f-4653-a7e2-d58678593e5d","Type":"ContainerStarted","Data":"6f4052eae690e60f446bc083915675914c3c6d9c8e4d3af43add0b44c67a04dc"}
Apr 17 14:10:01.216818 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:01.216774 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-wt2hd" podStartSLOduration=3.324991765 podStartE2EDuration="4.216761639s" podCreationTimestamp="2026-04-17 14:09:57 +0000 UTC" firstStartedPulling="2026-04-17 14:09:58.135501542 +0000 UTC m=+172.033834813" lastFinishedPulling="2026-04-17 14:09:59.027271401 +0000 UTC m=+172.925604687" observedRunningTime="2026-04-17 14:10:01.216370154 +0000 UTC m=+175.114703448" watchObservedRunningTime="2026-04-17 14:10:01.216761639 +0000 UTC m=+175.115094968"
Apr 17 14:10:01.246279 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:01.246237 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-cb9x6" podStartSLOduration=137.675636897 podStartE2EDuration="2m20.246224595s" podCreationTimestamp="2026-04-17 14:07:41 +0000 UTC" firstStartedPulling="2026-04-17 14:09:57.840889781 +0000 UTC m=+171.739223053" lastFinishedPulling="2026-04-17 14:10:00.411477475 +0000 UTC m=+174.309810751" observedRunningTime="2026-04-17 14:10:01.245654844 +0000 UTC m=+175.143988137" watchObservedRunningTime="2026-04-17 14:10:01.246224595 +0000 UTC m=+175.144557889"
Apr 17 14:10:01.709047 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:01.709024 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7nttt"
Apr 17 14:10:04.179026 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:04.178991 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-4cg5l"
Apr 17 14:10:10.251096 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:10.251065 2578 patch_prober.go:28] interesting pod/image-registry-6d59c9f569-ksngn container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 14:10:10.251481 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:10.251119 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" podUID="d1c063ea-7517-4b61-9ad5-045a966d6633" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 14:10:10.376791 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:10.376759 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6d59c9f569-ksngn"]
Apr 17 14:10:10.380690 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:10.380665 2578 patch_prober.go:28] interesting pod/image-registry-6d59c9f569-ksngn container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 14:10:10.380826 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:10.380716 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" podUID="d1c063ea-7517-4b61-9ad5-045a966d6633" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 14:10:20.381784 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:20.381754 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6d59c9f569-ksngn"
Apr 17 14:10:35.395132 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:35.395067 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" podUID="d1c063ea-7517-4b61-9ad5-045a966d6633" containerName="registry" containerID="cri-o://c20a6a40625e69ac87a4dc88c43004f90327649db1fe18c4de515f80a4a618ee" gracePeriod=30
Apr 17 14:10:35.647174 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:35.647117 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6d59c9f569-ksngn"
Apr 17 14:10:35.756044 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:35.756014 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-registry-tls\") pod \"d1c063ea-7517-4b61-9ad5-045a966d6633\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") "
Apr 17 14:10:35.756220 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:35.756052 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d1c063ea-7517-4b61-9ad5-045a966d6633-trusted-ca\") pod \"d1c063ea-7517-4b61-9ad5-045a966d6633\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") "
Apr 17 14:10:35.756220 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:35.756069 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4r9l\" (UniqueName: \"kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-kube-api-access-j4r9l\") pod \"d1c063ea-7517-4b61-9ad5-045a966d6633\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") "
Apr 17 14:10:35.756220 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:35.756100 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d1c063ea-7517-4b61-9ad5-045a966d6633-installation-pull-secrets\") pod \"d1c063ea-7517-4b61-9ad5-045a966d6633\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") "
Apr 17 14:10:35.756220 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:35.756119 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d1c063ea-7517-4b61-9ad5-045a966d6633-image-registry-private-configuration\") pod \"d1c063ea-7517-4b61-9ad5-045a966d6633\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") "
Apr 17 14:10:35.756461 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:35.756335 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d1c063ea-7517-4b61-9ad5-045a966d6633-registry-certificates\") pod \"d1c063ea-7517-4b61-9ad5-045a966d6633\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") "
Apr 17 14:10:35.756461 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:35.756371 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d1c063ea-7517-4b61-9ad5-045a966d6633-ca-trust-extracted\") pod \"d1c063ea-7517-4b61-9ad5-045a966d6633\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") "
Apr 17 14:10:35.756461 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:35.756451 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-bound-sa-token\") pod \"d1c063ea-7517-4b61-9ad5-045a966d6633\" (UID: \"d1c063ea-7517-4b61-9ad5-045a966d6633\") "
Apr 17 14:10:35.756713 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:35.756650 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1c063ea-7517-4b61-9ad5-045a966d6633-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "d1c063ea-7517-4b61-9ad5-045a966d6633" (UID: "d1c063ea-7517-4b61-9ad5-045a966d6633"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 14:10:35.756840 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:35.756740 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1c063ea-7517-4b61-9ad5-045a966d6633-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "d1c063ea-7517-4b61-9ad5-045a966d6633" (UID: "d1c063ea-7517-4b61-9ad5-045a966d6633"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 14:10:35.758585 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:35.758545 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "d1c063ea-7517-4b61-9ad5-045a966d6633" (UID: "d1c063ea-7517-4b61-9ad5-045a966d6633"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 14:10:35.758930 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:35.758900 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1c063ea-7517-4b61-9ad5-045a966d6633-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "d1c063ea-7517-4b61-9ad5-045a966d6633" (UID: "d1c063ea-7517-4b61-9ad5-045a966d6633"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 14:10:35.759023 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:35.758980 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1c063ea-7517-4b61-9ad5-045a966d6633-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "d1c063ea-7517-4b61-9ad5-045a966d6633" (UID: "d1c063ea-7517-4b61-9ad5-045a966d6633"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 14:10:35.759023 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:35.758981 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-kube-api-access-j4r9l" (OuterVolumeSpecName: "kube-api-access-j4r9l") pod "d1c063ea-7517-4b61-9ad5-045a966d6633" (UID: "d1c063ea-7517-4b61-9ad5-045a966d6633"). InnerVolumeSpecName "kube-api-access-j4r9l". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 14:10:35.759136 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:35.759067 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "d1c063ea-7517-4b61-9ad5-045a966d6633" (UID: "d1c063ea-7517-4b61-9ad5-045a966d6633"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 14:10:35.764615 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:35.764589 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1c063ea-7517-4b61-9ad5-045a966d6633-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "d1c063ea-7517-4b61-9ad5-045a966d6633" (UID: "d1c063ea-7517-4b61-9ad5-045a966d6633"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 14:10:35.857269 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:35.857230 2578 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-bound-sa-token\") on node \"ip-10-0-140-205.ec2.internal\" DevicePath \"\""
Apr 17 14:10:35.857269 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:35.857264 2578 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-registry-tls\") on node \"ip-10-0-140-205.ec2.internal\" DevicePath \"\""
Apr 17 14:10:35.857269 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:35.857273 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d1c063ea-7517-4b61-9ad5-045a966d6633-trusted-ca\") on node \"ip-10-0-140-205.ec2.internal\" DevicePath \"\""
Apr 17 14:10:35.857269 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:35.857282 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j4r9l\" (UniqueName: \"kubernetes.io/projected/d1c063ea-7517-4b61-9ad5-045a966d6633-kube-api-access-j4r9l\") on node \"ip-10-0-140-205.ec2.internal\" DevicePath \"\""
Apr 17 14:10:35.857506 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:35.857291 2578 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d1c063ea-7517-4b61-9ad5-045a966d6633-installation-pull-secrets\") on node \"ip-10-0-140-205.ec2.internal\" DevicePath \"\""
Apr 17 14:10:35.857506 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:35.857300 2578 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d1c063ea-7517-4b61-9ad5-045a966d6633-image-registry-private-configuration\") on node \"ip-10-0-140-205.ec2.internal\" DevicePath \"\""
Apr 17 14:10:35.857506 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:35.857310 2578 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d1c063ea-7517-4b61-9ad5-045a966d6633-registry-certificates\") on node \"ip-10-0-140-205.ec2.internal\" DevicePath \"\""
Apr 17 14:10:35.857506 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:35.857319 2578 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d1c063ea-7517-4b61-9ad5-045a966d6633-ca-trust-extracted\") on node \"ip-10-0-140-205.ec2.internal\" DevicePath \"\""
Apr 17 14:10:36.292655 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:36.292618 2578 generic.go:358] "Generic (PLEG): container finished" podID="d1c063ea-7517-4b61-9ad5-045a966d6633" containerID="c20a6a40625e69ac87a4dc88c43004f90327649db1fe18c4de515f80a4a618ee" exitCode=0
Apr 17 14:10:36.292818 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:36.292684 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" event={"ID":"d1c063ea-7517-4b61-9ad5-045a966d6633","Type":"ContainerDied","Data":"c20a6a40625e69ac87a4dc88c43004f90327649db1fe18c4de515f80a4a618ee"}
Apr 17 14:10:36.292818 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:36.292710 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6d59c9f569-ksngn"
Apr 17 14:10:36.292818 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:36.292717 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d59c9f569-ksngn" event={"ID":"d1c063ea-7517-4b61-9ad5-045a966d6633","Type":"ContainerDied","Data":"4963ba0630d1f45daa18c940693bb4a217ca9c03ab15c1e770ec6434d02d3715"}
Apr 17 14:10:36.292818 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:36.292737 2578 scope.go:117] "RemoveContainer" containerID="c20a6a40625e69ac87a4dc88c43004f90327649db1fe18c4de515f80a4a618ee"
Apr 17 14:10:36.300806 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:36.300788 2578 scope.go:117] "RemoveContainer" containerID="c20a6a40625e69ac87a4dc88c43004f90327649db1fe18c4de515f80a4a618ee"
Apr 17 14:10:36.301019 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:10:36.301001 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c20a6a40625e69ac87a4dc88c43004f90327649db1fe18c4de515f80a4a618ee\": container with ID starting with c20a6a40625e69ac87a4dc88c43004f90327649db1fe18c4de515f80a4a618ee not found: ID does not exist" containerID="c20a6a40625e69ac87a4dc88c43004f90327649db1fe18c4de515f80a4a618ee"
Apr 17 14:10:36.301065 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:36.301027 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c20a6a40625e69ac87a4dc88c43004f90327649db1fe18c4de515f80a4a618ee"} err="failed to get container status \"c20a6a40625e69ac87a4dc88c43004f90327649db1fe18c4de515f80a4a618ee\": rpc error: code = NotFound desc = could not find container \"c20a6a40625e69ac87a4dc88c43004f90327649db1fe18c4de515f80a4a618ee\": container with ID starting with c20a6a40625e69ac87a4dc88c43004f90327649db1fe18c4de515f80a4a618ee not found: ID does not exist"
Apr 17 14:10:36.311465 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:36.311446 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6d59c9f569-ksngn"]
Apr 17 14:10:36.314370 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:36.314352 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6d59c9f569-ksngn"]
Apr 17 14:10:36.711569 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:36.711545 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1c063ea-7517-4b61-9ad5-045a966d6633" path="/var/lib/kubelet/pods/d1c063ea-7517-4b61-9ad5-045a966d6633/volumes"
Apr 17 14:10:42.338025 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:42.337989 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86cd74555b-z4hvk" podUID="7a43d970-3485-47bf-8573-7539f592977a" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 17 14:10:42.501717 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:42.501684 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-cb9x6_63e2cd45-279f-4653-a7e2-d58678593e5d/serve-healthcheck-canary/0.log"
Apr 17 14:10:52.338223 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:10:52.338183 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86cd74555b-z4hvk" podUID="7a43d970-3485-47bf-8573-7539f592977a" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 17 14:11:02.337566 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:11:02.337526 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86cd74555b-z4hvk" podUID="7a43d970-3485-47bf-8573-7539f592977a" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 17 14:11:02.337936 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:11:02.337594 2578 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86cd74555b-z4hvk"
Apr 17 14:11:02.338032 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:11:02.338015 2578 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"204fff99ba42f59061610aaa6ffaf34176bd8df1ea165550a2a84803ce631241"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86cd74555b-z4hvk" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 17 14:11:02.338068 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:11:02.338051 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86cd74555b-z4hvk" podUID="7a43d970-3485-47bf-8573-7539f592977a" containerName="service-proxy" containerID="cri-o://204fff99ba42f59061610aaa6ffaf34176bd8df1ea165550a2a84803ce631241" gracePeriod=30
Apr 17 14:11:03.359485 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:11:03.359446 2578 generic.go:358] "Generic (PLEG): container finished" podID="7a43d970-3485-47bf-8573-7539f592977a" containerID="204fff99ba42f59061610aaa6ffaf34176bd8df1ea165550a2a84803ce631241" exitCode=2
Apr 17 14:11:03.359845 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:11:03.359454 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86cd74555b-z4hvk" event={"ID":"7a43d970-3485-47bf-8573-7539f592977a","Type":"ContainerDied","Data":"204fff99ba42f59061610aaa6ffaf34176bd8df1ea165550a2a84803ce631241"}
Apr 17 14:11:03.359845 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:11:03.359537 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86cd74555b-z4hvk"
event={"ID":"7a43d970-3485-47bf-8573-7539f592977a","Type":"ContainerStarted","Data":"71ab7a3dd64d7929dee2e01268da80e5b2f72f3d2fda828ce48312f2d66216b3"} Apr 17 14:11:18.555380 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:11:18.555339 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e32c2b-962f-40a8-9171-5be908eeee49-metrics-certs\") pod \"network-metrics-daemon-7nttt\" (UID: \"86e32c2b-962f-40a8-9171-5be908eeee49\") " pod="openshift-multus/network-metrics-daemon-7nttt" Apr 17 14:11:18.557570 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:11:18.557550 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86e32c2b-962f-40a8-9171-5be908eeee49-metrics-certs\") pod \"network-metrics-daemon-7nttt\" (UID: \"86e32c2b-962f-40a8-9171-5be908eeee49\") " pod="openshift-multus/network-metrics-daemon-7nttt" Apr 17 14:11:18.812840 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:11:18.812760 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bfspf\"" Apr 17 14:11:18.820767 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:11:18.820740 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7nttt" Apr 17 14:11:18.930461 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:11:18.930421 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7nttt"] Apr 17 14:11:18.932638 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:11:18.932610 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86e32c2b_962f_40a8_9171_5be908eeee49.slice/crio-de2a18943d56ddff244a5eb8fcb5c81eb617fe96036f3d84cedfe8d7c41b5202 WatchSource:0}: Error finding container de2a18943d56ddff244a5eb8fcb5c81eb617fe96036f3d84cedfe8d7c41b5202: Status 404 returned error can't find the container with id de2a18943d56ddff244a5eb8fcb5c81eb617fe96036f3d84cedfe8d7c41b5202 Apr 17 14:11:19.396402 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:11:19.396368 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7nttt" event={"ID":"86e32c2b-962f-40a8-9171-5be908eeee49","Type":"ContainerStarted","Data":"de2a18943d56ddff244a5eb8fcb5c81eb617fe96036f3d84cedfe8d7c41b5202"} Apr 17 14:11:20.403343 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:11:20.403308 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7nttt" event={"ID":"86e32c2b-962f-40a8-9171-5be908eeee49","Type":"ContainerStarted","Data":"4aff9930f06d0a030774d5ff73a05b7240241c8844420103e6feecf7cf7980eb"} Apr 17 14:11:21.407509 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:11:21.407471 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7nttt" event={"ID":"86e32c2b-962f-40a8-9171-5be908eeee49","Type":"ContainerStarted","Data":"b091269c06c79b672f6f8d84b9828f1987e5baad274372b31271167a74fe252a"} Apr 17 14:11:21.423800 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:11:21.423562 2578 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-7nttt" podStartSLOduration=254.093182154 podStartE2EDuration="4m15.423545046s" podCreationTimestamp="2026-04-17 14:07:06 +0000 UTC" firstStartedPulling="2026-04-17 14:11:18.93465813 +0000 UTC m=+252.832991416" lastFinishedPulling="2026-04-17 14:11:20.265021037 +0000 UTC m=+254.163354308" observedRunningTime="2026-04-17 14:11:21.421800839 +0000 UTC m=+255.320134144" watchObservedRunningTime="2026-04-17 14:11:21.423545046 +0000 UTC m=+255.321878340" Apr 17 14:12:06.586722 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:12:06.586697 2578 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 14:13:06.037165 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:13:06.037074 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-nb7r9"] Apr 17 14:13:06.037588 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:13:06.037300 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1c063ea-7517-4b61-9ad5-045a966d6633" containerName="registry" Apr 17 14:13:06.037588 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:13:06.037311 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c063ea-7517-4b61-9ad5-045a966d6633" containerName="registry" Apr 17 14:13:06.037588 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:13:06.037348 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1c063ea-7517-4b61-9ad5-045a966d6633" containerName="registry" Apr 17 14:13:06.040091 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:13:06.040075 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-nb7r9" Apr 17 14:13:06.042504 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:13:06.042481 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 17 14:13:06.042621 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:13:06.042556 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-kgxw2\"" Apr 17 14:13:06.043551 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:13:06.043536 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 17 14:13:06.049005 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:13:06.048981 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-nb7r9"] Apr 17 14:13:06.147081 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:13:06.147047 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ef3f7d47-ffb2-4cf1-a74b-f936e2f76006-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-nb7r9\" (UID: \"ef3f7d47-ffb2-4cf1-a74b-f936e2f76006\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-nb7r9" Apr 17 14:13:06.147265 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:13:06.147107 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hm2l\" (UniqueName: \"kubernetes.io/projected/ef3f7d47-ffb2-4cf1-a74b-f936e2f76006-kube-api-access-7hm2l\") pod \"cert-manager-cainjector-8966b78d4-nb7r9\" (UID: \"ef3f7d47-ffb2-4cf1-a74b-f936e2f76006\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-nb7r9" Apr 17 14:13:06.247441 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:13:06.247402 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7hm2l\" (UniqueName: \"kubernetes.io/projected/ef3f7d47-ffb2-4cf1-a74b-f936e2f76006-kube-api-access-7hm2l\") pod \"cert-manager-cainjector-8966b78d4-nb7r9\" (UID: \"ef3f7d47-ffb2-4cf1-a74b-f936e2f76006\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-nb7r9" Apr 17 14:13:06.247592 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:13:06.247456 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ef3f7d47-ffb2-4cf1-a74b-f936e2f76006-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-nb7r9\" (UID: \"ef3f7d47-ffb2-4cf1-a74b-f936e2f76006\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-nb7r9" Apr 17 14:13:06.256348 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:13:06.256318 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ef3f7d47-ffb2-4cf1-a74b-f936e2f76006-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-nb7r9\" (UID: \"ef3f7d47-ffb2-4cf1-a74b-f936e2f76006\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-nb7r9" Apr 17 14:13:06.256709 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:13:06.256689 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hm2l\" (UniqueName: \"kubernetes.io/projected/ef3f7d47-ffb2-4cf1-a74b-f936e2f76006-kube-api-access-7hm2l\") pod \"cert-manager-cainjector-8966b78d4-nb7r9\" (UID: \"ef3f7d47-ffb2-4cf1-a74b-f936e2f76006\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-nb7r9" Apr 17 14:13:06.348818 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:13:06.348738 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-nb7r9" Apr 17 14:13:06.459313 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:13:06.459279 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-nb7r9"] Apr 17 14:13:06.462646 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:13:06.462620 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef3f7d47_ffb2_4cf1_a74b_f936e2f76006.slice/crio-b35607daae61caa5b2a7691be865571e166d136af5abe991f1ea9a65b02363f5 WatchSource:0}: Error finding container b35607daae61caa5b2a7691be865571e166d136af5abe991f1ea9a65b02363f5: Status 404 returned error can't find the container with id b35607daae61caa5b2a7691be865571e166d136af5abe991f1ea9a65b02363f5 Apr 17 14:13:06.464416 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:13:06.464397 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 14:13:06.662124 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:13:06.662044 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-nb7r9" event={"ID":"ef3f7d47-ffb2-4cf1-a74b-f936e2f76006","Type":"ContainerStarted","Data":"b35607daae61caa5b2a7691be865571e166d136af5abe991f1ea9a65b02363f5"} Apr 17 14:13:10.673185 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:13:10.673148 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-nb7r9" event={"ID":"ef3f7d47-ffb2-4cf1-a74b-f936e2f76006","Type":"ContainerStarted","Data":"5e99e29cc9ad9854b64a5ef391fb94c43d940aca9a568d87b22f431d4f73ee89"} Apr 17 14:13:10.688525 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:13:10.688480 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-nb7r9" podStartSLOduration=1.427279303 podStartE2EDuration="4.688466901s" 
podCreationTimestamp="2026-04-17 14:13:06 +0000 UTC" firstStartedPulling="2026-04-17 14:13:06.464540594 +0000 UTC m=+360.362873867" lastFinishedPulling="2026-04-17 14:13:09.725728188 +0000 UTC m=+363.624061465" observedRunningTime="2026-04-17 14:13:10.687770461 +0000 UTC m=+364.586103756" watchObservedRunningTime="2026-04-17 14:13:10.688466901 +0000 UTC m=+364.586800195" Apr 17 14:14:59.841065 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:14:59.841032 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-6mnbs"] Apr 17 14:14:59.843788 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:14:59.843771 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-6mnbs" Apr 17 14:14:59.846234 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:14:59.846211 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 17 14:14:59.847296 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:14:59.847268 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-webhook-cert\"" Apr 17 14:14:59.847296 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:14:59.847281 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kubeflow-trainer-config\"" Apr 17 14:14:59.847468 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:14:59.847296 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 17 14:14:59.847468 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:14:59.847276 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-controller-manager-dockercfg-zb7zf\"" Apr 17 14:14:59.852029 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:14:59.852008 2578 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-6mnbs"] Apr 17 14:15:00.003555 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:15:00.003519 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkb2f\" (UniqueName: \"kubernetes.io/projected/6e302602-de86-4694-8c73-db2cb5233aee-kube-api-access-xkb2f\") pod \"kubeflow-trainer-controller-manager-55f5694779-6mnbs\" (UID: \"6e302602-de86-4694-8c73-db2cb5233aee\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-6mnbs" Apr 17 14:15:00.003715 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:15:00.003566 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e302602-de86-4694-8c73-db2cb5233aee-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-6mnbs\" (UID: \"6e302602-de86-4694-8c73-db2cb5233aee\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-6mnbs" Apr 17 14:15:00.003715 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:15:00.003648 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/6e302602-de86-4694-8c73-db2cb5233aee-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-6mnbs\" (UID: \"6e302602-de86-4694-8c73-db2cb5233aee\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-6mnbs" Apr 17 14:15:00.104744 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:15:00.104651 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/6e302602-de86-4694-8c73-db2cb5233aee-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-6mnbs\" (UID: \"6e302602-de86-4694-8c73-db2cb5233aee\") " 
pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-6mnbs" Apr 17 14:15:00.104744 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:15:00.104736 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xkb2f\" (UniqueName: \"kubernetes.io/projected/6e302602-de86-4694-8c73-db2cb5233aee-kube-api-access-xkb2f\") pod \"kubeflow-trainer-controller-manager-55f5694779-6mnbs\" (UID: \"6e302602-de86-4694-8c73-db2cb5233aee\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-6mnbs" Apr 17 14:15:00.104928 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:15:00.104770 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e302602-de86-4694-8c73-db2cb5233aee-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-6mnbs\" (UID: \"6e302602-de86-4694-8c73-db2cb5233aee\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-6mnbs" Apr 17 14:15:00.105371 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:15:00.105338 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/6e302602-de86-4694-8c73-db2cb5233aee-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-6mnbs\" (UID: \"6e302602-de86-4694-8c73-db2cb5233aee\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-6mnbs" Apr 17 14:15:00.107099 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:15:00.107064 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e302602-de86-4694-8c73-db2cb5233aee-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-6mnbs\" (UID: \"6e302602-de86-4694-8c73-db2cb5233aee\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-6mnbs" Apr 17 14:15:00.112263 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:15:00.112239 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xkb2f\" (UniqueName: \"kubernetes.io/projected/6e302602-de86-4694-8c73-db2cb5233aee-kube-api-access-xkb2f\") pod \"kubeflow-trainer-controller-manager-55f5694779-6mnbs\" (UID: \"6e302602-de86-4694-8c73-db2cb5233aee\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-6mnbs" Apr 17 14:15:00.153046 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:15:00.153022 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-6mnbs" Apr 17 14:15:00.264707 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:15:00.264682 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-6mnbs"] Apr 17 14:15:00.266901 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:15:00.266874 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e302602_de86_4694_8c73_db2cb5233aee.slice/crio-46c53c3db7cd48c3877e7a6e14cfcd14c8652fc47367a47bec7c10f8cb3298d6 WatchSource:0}: Error finding container 46c53c3db7cd48c3877e7a6e14cfcd14c8652fc47367a47bec7c10f8cb3298d6: Status 404 returned error can't find the container with id 46c53c3db7cd48c3877e7a6e14cfcd14c8652fc47367a47bec7c10f8cb3298d6 Apr 17 14:15:00.947102 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:15:00.947054 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-6mnbs" event={"ID":"6e302602-de86-4694-8c73-db2cb5233aee","Type":"ContainerStarted","Data":"46c53c3db7cd48c3877e7a6e14cfcd14c8652fc47367a47bec7c10f8cb3298d6"} Apr 17 14:15:02.954409 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:15:02.954371 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-6mnbs" 
event={"ID":"6e302602-de86-4694-8c73-db2cb5233aee","Type":"ContainerStarted","Data":"3d685412a053a628dadc9361cbaf0afdfb7d1acbb4da7f461a4a8fe742da7795"} Apr 17 14:15:02.954873 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:15:02.954499 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-6mnbs" Apr 17 14:15:02.970710 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:15:02.970663 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-6mnbs" podStartSLOduration=1.7407409980000002 podStartE2EDuration="3.970649886s" podCreationTimestamp="2026-04-17 14:14:59 +0000 UTC" firstStartedPulling="2026-04-17 14:15:00.268677741 +0000 UTC m=+474.167011012" lastFinishedPulling="2026-04-17 14:15:02.498586619 +0000 UTC m=+476.396919900" observedRunningTime="2026-04-17 14:15:02.970161324 +0000 UTC m=+476.868494618" watchObservedRunningTime="2026-04-17 14:15:02.970649886 +0000 UTC m=+476.868983180" Apr 17 14:15:18.962721 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:15:18.962693 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-6mnbs" Apr 17 14:55:16.942329 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:55:16.942260 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-twf9t/must-gather-trkgq"] Apr 17 14:55:16.945398 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:55:16.945385 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-twf9t/must-gather-trkgq" Apr 17 14:55:16.947735 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:55:16.947711 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-twf9t\"/\"openshift-service-ca.crt\"" Apr 17 14:55:16.947875 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:55:16.947757 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-twf9t\"/\"default-dockercfg-kppdn\"" Apr 17 14:55:16.947875 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:55:16.947720 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-twf9t\"/\"kube-root-ca.crt\"" Apr 17 14:55:16.957629 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:55:16.957606 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-twf9t/must-gather-trkgq"] Apr 17 14:55:17.030469 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:55:17.030444 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmt7b\" (UniqueName: \"kubernetes.io/projected/61aedd6d-110f-4eae-aba4-1e5c2fbdd979-kube-api-access-tmt7b\") pod \"must-gather-trkgq\" (UID: \"61aedd6d-110f-4eae-aba4-1e5c2fbdd979\") " pod="openshift-must-gather-twf9t/must-gather-trkgq" Apr 17 14:55:17.030589 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:55:17.030476 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/61aedd6d-110f-4eae-aba4-1e5c2fbdd979-must-gather-output\") pod \"must-gather-trkgq\" (UID: \"61aedd6d-110f-4eae-aba4-1e5c2fbdd979\") " pod="openshift-must-gather-twf9t/must-gather-trkgq" Apr 17 14:55:17.131382 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:55:17.131334 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tmt7b\" (UniqueName: 
\"kubernetes.io/projected/61aedd6d-110f-4eae-aba4-1e5c2fbdd979-kube-api-access-tmt7b\") pod \"must-gather-trkgq\" (UID: \"61aedd6d-110f-4eae-aba4-1e5c2fbdd979\") " pod="openshift-must-gather-twf9t/must-gather-trkgq" Apr 17 14:55:17.131382 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:55:17.131388 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/61aedd6d-110f-4eae-aba4-1e5c2fbdd979-must-gather-output\") pod \"must-gather-trkgq\" (UID: \"61aedd6d-110f-4eae-aba4-1e5c2fbdd979\") " pod="openshift-must-gather-twf9t/must-gather-trkgq" Apr 17 14:55:17.131765 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:55:17.131746 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/61aedd6d-110f-4eae-aba4-1e5c2fbdd979-must-gather-output\") pod \"must-gather-trkgq\" (UID: \"61aedd6d-110f-4eae-aba4-1e5c2fbdd979\") " pod="openshift-must-gather-twf9t/must-gather-trkgq" Apr 17 14:55:17.140082 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:55:17.140053 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmt7b\" (UniqueName: \"kubernetes.io/projected/61aedd6d-110f-4eae-aba4-1e5c2fbdd979-kube-api-access-tmt7b\") pod \"must-gather-trkgq\" (UID: \"61aedd6d-110f-4eae-aba4-1e5c2fbdd979\") " pod="openshift-must-gather-twf9t/must-gather-trkgq" Apr 17 14:55:17.254573 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:55:17.254553 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-twf9t/must-gather-trkgq" Apr 17 14:55:17.368112 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:55:17.368080 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-twf9t/must-gather-trkgq"] Apr 17 14:55:17.370915 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:55:17.370889 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61aedd6d_110f_4eae_aba4_1e5c2fbdd979.slice/crio-3fef61557901a0ffb5055e622c9aee03ab65adfbcdafb48434b7eb426dbc5adb WatchSource:0}: Error finding container 3fef61557901a0ffb5055e622c9aee03ab65adfbcdafb48434b7eb426dbc5adb: Status 404 returned error can't find the container with id 3fef61557901a0ffb5055e622c9aee03ab65adfbcdafb48434b7eb426dbc5adb Apr 17 14:55:17.372586 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:55:17.372569 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 14:55:18.068008 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:55:18.067968 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-twf9t/must-gather-trkgq" event={"ID":"61aedd6d-110f-4eae-aba4-1e5c2fbdd979","Type":"ContainerStarted","Data":"3fef61557901a0ffb5055e622c9aee03ab65adfbcdafb48434b7eb426dbc5adb"} Apr 17 14:55:22.082101 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:55:22.082057 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-twf9t/must-gather-trkgq" event={"ID":"61aedd6d-110f-4eae-aba4-1e5c2fbdd979","Type":"ContainerStarted","Data":"f9b27a113eb76e5718a00d90488d46293563a4a08d2f7e7ae55312984ebfadb4"} Apr 17 14:55:23.086005 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:55:23.085967 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-twf9t/must-gather-trkgq" 
event={"ID":"61aedd6d-110f-4eae-aba4-1e5c2fbdd979","Type":"ContainerStarted","Data":"a26294a1f89b8ab0daa9baf604708debfea912e75f22730f92378a02ae353f2b"}
Apr 17 14:55:23.101388 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:55:23.100893 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-twf9t/must-gather-trkgq" podStartSLOduration=2.561370344 podStartE2EDuration="7.100872992s" podCreationTimestamp="2026-04-17 14:55:16 +0000 UTC" firstStartedPulling="2026-04-17 14:55:17.372723408 +0000 UTC m=+2891.271056679" lastFinishedPulling="2026-04-17 14:55:21.912226051 +0000 UTC m=+2895.810559327" observedRunningTime="2026-04-17 14:55:23.099709 +0000 UTC m=+2896.998042296" watchObservedRunningTime="2026-04-17 14:55:23.100872992 +0000 UTC m=+2896.999206286"
Apr 17 14:55:31.425913 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:55:31.425879 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-6mnbs_6e302602-de86-4694-8c73-db2cb5233aee/manager/0.log"
Apr 17 14:55:31.847168 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:55:31.847133 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-6mnbs_6e302602-de86-4694-8c73-db2cb5233aee/manager/0.log"
Apr 17 14:55:32.267518 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:55:32.267484 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-6mnbs_6e302602-de86-4694-8c73-db2cb5233aee/manager/0.log"
Apr 17 14:56:06.207088 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:56:06.207053 2578 generic.go:358] "Generic (PLEG): container finished" podID="61aedd6d-110f-4eae-aba4-1e5c2fbdd979" containerID="f9b27a113eb76e5718a00d90488d46293563a4a08d2f7e7ae55312984ebfadb4" exitCode=0
Apr 17 14:56:06.207088 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:56:06.207090 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-twf9t/must-gather-trkgq" event={"ID":"61aedd6d-110f-4eae-aba4-1e5c2fbdd979","Type":"ContainerDied","Data":"f9b27a113eb76e5718a00d90488d46293563a4a08d2f7e7ae55312984ebfadb4"}
Apr 17 14:56:06.207562 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:56:06.207391 2578 scope.go:117] "RemoveContainer" containerID="f9b27a113eb76e5718a00d90488d46293563a4a08d2f7e7ae55312984ebfadb4"
Apr 17 14:56:06.634654 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:56:06.634557 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-twf9t_must-gather-trkgq_61aedd6d-110f-4eae-aba4-1e5c2fbdd979/gather/0.log"
Apr 17 14:56:11.967017 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:56:11.966981 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-twf9t/must-gather-trkgq"]
Apr 17 14:56:11.967675 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:56:11.967238 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-twf9t/must-gather-trkgq" podUID="61aedd6d-110f-4eae-aba4-1e5c2fbdd979" containerName="copy" containerID="cri-o://a26294a1f89b8ab0daa9baf604708debfea912e75f22730f92378a02ae353f2b" gracePeriod=2
Apr 17 14:56:11.968555 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:56:11.968530 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-twf9t/must-gather-trkgq"]
Apr 17 14:56:11.969290 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:56:11.969264 2578 status_manager.go:895] "Failed to get status for pod" podUID="61aedd6d-110f-4eae-aba4-1e5c2fbdd979" pod="openshift-must-gather-twf9t/must-gather-trkgq" err="pods \"must-gather-trkgq\" is forbidden: User \"system:node:ip-10-0-140-205.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-twf9t\": no relationship found between node 'ip-10-0-140-205.ec2.internal' and this object"
Apr 17 14:56:12.191343 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:56:12.191321 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-twf9t_must-gather-trkgq_61aedd6d-110f-4eae-aba4-1e5c2fbdd979/copy/0.log"
Apr 17 14:56:12.191678 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:56:12.191664 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-twf9t/must-gather-trkgq"
Apr 17 14:56:12.193691 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:56:12.193672 2578 status_manager.go:895] "Failed to get status for pod" podUID="61aedd6d-110f-4eae-aba4-1e5c2fbdd979" pod="openshift-must-gather-twf9t/must-gather-trkgq" err="pods \"must-gather-trkgq\" is forbidden: User \"system:node:ip-10-0-140-205.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-twf9t\": no relationship found between node 'ip-10-0-140-205.ec2.internal' and this object"
Apr 17 14:56:12.224135 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:56:12.224061 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-twf9t_must-gather-trkgq_61aedd6d-110f-4eae-aba4-1e5c2fbdd979/copy/0.log"
Apr 17 14:56:12.224360 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:56:12.224336 2578 generic.go:358] "Generic (PLEG): container finished" podID="61aedd6d-110f-4eae-aba4-1e5c2fbdd979" containerID="a26294a1f89b8ab0daa9baf604708debfea912e75f22730f92378a02ae353f2b" exitCode=143
Apr 17 14:56:12.224448 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:56:12.224393 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-twf9t/must-gather-trkgq"
Apr 17 14:56:12.224518 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:56:12.224456 2578 scope.go:117] "RemoveContainer" containerID="a26294a1f89b8ab0daa9baf604708debfea912e75f22730f92378a02ae353f2b"
Apr 17 14:56:12.226599 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:56:12.226573 2578 status_manager.go:895] "Failed to get status for pod" podUID="61aedd6d-110f-4eae-aba4-1e5c2fbdd979" pod="openshift-must-gather-twf9t/must-gather-trkgq" err="pods \"must-gather-trkgq\" is forbidden: User \"system:node:ip-10-0-140-205.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-twf9t\": no relationship found between node 'ip-10-0-140-205.ec2.internal' and this object"
Apr 17 14:56:12.231134 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:56:12.231118 2578 scope.go:117] "RemoveContainer" containerID="f9b27a113eb76e5718a00d90488d46293563a4a08d2f7e7ae55312984ebfadb4"
Apr 17 14:56:12.242165 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:56:12.242138 2578 scope.go:117] "RemoveContainer" containerID="a26294a1f89b8ab0daa9baf604708debfea912e75f22730f92378a02ae353f2b"
Apr 17 14:56:12.242393 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:56:12.242373 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a26294a1f89b8ab0daa9baf604708debfea912e75f22730f92378a02ae353f2b\": container with ID starting with a26294a1f89b8ab0daa9baf604708debfea912e75f22730f92378a02ae353f2b not found: ID does not exist" containerID="a26294a1f89b8ab0daa9baf604708debfea912e75f22730f92378a02ae353f2b"
Apr 17 14:56:12.242467 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:56:12.242402 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a26294a1f89b8ab0daa9baf604708debfea912e75f22730f92378a02ae353f2b"} err="failed to get container status \"a26294a1f89b8ab0daa9baf604708debfea912e75f22730f92378a02ae353f2b\": rpc error: code = NotFound desc = could not find container \"a26294a1f89b8ab0daa9baf604708debfea912e75f22730f92378a02ae353f2b\": container with ID starting with a26294a1f89b8ab0daa9baf604708debfea912e75f22730f92378a02ae353f2b not found: ID does not exist"
Apr 17 14:56:12.242467 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:56:12.242421 2578 scope.go:117] "RemoveContainer" containerID="f9b27a113eb76e5718a00d90488d46293563a4a08d2f7e7ae55312984ebfadb4"
Apr 17 14:56:12.242677 ip-10-0-140-205 kubenswrapper[2578]: E0417 14:56:12.242661 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9b27a113eb76e5718a00d90488d46293563a4a08d2f7e7ae55312984ebfadb4\": container with ID starting with f9b27a113eb76e5718a00d90488d46293563a4a08d2f7e7ae55312984ebfadb4 not found: ID does not exist" containerID="f9b27a113eb76e5718a00d90488d46293563a4a08d2f7e7ae55312984ebfadb4"
Apr 17 14:56:12.242717 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:56:12.242682 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9b27a113eb76e5718a00d90488d46293563a4a08d2f7e7ae55312984ebfadb4"} err="failed to get container status \"f9b27a113eb76e5718a00d90488d46293563a4a08d2f7e7ae55312984ebfadb4\": rpc error: code = NotFound desc = could not find container \"f9b27a113eb76e5718a00d90488d46293563a4a08d2f7e7ae55312984ebfadb4\": container with ID starting with f9b27a113eb76e5718a00d90488d46293563a4a08d2f7e7ae55312984ebfadb4 not found: ID does not exist"
Apr 17 14:56:12.249066 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:56:12.249050 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/61aedd6d-110f-4eae-aba4-1e5c2fbdd979-must-gather-output\") pod \"61aedd6d-110f-4eae-aba4-1e5c2fbdd979\" (UID: \"61aedd6d-110f-4eae-aba4-1e5c2fbdd979\") "
Apr 17 14:56:12.249128 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:56:12.249080 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmt7b\" (UniqueName: \"kubernetes.io/projected/61aedd6d-110f-4eae-aba4-1e5c2fbdd979-kube-api-access-tmt7b\") pod \"61aedd6d-110f-4eae-aba4-1e5c2fbdd979\" (UID: \"61aedd6d-110f-4eae-aba4-1e5c2fbdd979\") "
Apr 17 14:56:12.251206 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:56:12.251177 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61aedd6d-110f-4eae-aba4-1e5c2fbdd979-kube-api-access-tmt7b" (OuterVolumeSpecName: "kube-api-access-tmt7b") pod "61aedd6d-110f-4eae-aba4-1e5c2fbdd979" (UID: "61aedd6d-110f-4eae-aba4-1e5c2fbdd979"). InnerVolumeSpecName "kube-api-access-tmt7b". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 14:56:12.251326 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:56:12.251307 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61aedd6d-110f-4eae-aba4-1e5c2fbdd979-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "61aedd6d-110f-4eae-aba4-1e5c2fbdd979" (UID: "61aedd6d-110f-4eae-aba4-1e5c2fbdd979"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 14:56:12.349693 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:56:12.349661 2578 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/61aedd6d-110f-4eae-aba4-1e5c2fbdd979-must-gather-output\") on node \"ip-10-0-140-205.ec2.internal\" DevicePath \"\""
Apr 17 14:56:12.349693 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:56:12.349686 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tmt7b\" (UniqueName: \"kubernetes.io/projected/61aedd6d-110f-4eae-aba4-1e5c2fbdd979-kube-api-access-tmt7b\") on node \"ip-10-0-140-205.ec2.internal\" DevicePath \"\""
Apr 17 14:56:12.534268 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:56:12.534240 2578 status_manager.go:895] "Failed to get status for pod" podUID="61aedd6d-110f-4eae-aba4-1e5c2fbdd979" pod="openshift-must-gather-twf9t/must-gather-trkgq" err="pods \"must-gather-trkgq\" is forbidden: User \"system:node:ip-10-0-140-205.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-twf9t\": no relationship found between node 'ip-10-0-140-205.ec2.internal' and this object"
Apr 17 14:56:12.712706 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:56:12.712677 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61aedd6d-110f-4eae-aba4-1e5c2fbdd979" path="/var/lib/kubelet/pods/61aedd6d-110f-4eae-aba4-1e5c2fbdd979/volumes"
Apr 17 14:56:59.133284 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:56:59.133253 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-bdklc_561daaa7-247f-4820-939b-375dd242bbf3/global-pull-secret-syncer/0.log"
Apr 17 14:56:59.277550 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:56:59.277518 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-4xvtj_76ab9c94-30da-41a0-954f-373b613d462f/konnectivity-agent/0.log"
Apr 17 14:56:59.409584 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:56:59.409515 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-140-205.ec2.internal_af0e8e30f34475421e9e054e83fe8eba/haproxy/0.log"
Apr 17 14:57:02.988295 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:02.988267 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wt2hd_92e23e6b-9e35-43f9-8199-d37ec0776cd7/node-exporter/0.log"
Apr 17 14:57:03.017940 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:03.017900 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wt2hd_92e23e6b-9e35-43f9-8199-d37ec0776cd7/kube-rbac-proxy/0.log"
Apr 17 14:57:03.038193 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:03.038169 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wt2hd_92e23e6b-9e35-43f9-8199-d37ec0776cd7/init-textfile/0.log"
Apr 17 14:57:04.648640 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:04.648613 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-tcvd8_deebcc5e-3050-4882-a4ed-99aab7e0f695/networking-console-plugin/0.log"
Apr 17 14:57:06.576413 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:06.576373 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4cg5l_453f3789-5df6-45c5-bf09-cad968b4e751/dns/0.log"
Apr 17 14:57:06.594976 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:06.594951 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4cg5l_453f3789-5df6-45c5-bf09-cad968b4e751/kube-rbac-proxy/0.log"
Apr 17 14:57:06.716648 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:06.716627 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-vnnct_8411d38d-222c-4fc3-9890-84fb21b47535/dns-node-resolver/0.log"
Apr 17 14:57:07.142542 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:07.142505 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4wkj7_fb36c782-6ab6-4f4d-91bc-6792c5debe41/node-ca/0.log"
Apr 17 14:57:07.254943 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:07.254914 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wh6bh/perf-node-gather-daemonset-f4x44"]
Apr 17 14:57:07.255222 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:07.255207 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="61aedd6d-110f-4eae-aba4-1e5c2fbdd979" containerName="gather"
Apr 17 14:57:07.255272 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:07.255226 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="61aedd6d-110f-4eae-aba4-1e5c2fbdd979" containerName="gather"
Apr 17 14:57:07.255272 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:07.255242 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="61aedd6d-110f-4eae-aba4-1e5c2fbdd979" containerName="copy"
Apr 17 14:57:07.255272 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:07.255250 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="61aedd6d-110f-4eae-aba4-1e5c2fbdd979" containerName="copy"
Apr 17 14:57:07.255370 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:07.255301 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="61aedd6d-110f-4eae-aba4-1e5c2fbdd979" containerName="gather"
Apr 17 14:57:07.255370 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:07.255312 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="61aedd6d-110f-4eae-aba4-1e5c2fbdd979" containerName="copy"
Apr 17 14:57:07.259325 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:07.259309 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-f4x44"
Apr 17 14:57:07.261654 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:07.261630 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wh6bh\"/\"openshift-service-ca.crt\""
Apr 17 14:57:07.261766 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:07.261684 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wh6bh\"/\"kube-root-ca.crt\""
Apr 17 14:57:07.262725 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:07.262708 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-wh6bh\"/\"default-dockercfg-g56sz\""
Apr 17 14:57:07.266175 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:07.266146 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wh6bh/perf-node-gather-daemonset-f4x44"]
Apr 17 14:57:07.312050 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:07.312029 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/49c5953c-48c3-4323-8eaa-061d234b8f3f-sys\") pod \"perf-node-gather-daemonset-f4x44\" (UID: \"49c5953c-48c3-4323-8eaa-061d234b8f3f\") " pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-f4x44"
Apr 17 14:57:07.312168 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:07.312059 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/49c5953c-48c3-4323-8eaa-061d234b8f3f-lib-modules\") pod \"perf-node-gather-daemonset-f4x44\" (UID: \"49c5953c-48c3-4323-8eaa-061d234b8f3f\") " pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-f4x44"
Apr 17 14:57:07.312168 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:07.312083 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/49c5953c-48c3-4323-8eaa-061d234b8f3f-podres\") pod \"perf-node-gather-daemonset-f4x44\" (UID: \"49c5953c-48c3-4323-8eaa-061d234b8f3f\") " pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-f4x44"
Apr 17 14:57:07.312168 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:07.312111 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/49c5953c-48c3-4323-8eaa-061d234b8f3f-proc\") pod \"perf-node-gather-daemonset-f4x44\" (UID: \"49c5953c-48c3-4323-8eaa-061d234b8f3f\") " pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-f4x44"
Apr 17 14:57:07.312168 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:07.312151 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhftz\" (UniqueName: \"kubernetes.io/projected/49c5953c-48c3-4323-8eaa-061d234b8f3f-kube-api-access-vhftz\") pod \"perf-node-gather-daemonset-f4x44\" (UID: \"49c5953c-48c3-4323-8eaa-061d234b8f3f\") " pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-f4x44"
Apr 17 14:57:07.413106 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:07.413044 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/49c5953c-48c3-4323-8eaa-061d234b8f3f-lib-modules\") pod \"perf-node-gather-daemonset-f4x44\" (UID: \"49c5953c-48c3-4323-8eaa-061d234b8f3f\") " pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-f4x44"
Apr 17 14:57:07.413106 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:07.413081 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/49c5953c-48c3-4323-8eaa-061d234b8f3f-podres\") pod \"perf-node-gather-daemonset-f4x44\" (UID: \"49c5953c-48c3-4323-8eaa-061d234b8f3f\") " pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-f4x44"
Apr 17 14:57:07.413106 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:07.413101 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/49c5953c-48c3-4323-8eaa-061d234b8f3f-proc\") pod \"perf-node-gather-daemonset-f4x44\" (UID: \"49c5953c-48c3-4323-8eaa-061d234b8f3f\") " pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-f4x44"
Apr 17 14:57:07.413270 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:07.413139 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhftz\" (UniqueName: \"kubernetes.io/projected/49c5953c-48c3-4323-8eaa-061d234b8f3f-kube-api-access-vhftz\") pod \"perf-node-gather-daemonset-f4x44\" (UID: \"49c5953c-48c3-4323-8eaa-061d234b8f3f\") " pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-f4x44"
Apr 17 14:57:07.413270 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:07.413174 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/49c5953c-48c3-4323-8eaa-061d234b8f3f-sys\") pod \"perf-node-gather-daemonset-f4x44\" (UID: \"49c5953c-48c3-4323-8eaa-061d234b8f3f\") " pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-f4x44"
Apr 17 14:57:07.413270 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:07.413220 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/49c5953c-48c3-4323-8eaa-061d234b8f3f-podres\") pod \"perf-node-gather-daemonset-f4x44\" (UID: \"49c5953c-48c3-4323-8eaa-061d234b8f3f\") " pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-f4x44"
Apr 17 14:57:07.413270 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:07.413219 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/49c5953c-48c3-4323-8eaa-061d234b8f3f-lib-modules\") pod \"perf-node-gather-daemonset-f4x44\" (UID: \"49c5953c-48c3-4323-8eaa-061d234b8f3f\") " pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-f4x44"
Apr 17 14:57:07.413270 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:07.413249 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/49c5953c-48c3-4323-8eaa-061d234b8f3f-sys\") pod \"perf-node-gather-daemonset-f4x44\" (UID: \"49c5953c-48c3-4323-8eaa-061d234b8f3f\") " pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-f4x44"
Apr 17 14:57:07.413416 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:07.413314 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/49c5953c-48c3-4323-8eaa-061d234b8f3f-proc\") pod \"perf-node-gather-daemonset-f4x44\" (UID: \"49c5953c-48c3-4323-8eaa-061d234b8f3f\") " pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-f4x44"
Apr 17 14:57:07.420813 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:07.420795 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhftz\" (UniqueName: \"kubernetes.io/projected/49c5953c-48c3-4323-8eaa-061d234b8f3f-kube-api-access-vhftz\") pod \"perf-node-gather-daemonset-f4x44\" (UID: \"49c5953c-48c3-4323-8eaa-061d234b8f3f\") " pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-f4x44"
Apr 17 14:57:07.569318 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:07.569285 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-f4x44"
Apr 17 14:57:07.677070 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:07.677009 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wh6bh/perf-node-gather-daemonset-f4x44"]
Apr 17 14:57:07.679789 ip-10-0-140-205 kubenswrapper[2578]: W0417 14:57:07.679760 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod49c5953c_48c3_4323_8eaa_061d234b8f3f.slice/crio-44d0f79423004d00543d6beb36c7ae6a0a0eb22d2e88b798fb7c746fbfe4144c WatchSource:0}: Error finding container 44d0f79423004d00543d6beb36c7ae6a0a0eb22d2e88b798fb7c746fbfe4144c: Status 404 returned error can't find the container with id 44d0f79423004d00543d6beb36c7ae6a0a0eb22d2e88b798fb7c746fbfe4144c
Apr 17 14:57:08.171689 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:08.171655 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-cb9x6_63e2cd45-279f-4653-a7e2-d58678593e5d/serve-healthcheck-canary/0.log"
Apr 17 14:57:08.362847 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:08.362814 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-f4x44" event={"ID":"49c5953c-48c3-4323-8eaa-061d234b8f3f","Type":"ContainerStarted","Data":"525d5e01a4d318ff3116b92846d11a705387a821696eba1acece62ea841210d3"}
Apr 17 14:57:08.362847 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:08.362848 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-f4x44" event={"ID":"49c5953c-48c3-4323-8eaa-061d234b8f3f","Type":"ContainerStarted","Data":"44d0f79423004d00543d6beb36c7ae6a0a0eb22d2e88b798fb7c746fbfe4144c"}
Apr 17 14:57:08.363090 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:08.362922 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-f4x44"
Apr 17 14:57:08.377103 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:08.377035 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-f4x44" podStartSLOduration=1.37702082 podStartE2EDuration="1.37702082s" podCreationTimestamp="2026-04-17 14:57:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 14:57:08.37620332 +0000 UTC m=+3002.274536615" watchObservedRunningTime="2026-04-17 14:57:08.37702082 +0000 UTC m=+3002.275354114"
Apr 17 14:57:08.579651 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:08.579620 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-d88zh_1659be2e-44e8-44dd-aa82-e6c78b31f5e1/kube-rbac-proxy/0.log"
Apr 17 14:57:08.597108 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:08.597082 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-d88zh_1659be2e-44e8-44dd-aa82-e6c78b31f5e1/exporter/0.log"
Apr 17 14:57:08.614713 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:08.614689 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-d88zh_1659be2e-44e8-44dd-aa82-e6c78b31f5e1/extractor/0.log"
Apr 17 14:57:14.158740 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:14.158712 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6gr7b_01f1152c-e1c7-4d93-ad8f-877078fb7271/kube-multus-additional-cni-plugins/0.log"
Apr 17 14:57:14.177766 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:14.177737 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6gr7b_01f1152c-e1c7-4d93-ad8f-877078fb7271/egress-router-binary-copy/0.log"
Apr 17 14:57:14.195230 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:14.195204 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6gr7b_01f1152c-e1c7-4d93-ad8f-877078fb7271/cni-plugins/0.log"
Apr 17 14:57:14.236904 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:14.236879 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6gr7b_01f1152c-e1c7-4d93-ad8f-877078fb7271/bond-cni-plugin/0.log"
Apr 17 14:57:14.260278 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:14.260255 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6gr7b_01f1152c-e1c7-4d93-ad8f-877078fb7271/routeoverride-cni/0.log"
Apr 17 14:57:14.311172 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:14.311125 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6gr7b_01f1152c-e1c7-4d93-ad8f-877078fb7271/whereabouts-cni-bincopy/0.log"
Apr 17 14:57:14.332362 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:14.332331 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6gr7b_01f1152c-e1c7-4d93-ad8f-877078fb7271/whereabouts-cni/0.log"
Apr 17 14:57:14.374689 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:14.374663 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-wh6bh/perf-node-gather-daemonset-f4x44"
Apr 17 14:57:14.691719 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:14.691690 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wfn9n_4d8f653c-fe7e-4b15-8b7c-eaa897b23b78/kube-multus/0.log"
Apr 17 14:57:14.709093 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:14.709061 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7nttt_86e32c2b-962f-40a8-9171-5be908eeee49/network-metrics-daemon/0.log"
Apr 17 14:57:14.730418 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:14.730390 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7nttt_86e32c2b-962f-40a8-9171-5be908eeee49/kube-rbac-proxy/0.log"
Apr 17 14:57:15.510115 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:15.510087 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5h9m4_83bdba87-08d8-4d12-9786-7c1ee404890a/ovn-controller/0.log"
Apr 17 14:57:15.552302 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:15.552278 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5h9m4_83bdba87-08d8-4d12-9786-7c1ee404890a/ovn-acl-logging/0.log"
Apr 17 14:57:15.572903 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:15.572877 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5h9m4_83bdba87-08d8-4d12-9786-7c1ee404890a/kube-rbac-proxy-node/0.log"
Apr 17 14:57:15.593785 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:15.593768 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5h9m4_83bdba87-08d8-4d12-9786-7c1ee404890a/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 14:57:15.609273 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:15.609256 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5h9m4_83bdba87-08d8-4d12-9786-7c1ee404890a/northd/0.log"
Apr 17 14:57:15.627789 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:15.627769 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5h9m4_83bdba87-08d8-4d12-9786-7c1ee404890a/nbdb/0.log"
Apr 17 14:57:15.648787 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:15.648767 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5h9m4_83bdba87-08d8-4d12-9786-7c1ee404890a/sbdb/0.log"
Apr 17 14:57:15.801274 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:15.801204 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5h9m4_83bdba87-08d8-4d12-9786-7c1ee404890a/ovnkube-controller/0.log"
Apr 17 14:57:17.666245 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:17.666217 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-spfv9_4a41c39c-5f5e-4ad7-8c82-dac16f59266d/network-check-target-container/0.log"
Apr 17 14:57:18.569054 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:18.569025 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-c28zr_53b66d5d-5930-40dd-8dcf-45ee6f0c9cf4/iptables-alerter/0.log"
Apr 17 14:57:19.122194 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:19.122159 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-dxxnk_fd130f1a-180c-45df-b438-589d809223cc/tuned/0.log"
Apr 17 14:57:22.535339 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:22.535308 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-vzk4x_d93d961b-181a-4922-b8e6-299ecf0ab597/csi-driver/0.log"
Apr 17 14:57:22.556327 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:22.556302 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-vzk4x_d93d961b-181a-4922-b8e6-299ecf0ab597/csi-node-driver-registrar/0.log"
Apr 17 14:57:22.575082 ip-10-0-140-205 kubenswrapper[2578]: I0417 14:57:22.575025 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-vzk4x_d93d961b-181a-4922-b8e6-299ecf0ab597/csi-liveness-probe/0.log"