Apr 22 18:20:41.826536 ip-10-0-128-248 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 18:20:42.337953 ip-10-0-128-248 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:20:42.337953 ip-10-0-128-248 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 18:20:42.337953 ip-10-0-128-248 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:20:42.337953 ip-10-0-128-248 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 18:20:42.337953 ip-10-0-128-248 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
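The deprecation warnings above all point at the kubelet config file named by --config. As a rough sketch (not taken from this node), the flagged options map onto KubeletConfiguration fields like this; field names follow the kubelet.config.k8s.io/v1beta1 API, and the values shown are purely illustrative:

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved (illustrative reservations)
systemReserved:
  cpu: 500m
  memory: 1Gi
# the --minimum-container-ttl-duration warning says to use eviction
# thresholds instead; evictionHard/evictionSoft are the config-file form
evictionHard:
  memory.available: 100Mi
```

This is a minimal sketch of the migration path the warnings recommend, not the configuration actually in use at /etc/kubernetes/kubelet.conf on this host.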
Apr 22 18:20:42.341733 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.341641 2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 18:20:42.344802 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344788 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:20:42.344802 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344802 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:20:42.344864 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344806 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:20:42.344864 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344810 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:20:42.344864 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344813 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:20:42.344864 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344816 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:20:42.344864 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344818 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:20:42.344864 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344821 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:20:42.344864 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344825 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:20:42.344864 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344827 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:20:42.344864 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344831 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:20:42.344864 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344833 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:20:42.344864 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344836 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:20:42.344864 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344838 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:20:42.344864 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344841 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:20:42.344864 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344844 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:20:42.344864 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344846 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:20:42.344864 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344854 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:20:42.344864 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344857 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:20:42.344864 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344860 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:20:42.344864 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344862 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:20:42.344864 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344867 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:20:42.345359 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344871 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:20:42.345359 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344874 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:20:42.345359 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344877 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:20:42.345359 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344880 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:20:42.345359 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344883 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:20:42.345359 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344885 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:20:42.345359 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344888 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:20:42.345359 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344890 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:20:42.345359 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344893 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:20:42.345359 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344896 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:20:42.345359 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344898 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:20:42.345359 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344901 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:20:42.345359 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344903 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:20:42.345359 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344906 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:20:42.345359 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344909 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:20:42.345359 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344911 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:20:42.345359 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344914 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:20:42.345359 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344916 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:20:42.345359 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344920 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:20:42.345359 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344922 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:20:42.345836 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344925 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:20:42.345836 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344927 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:20:42.345836 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344930 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:20:42.345836 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344932 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:20:42.345836 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344934 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:20:42.345836 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344937 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:20:42.345836 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344940 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:20:42.345836 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344942 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:20:42.345836 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344945 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:20:42.345836 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344947 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:20:42.345836 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344950 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:20:42.345836 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344953 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:20:42.345836 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344955 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:20:42.345836 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344958 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:20:42.345836 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344961 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:20:42.345836 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344963 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:20:42.345836 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344966 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:20:42.345836 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344968 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:20:42.345836 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344970 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:20:42.345836 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344973 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:20:42.346334 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344975 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:20:42.346334 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344978 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:20:42.346334 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344981 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:20:42.346334 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344984 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:20:42.346334 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344987 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:20:42.346334 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344989 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:20:42.346334 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344992 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:20:42.346334 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344994 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:20:42.346334 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344997 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:20:42.346334 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.344999 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:20:42.346334 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345001 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:20:42.346334 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345004 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:20:42.346334 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345006 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:20:42.346334 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345009 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:20:42.346334 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345011 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:20:42.346334 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345014 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:20:42.346334 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345016 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:20:42.346334 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345021 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:20:42.346334 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345025 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:20:42.346334 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345027 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:20:42.346822 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345030 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:20:42.346822 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345033 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:20:42.346822 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345035 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:20:42.346822 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345037 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:20:42.346822 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345431 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:20:42.346822 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345437 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:20:42.346822 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345440 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:20:42.346822 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345443 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:20:42.346822 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345445 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:20:42.346822 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345448 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:20:42.346822 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345451 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:20:42.346822 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345453 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:20:42.346822 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345456 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:20:42.346822 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345459 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:20:42.346822 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345461 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:20:42.346822 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345464 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:20:42.346822 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345467 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:20:42.346822 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345469 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:20:42.346822 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345472 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:20:42.346822 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345474 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:20:42.346822 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345477 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:20:42.347345 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345479 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:20:42.347345 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345482 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:20:42.347345 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345485 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:20:42.347345 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345487 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:20:42.347345 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345489 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:20:42.347345 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345492 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:20:42.347345 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345495 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:20:42.347345 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345498 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:20:42.347345 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345502 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:20:42.347345 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345506 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:20:42.347345 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345509 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:20:42.347345 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345511 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:20:42.347345 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345514 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:20:42.347345 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345517 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:20:42.347345 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345519 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:20:42.347345 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345521 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:20:42.347345 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345525 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:20:42.347345 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345527 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:20:42.347345 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345529 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:20:42.347345 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345532 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:20:42.347856 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345534 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:20:42.347856 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345537 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:20:42.347856 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345539 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:20:42.347856 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345542 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:20:42.347856 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345544 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:20:42.347856 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345546 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:20:42.347856 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345549 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:20:42.347856 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345551 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:20:42.347856 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345554 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:20:42.347856 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345557 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:20:42.347856 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345559 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:20:42.347856 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345561 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:20:42.347856 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345564 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:20:42.347856 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345566 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:20:42.347856 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345568 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:20:42.347856 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345571 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:20:42.347856 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345573 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:20:42.347856 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345576 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:20:42.347856 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345578 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:20:42.347856 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345580 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:20:42.348360 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345583 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:20:42.348360 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345586 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:20:42.348360 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345588 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:20:42.348360 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345591 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:20:42.348360 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345593 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:20:42.348360 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345596 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:20:42.348360 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345598 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:20:42.348360 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345601 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:20:42.348360 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345604 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:20:42.348360 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345607 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:20:42.348360 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345609 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:20:42.348360 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345611 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:20:42.348360 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345614 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:20:42.348360 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345616 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:20:42.348360 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345619 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:20:42.348360 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345621 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:20:42.348360 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345623 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:20:42.348360 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345626 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:20:42.348360 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345629 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:20:42.348360 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345631 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:20:42.348855 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345633 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:20:42.348855 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345639 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:20:42.348855 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345644 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
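The long run of "unrecognized feature gate" warnings above is the kubelet's gate parser rejecting names it does not define; gates the kubelet does know (KMSv1, ServiceAccountTokenNodeBinding) are accepted with deprecation/GA warnings instead. For context, kubelet-level gates are set in the config file roughly like this; a hedged sketch only, with the gate value taken from the KMSv1=true warning in this log and the file path assumed from the --config flag below:

```yaml
# /etc/kubernetes/kubelet.conf (path per the --config flag in this log)
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
featureGates:
  KMSv1: true   # logged at feature_gate.go:349 as deprecated
```

Any gate name listed here that the kubelet binary does not define would produce exactly the feature_gate.go:328 warnings seen above.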
Apr 22 18:20:42.348855 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345647 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:20:42.348855 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345650 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:20:42.348855 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345653 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:20:42.348855 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345655 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:20:42.348855 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345658 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:20:42.348855 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.345661 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:20:42.348855 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345733 2578 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 18:20:42.348855 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345741 2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 18:20:42.348855 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345746 2578 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 18:20:42.348855 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345751 2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 18:20:42.348855 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345756 2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 18:20:42.348855 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345759 2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 18:20:42.348855 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345763 2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 18:20:42.348855 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345768 2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 18:20:42.348855 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345771 2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 18:20:42.348855 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345774 2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 18:20:42.348855 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345778 2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 18:20:42.348855 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345781 2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 18:20:42.349380 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345784 2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 18:20:42.349380 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345787 2578 flags.go:64] FLAG: --cgroup-root=""
Apr 22 18:20:42.349380 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345791 2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 18:20:42.349380 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345794 2578 flags.go:64] FLAG: --client-ca-file=""
Apr 22 18:20:42.349380 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345796 2578 flags.go:64] FLAG: --cloud-config=""
Apr 22 18:20:42.349380 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345799 2578 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 18:20:42.349380 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345802 2578 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 18:20:42.349380 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345806 2578 flags.go:64] FLAG: --cluster-domain=""
Apr 22 18:20:42.349380 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345809 2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 18:20:42.349380 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345812 2578 flags.go:64] FLAG: --config-dir=""
Apr 22 18:20:42.349380 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345815 2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 18:20:42.349380 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345818 2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 18:20:42.349380 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345822 2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 18:20:42.349380 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345826 2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 18:20:42.349380 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345829 2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 18:20:42.349380 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345832 2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 18:20:42.349380 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345835 2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 18:20:42.349380 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345838 2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 18:20:42.349380 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345841 2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 18:20:42.349380 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345844 2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 18:20:42.349380 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345848 2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 18:20:42.349380 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345852 2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 18:20:42.349380 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345855 2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 18:20:42.349380 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345858 2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 18:20:42.349380 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345861 2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 18:20:42.349380 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345865 2578 flags.go:64] FLAG: --enable-server="true"
Apr 22 18:20:42.350017 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345868 2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 18:20:42.350017 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345872 2578 flags.go:64] FLAG: --event-burst="100"
Apr 22 18:20:42.350017 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345875 2578 flags.go:64] FLAG: --event-qps="50"
Apr 22 18:20:42.350017 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345878 2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 18:20:42.350017 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345881 2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 18:20:42.350017 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345884 2578 flags.go:64] FLAG: --eviction-hard=""
Apr 22 18:20:42.350017 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345888 2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 18:20:42.350017 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345891 2578 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 18:20:42.350017 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345894 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 18:20:42.350017 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345896 2578 flags.go:64] FLAG: --eviction-soft=""
Apr 22 18:20:42.350017 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345900 2578 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 18:20:42.350017 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345902 2578 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 18:20:42.350017 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345905 2578 flags.go:64] FLAG:
--experimental-allocatable-ignore-eviction="false" Apr 22 18:20:42.350017 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345908 2578 flags.go:64] FLAG: --experimental-mounter-path="" Apr 22 18:20:42.350017 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345911 2578 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 18:20:42.350017 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345914 2578 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 18:20:42.350017 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345916 2578 flags.go:64] FLAG: --feature-gates="" Apr 22 18:20:42.350017 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345920 2578 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 18:20:42.350017 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345923 2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 18:20:42.350017 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345926 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 18:20:42.350017 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345929 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 18:20:42.350017 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345932 2578 flags.go:64] FLAG: --healthz-port="10248" Apr 22 18:20:42.350017 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345935 2578 flags.go:64] FLAG: --help="false" Apr 22 18:20:42.350017 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345938 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-128-248.ec2.internal" Apr 22 18:20:42.350017 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345941 2578 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 18:20:42.350624 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345944 2578 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 18:20:42.350624 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345947 2578 flags.go:64] FLAG: 
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 18:20:42.350624 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345950 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 18:20:42.350624 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345954 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 18:20:42.350624 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345957 2578 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 18:20:42.350624 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345960 2578 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 18:20:42.350624 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345963 2578 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 18:20:42.350624 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345965 2578 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 18:20:42.350624 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345968 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 18:20:42.350624 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345971 2578 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 18:20:42.350624 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345974 2578 flags.go:64] FLAG: --kube-reserved="" Apr 22 18:20:42.350624 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345977 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 18:20:42.350624 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345980 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 18:20:42.350624 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345984 2578 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 18:20:42.350624 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345987 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 18:20:42.350624 ip-10-0-128-248 
kubenswrapper[2578]: I0422 18:20:42.345990 2578 flags.go:64] FLAG: --lock-file="" Apr 22 18:20:42.350624 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345993 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 18:20:42.350624 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345995 2578 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 18:20:42.350624 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.345998 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 18:20:42.350624 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346003 2578 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 18:20:42.350624 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346006 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 18:20:42.350624 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346009 2578 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 18:20:42.350624 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346012 2578 flags.go:64] FLAG: --logging-format="text" Apr 22 18:20:42.350624 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346015 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 18:20:42.351211 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346018 2578 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 18:20:42.351211 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346021 2578 flags.go:64] FLAG: --manifest-url="" Apr 22 18:20:42.351211 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346024 2578 flags.go:64] FLAG: --manifest-url-header="" Apr 22 18:20:42.351211 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346028 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 18:20:42.351211 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346031 2578 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 18:20:42.351211 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346035 2578 flags.go:64] FLAG: --max-pods="110" Apr 22 
18:20:42.351211 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346038 2578 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 18:20:42.351211 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346041 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 18:20:42.351211 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346044 2578 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 18:20:42.351211 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346047 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 18:20:42.351211 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346050 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 18:20:42.351211 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346053 2578 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 18:20:42.351211 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346058 2578 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 18:20:42.351211 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346066 2578 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 18:20:42.351211 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346069 2578 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 18:20:42.351211 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346071 2578 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 18:20:42.351211 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346074 2578 flags.go:64] FLAG: --pod-cidr="" Apr 22 18:20:42.351211 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346077 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 18:20:42.351211 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346083 2578 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 18:20:42.351211 ip-10-0-128-248 kubenswrapper[2578]: I0422 
18:20:42.346085 2578 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 18:20:42.351211 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346090 2578 flags.go:64] FLAG: --pods-per-core="0" Apr 22 18:20:42.351211 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346093 2578 flags.go:64] FLAG: --port="10250" Apr 22 18:20:42.351211 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346096 2578 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 18:20:42.351211 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346099 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-00b767f340c951769" Apr 22 18:20:42.351904 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346102 2578 flags.go:64] FLAG: --qos-reserved="" Apr 22 18:20:42.351904 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346105 2578 flags.go:64] FLAG: --read-only-port="10255" Apr 22 18:20:42.351904 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346109 2578 flags.go:64] FLAG: --register-node="true" Apr 22 18:20:42.351904 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346111 2578 flags.go:64] FLAG: --register-schedulable="true" Apr 22 18:20:42.351904 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346114 2578 flags.go:64] FLAG: --register-with-taints="" Apr 22 18:20:42.351904 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346118 2578 flags.go:64] FLAG: --registry-burst="10" Apr 22 18:20:42.351904 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346120 2578 flags.go:64] FLAG: --registry-qps="5" Apr 22 18:20:42.351904 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346123 2578 flags.go:64] FLAG: --reserved-cpus="" Apr 22 18:20:42.351904 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346126 2578 flags.go:64] FLAG: --reserved-memory="" Apr 22 18:20:42.351904 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346129 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 18:20:42.351904 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346132 2578 
flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 18:20:42.351904 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346135 2578 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 18:20:42.351904 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346138 2578 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 18:20:42.351904 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346141 2578 flags.go:64] FLAG: --runonce="false" Apr 22 18:20:42.351904 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346144 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 18:20:42.351904 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346147 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 18:20:42.351904 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346150 2578 flags.go:64] FLAG: --seccomp-default="false" Apr 22 18:20:42.351904 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346153 2578 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 18:20:42.351904 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346156 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 18:20:42.351904 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346159 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 18:20:42.351904 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346161 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 18:20:42.351904 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346165 2578 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 18:20:42.351904 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346168 2578 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 18:20:42.351904 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346171 2578 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 18:20:42.351904 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346174 2578 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 
18:20:42.351904 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346177 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 18:20:42.352552 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346180 2578 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 18:20:42.352552 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346183 2578 flags.go:64] FLAG: --system-cgroups="" Apr 22 18:20:42.352552 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346187 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 18:20:42.352552 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346192 2578 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 18:20:42.352552 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346196 2578 flags.go:64] FLAG: --tls-cert-file="" Apr 22 18:20:42.352552 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346198 2578 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 18:20:42.352552 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346202 2578 flags.go:64] FLAG: --tls-min-version="" Apr 22 18:20:42.352552 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346205 2578 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 18:20:42.352552 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346208 2578 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 18:20:42.352552 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346210 2578 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 18:20:42.352552 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346213 2578 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 18:20:42.352552 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346216 2578 flags.go:64] FLAG: --v="2" Apr 22 18:20:42.352552 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346221 2578 flags.go:64] FLAG: --version="false" Apr 22 18:20:42.352552 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346238 2578 flags.go:64] FLAG: --vmodule="" 
Apr 22 18:20:42.352552 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346243 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 18:20:42.352552 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.346246 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 18:20:42.352552 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347144 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:20:42.352552 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347149 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:20:42.352552 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347152 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:20:42.352552 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347155 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:20:42.352552 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347158 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:20:42.352552 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347161 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:20:42.352552 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347164 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:20:42.352552 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347167 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:20:42.353164 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347169 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:20:42.353164 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347172 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:20:42.353164 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347174 2578 feature_gate.go:328] unrecognized feature 
gate: MultiArchInstallAzure Apr 22 18:20:42.353164 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347178 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:20:42.353164 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347180 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:20:42.353164 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347183 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:20:42.353164 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347185 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:20:42.353164 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347188 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:20:42.353164 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347190 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:20:42.353164 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347193 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:20:42.353164 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347196 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:20:42.353164 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347199 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:20:42.353164 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347201 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:20:42.353164 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347205 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 18:20:42.353164 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347209 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:20:42.353164 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347211 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:20:42.353164 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347214 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:20:42.353164 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347217 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:20:42.353164 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347219 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:20:42.353164 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347221 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:20:42.353688 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347237 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:20:42.353688 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347240 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:20:42.353688 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347242 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:20:42.353688 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347245 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:20:42.353688 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347247 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:20:42.353688 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347250 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:20:42.353688 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347252 
2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:20:42.353688 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347254 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:20:42.353688 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347257 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:20:42.353688 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347260 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:20:42.353688 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347263 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:20:42.353688 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347265 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:20:42.353688 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347268 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:20:42.353688 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347270 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:20:42.353688 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347273 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:20:42.353688 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347276 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:20:42.353688 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347279 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:20:42.353688 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347282 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:20:42.353688 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347284 2578 feature_gate.go:328] unrecognized feature gate: 
SigstoreImageVerification Apr 22 18:20:42.353688 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347287 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:20:42.354189 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347290 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:20:42.354189 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347293 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:20:42.354189 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347295 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:20:42.354189 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347298 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:20:42.354189 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347300 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:20:42.354189 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347303 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:20:42.354189 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347306 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:20:42.354189 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347308 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:20:42.354189 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347311 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:20:42.354189 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347313 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:20:42.354189 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347315 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:20:42.354189 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347318 2578 
feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:20:42.354189 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347321 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:20:42.354189 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347323 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:20:42.354189 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347326 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:20:42.354189 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347328 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:20:42.354189 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347331 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:20:42.354189 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347333 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:20:42.354189 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347337 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 18:20:42.354189 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347340 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:20:42.354692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347343 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:20:42.354692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347346 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:20:42.354692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347349 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:20:42.354692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347352 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:20:42.354692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347355 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:20:42.354692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347358 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:20:42.354692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347361 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:20:42.354692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347364 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:20:42.354692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347367 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:20:42.354692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347370 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:20:42.354692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347373 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:20:42.354692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347375 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:20:42.354692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347378 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:20:42.354692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347381 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:20:42.354692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347383 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:20:42.354692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347386 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:20:42.354692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347388 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:20:42.354692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.347391 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:20:42.355136 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.348030 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:20:42.355136 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.355051 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 18:20:42.355136 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.355068 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 18:20:42.355236 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.355137 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:20:42.355236 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.355143 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:20:42.355236 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.355147 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:20:42.355236 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.355150 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:20:42.355377 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.355279 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:20:42.355377 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.355287 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:20:42.355377 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.355291 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:20:42.355377 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.355294 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:20:42.355377 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.355298 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:20:42.355377 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.355301 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:20:42.355377 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.355303 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:20:42.355377 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.355307 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:20:42.355377 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.355312 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:20:42.355377 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.355317 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:20:42.355377 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.355321 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:20:42.355377 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.355326 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:20:42.355377 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.355330 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:20:42.355377 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.355342 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:20:42.355377 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.355346 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:20:42.356611 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356575 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:20:42.356611 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356613 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:20:42.356706 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356619 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:20:42.356706 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356626 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:20:42.356706 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356637 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:20:42.356706 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356641 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:20:42.356706 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356647 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:20:42.356706 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356650 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:20:42.356706 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356654 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:20:42.356706 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356656 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:20:42.356706 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356659 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:20:42.356706 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356666 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:20:42.356706 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356668 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:20:42.356706 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356671 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:20:42.356706 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356674 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:20:42.356706 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356677 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:20:42.356706 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356695 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:20:42.356706 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356698 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:20:42.357095 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356798 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:20:42.357095 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356801 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:20:42.357095 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356804 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:20:42.357095 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356807 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:20:42.357095 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356811 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:20:42.357095 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356813 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:20:42.357095 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356816 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:20:42.357095 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356819 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:20:42.357095 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356822 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:20:42.357095 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356824 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:20:42.357095 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356827 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:20:42.357095 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356829 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:20:42.357095 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356832 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:20:42.357095 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356835 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:20:42.357095 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356837 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:20:42.357095 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356840 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:20:42.357095 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356842 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:20:42.357095 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356845 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:20:42.357095 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356848 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:20:42.357095 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356850 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:20:42.357692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356853 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:20:42.357692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356855 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:20:42.357692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356858 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:20:42.357692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356860 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:20:42.357692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356862 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:20:42.357692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356865 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:20:42.357692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356869 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:20:42.357692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356873 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:20:42.357692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356876 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:20:42.357692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356879 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:20:42.357692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356882 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:20:42.357692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356884 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:20:42.357692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356887 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:20:42.357692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356891 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:20:42.357692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356894 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:20:42.357692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356897 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:20:42.357692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356899 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:20:42.357692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356902 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:20:42.357692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356904 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:20:42.357692 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356907 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:20:42.358174 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356910 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:20:42.358174 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356913 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:20:42.358174 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356915 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:20:42.358174 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356917 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:20:42.358174 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356922 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:20:42.358174 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356926 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:20:42.358174 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356929 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:20:42.358174 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356932 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:20:42.358174 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.356934 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:20:42.358174 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.356940 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:20:42.358174 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357044 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:20:42.358174 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357049 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:20:42.358174 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357052 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:20:42.358174 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357054 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:20:42.358174 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357057 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:20:42.358570 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357059 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:20:42.358570 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357062 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:20:42.358570 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357064 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:20:42.358570 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357067 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:20:42.358570 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357069 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:20:42.358570 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357072 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:20:42.358570 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357074 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:20:42.358570 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357077 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:20:42.358570 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357079 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:20:42.358570 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357082 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:20:42.358570 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357084 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:20:42.358570 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357087 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:20:42.358570 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357090 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:20:42.358570 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357093 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:20:42.358570 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357098 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:20:42.358570 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357101 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:20:42.358570 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357103 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:20:42.358570 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357106 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:20:42.358570 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357108 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:20:42.358570 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357111 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:20:42.359058 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357113 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:20:42.359058 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357116 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:20:42.359058 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357118 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:20:42.359058 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357120 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:20:42.359058 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357123 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:20:42.359058 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357125 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:20:42.359058 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357127 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:20:42.359058 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357130 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:20:42.359058 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357133 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:20:42.359058 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357135 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:20:42.359058 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357138 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:20:42.359058 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357140 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:20:42.359058 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357143 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:20:42.359058 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357145 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:20:42.359058 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357147 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:20:42.359058 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357150 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:20:42.359058 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357152 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:20:42.359058 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357156 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:20:42.359058 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357159 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:20:42.359534 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357162 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:20:42.359534 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357165 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:20:42.359534 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357168 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:20:42.359534 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357170 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:20:42.359534 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357173 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:20:42.359534 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357176 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:20:42.359534 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357179 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:20:42.359534 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357181 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:20:42.359534 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357183 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:20:42.359534 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357186 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:20:42.359534 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357188 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:20:42.359534 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357191 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:20:42.359534 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357193 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:20:42.359534 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357195 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:20:42.359534 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357198 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:20:42.359534 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357200 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:20:42.359534 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357203 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:20:42.359534 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357206 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:20:42.359534 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357209 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:20:42.359534 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357211 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:20:42.360016 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357213 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:20:42.360016 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357216 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:20:42.360016 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357218 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:20:42.360016 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357220 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:20:42.360016 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357239 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:20:42.360016 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357243 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:20:42.360016 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357246 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:20:42.360016 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357249 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:20:42.360016 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357251 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:20:42.360016 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357254 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:20:42.360016 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357256 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:20:42.360016 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357258 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:20:42.360016 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357261 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:20:42.360016 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357264 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:20:42.360016 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357266 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:20:42.360016 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357269 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:20:42.360016 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357272 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:20:42.360016 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357275 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:20:42.360016 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357277 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:20:42.360016 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357280 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:20:42.360511 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357282 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:20:42.360511 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:42.357285 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:20:42.360511 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.357290 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:20:42.360511 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.358132 2578 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 18:20:42.360511 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.360184 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 18:20:42.361326 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.361261 2578 server.go:1019] "Starting client certificate rotation"
Apr 22 18:20:42.361437 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.361366 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 18:20:42.362485 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.362470 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 18:20:42.391690 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.391665 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 18:20:42.396830 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.396811 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 18:20:42.420118 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.420092 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 18:20:42.420251 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.420157 2578 log.go:25] "Validated CRI v1 runtime API"
Apr 22 18:20:42.426749 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.426732 2578 log.go:25] "Validated CRI v1 image API"
Apr 22 18:20:42.428107 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.428087 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 18:20:42.430763 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.430744 2578 fs.go:135] Filesystem UUIDs: map[4bf0d16f-0744-4441-bab7-d709d51b2f7e:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 9b1c93ce-fa43-495d-bac2-199a436cc686:/dev/nvme0n1p4]
Apr 22 18:20:42.430816 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.430763 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 18:20:42.437307 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.437171 2578 manager.go:217] Machine: {Timestamp:2026-04-22 18:20:42.435758054 +0000 UTC m=+0.470632155 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3102479 MemoryCapacity:32812163072 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2bc671b9d1233565e3bf241223dbfb SystemUUID:ec2bc671-b9d1-2335-65e3-bf241223dbfb BootID:4373926a-e6e9-4e62-9854-10e8ae256da9 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406081536 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:66:aa:9a:e8:af Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:66:aa:9a:e8:af Speed:0 Mtu:9001} {Name:ovs-system MacAddress:6e:ca:0c:bd:ab:02 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812163072 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 18:20:42.437307 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.437295 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 18:20:42.437447 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.437379 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 18:20:42.439662 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.439632 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 18:20:42.439805 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.439666 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-248.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 18:20:42.439851 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.439815 2578 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 18:20:42.439851 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.439824 2578 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 18:20:42.439851 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.439837 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:20:42.440640 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.440629 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:20:42.441587 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.441577 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:20:42.441855 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.441846 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 18:20:42.443253 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.443220 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9cdwg" Apr 22 18:20:42.444715 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.444704 2578 kubelet.go:491] "Attempting to sync node with API server" Apr 22 18:20:42.444749 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.444719 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 18:20:42.444749 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.444732 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 18:20:42.444749 
ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.444741 2578 kubelet.go:397] "Adding apiserver pod source" Apr 22 18:20:42.444749 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.444750 2578 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 18:20:42.446053 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.446039 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:20:42.446090 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.446066 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:20:42.449638 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.449625 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 18:20:42.451242 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.451005 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9cdwg" Apr 22 18:20:42.451453 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.451440 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 18:20:42.453126 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.453108 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 18:20:42.453170 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.453140 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 18:20:42.453170 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.453152 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 18:20:42.453170 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.453163 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 18:20:42.453267 
ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.453175 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 18:20:42.453267 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.453189 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 18:20:42.453267 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.453201 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 18:20:42.453267 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.453213 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 18:20:42.453267 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.453248 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 18:20:42.453267 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.453261 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 18:20:42.453421 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.453291 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 18:20:42.453421 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.453310 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 18:20:42.454117 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.454107 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 18:20:42.454155 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.454119 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 18:20:42.457490 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.457472 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:20:42.457573 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.457551 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:20:42.458093 
ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.458081 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 18:20:42.458144 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.458117 2578 server.go:1295] "Started kubelet" Apr 22 18:20:42.458201 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.458180 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 18:20:42.458321 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.458286 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 18:20:42.458372 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.458337 2578 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 18:20:42.459134 ip-10-0-128-248 systemd[1]: Started Kubernetes Kubelet. Apr 22 18:20:42.459471 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.459454 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 18:20:42.460723 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.460706 2578 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-128-248.ec2.internal" not found Apr 22 18:20:42.461765 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.461751 2578 server.go:317] "Adding debug handlers to kubelet server" Apr 22 18:20:42.466038 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.466016 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 18:20:42.466588 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.466567 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 18:20:42.467382 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.467363 2578 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 18:20:42.467382 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.467386 2578 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 18:20:42.467519 
ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.467396 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 18:20:42.467519 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.467409 2578 factory.go:55] Registering systemd factory Apr 22 18:20:42.467519 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.467419 2578 factory.go:223] Registration of the systemd container factory successfully Apr 22 18:20:42.467519 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.467361 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 18:20:42.468579 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:42.468547 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-248.ec2.internal\" not found" Apr 22 18:20:42.468713 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.468694 2578 factory.go:153] Registering CRI-O factory Apr 22 18:20:42.468765 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.468719 2578 factory.go:223] Registration of the crio container factory successfully Apr 22 18:20:42.468765 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.468743 2578 factory.go:103] Registering Raw factory Apr 22 18:20:42.468765 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.468758 2578 manager.go:1196] Started watching for new ooms in manager Apr 22 18:20:42.468961 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.468941 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:20:42.469131 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.468559 2578 reconstruct.go:97] "Volume reconstruction finished" Apr 22 18:20:42.469168 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.469137 2578 reconciler.go:26] "Reconciler: start 
to sync state" Apr 22 18:20:42.469409 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.469396 2578 manager.go:319] Starting recovery of all containers Apr 22 18:20:42.474382 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:42.474336 2578 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 18:20:42.474382 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:42.474361 2578 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-128-248.ec2.internal\" not found" node="ip-10-0-128-248.ec2.internal" Apr 22 18:20:42.479061 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.478923 2578 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-128-248.ec2.internal" not found Apr 22 18:20:42.481750 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.481733 2578 manager.go:324] Recovery completed Apr 22 18:20:42.482985 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:42.482965 2578 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 22 18:20:42.486017 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.486006 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:20:42.487914 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.487900 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-248.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:20:42.488002 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.487930 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-248.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:20:42.488002 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.487942 2578 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-248.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:20:42.488486 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.488471 2578 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 18:20:42.488549 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.488487 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 18:20:42.488549 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.488504 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:20:42.491445 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.491433 2578 policy_none.go:49] "None policy: Start" Apr 22 18:20:42.491488 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.491449 2578 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 18:20:42.491488 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.491458 2578 state_mem.go:35] "Initializing new in-memory state store" Apr 22 18:20:42.536408 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.536383 2578 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-128-248.ec2.internal" not found Apr 22 18:20:42.545602 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.537792 2578 manager.go:341] "Starting Device Plugin manager" Apr 22 18:20:42.545602 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:42.537820 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 18:20:42.545602 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.537830 2578 server.go:85] "Starting device plugin registration server" Apr 22 18:20:42.545602 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.538041 2578 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 18:20:42.545602 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.538051 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 
monitorPeriod="10s" Apr 22 18:20:42.545602 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.538131 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 18:20:42.545602 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.538196 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 18:20:42.545602 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.538207 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 18:20:42.545602 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:42.538747 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 18:20:42.545602 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:42.538776 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-248.ec2.internal\" not found" Apr 22 18:20:42.631003 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.630908 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 18:20:42.632292 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.632273 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 18:20:42.632383 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.632302 2578 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 18:20:42.632383 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.632328 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 22 18:20:42.632383 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.632337 2578 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 18:20:42.632383 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:42.632380 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 18:20:42.635982 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.635961 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:20:42.638550 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.638538 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:20:42.639330 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.639316 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-248.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:20:42.639405 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.639350 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-248.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:20:42.639405 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.639366 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-248.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:20:42.639405 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.639402 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-248.ec2.internal" Apr 22 18:20:42.648333 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.648318 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-248.ec2.internal" Apr 22 18:20:42.648410 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:42.648341 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-248.ec2.internal\": node \"ip-10-0-128-248.ec2.internal\" not found" Apr 22 
18:20:42.733410 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.733373 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-128-248.ec2.internal"] Apr 22 18:20:42.736029 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.736004 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal" Apr 22 18:20:42.736029 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.736019 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-248.ec2.internal" Apr 22 18:20:42.759926 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.759906 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal" Apr 22 18:20:42.764531 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.764517 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-248.ec2.internal" Apr 22 18:20:42.770761 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.770743 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/933ffdfca6e87b798592801ce6979396-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal\" (UID: \"933ffdfca6e87b798592801ce6979396\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal" Apr 22 18:20:42.770823 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.770770 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/933ffdfca6e87b798592801ce6979396-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal\" (UID: \"933ffdfca6e87b798592801ce6979396\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal" Apr 22 18:20:42.770823 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.770812 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d579a59ca2c3bf4b3f744c41961ff1e1-config\") pod \"kube-apiserver-proxy-ip-10-0-128-248.ec2.internal\" (UID: \"d579a59ca2c3bf4b3f744c41961ff1e1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-248.ec2.internal" Apr 22 18:20:42.775689 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.775674 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:20:42.775769 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.775673 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:20:42.871892 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.871860 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/933ffdfca6e87b798592801ce6979396-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal\" (UID: \"933ffdfca6e87b798592801ce6979396\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal" Apr 22 18:20:42.871892 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.871889 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/933ffdfca6e87b798592801ce6979396-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal\" (UID: \"933ffdfca6e87b798592801ce6979396\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal" Apr 22 18:20:42.872047 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.871906 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d579a59ca2c3bf4b3f744c41961ff1e1-config\") pod \"kube-apiserver-proxy-ip-10-0-128-248.ec2.internal\" (UID: \"d579a59ca2c3bf4b3f744c41961ff1e1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-248.ec2.internal" Apr 22 18:20:42.872047 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.871958 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d579a59ca2c3bf4b3f744c41961ff1e1-config\") pod \"kube-apiserver-proxy-ip-10-0-128-248.ec2.internal\" (UID: \"d579a59ca2c3bf4b3f744c41961ff1e1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-248.ec2.internal" Apr 22 18:20:42.872047 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.871982 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/933ffdfca6e87b798592801ce6979396-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal\" (UID: \"933ffdfca6e87b798592801ce6979396\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal" Apr 22 18:20:42.872047 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:42.871982 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/933ffdfca6e87b798592801ce6979396-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal\" (UID: \"933ffdfca6e87b798592801ce6979396\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal" Apr 22 18:20:43.077403 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.077329 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal" Apr 22 18:20:43.078707 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.078688 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-248.ec2.internal" Apr 22 18:20:43.362476 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.362391 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 18:20:43.362968 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.362550 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 18:20:43.362968 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.362557 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 18:20:43.362968 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.362606 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 18:20:43.445748 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.445706 2578 apiserver.go:52] "Watching apiserver" Apr 22 18:20:43.452933 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.452890 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 18:15:42 +0000 
UTC" deadline="2027-09-30 01:36:23.27896093 +0000 UTC"
Apr 22 18:20:43.452933 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.452928 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12607h15m39.826036508s"
Apr 22 18:20:43.453914 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.453896 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 18:20:43.454263 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.454221 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-dthlh","openshift-network-diagnostics/network-check-target-jftnj","openshift-ovn-kubernetes/ovnkube-node-fnkv4","kube-system/kube-apiserver-proxy-ip-10-0-128-248.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg5qk","openshift-cluster-node-tuning-operator/tuned-9995t","openshift-dns/node-resolver-q979r","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal","openshift-multus/multus-additional-cni-plugins-fr8p8","openshift-multus/network-metrics-daemon-zx8mc","openshift-network-operator/iptables-alerter-v7r7x","kube-system/konnectivity-agent-jgjjg","openshift-image-registry/node-ca-ll75d"]
Apr 22 18:20:43.457152 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.457132 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dthlh"
Apr 22 18:20:43.457278 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.457210 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jftnj"
Apr 22 18:20:43.457343 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:43.457321 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jftnj" podUID="5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81"
Apr 22 18:20:43.458540 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.458518 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.459665 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.459648 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg5qk"
Apr 22 18:20:43.460027 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.460006 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 18:20:43.460103 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.460040 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 18:20:43.460103 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.460064 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 18:20:43.460103 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.460064 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 18:20:43.460279 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.460108 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-297ng\""
Apr 22 18:20:43.460892 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.460873 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9995t"
Apr 22 18:20:43.461012 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.460985 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 18:20:43.461385 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.461361 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 18:20:43.461495 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.461410 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 18:20:43.461495 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.461427 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 18:20:43.461495 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.461479 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 18:20:43.462097 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.462064 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-q979r"
Apr 22 18:20:43.463707 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.463409 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fr8p8"
Apr 22 18:20:43.463707 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.463472 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-h9dx4\""
Apr 22 18:20:43.463707 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.463511 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 18:20:43.463707 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.463573 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 18:20:43.463707 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.463600 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-qk7fh\""
Apr 22 18:20:43.463707 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.463412 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 18:20:43.463707 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.463637 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:20:43.463707 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.463514 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-wktdg\""
Apr 22 18:20:43.463979 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.463516 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 18:20:43.463979 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.463726 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 18:20:43.464456 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.464440 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 18:20:43.464456 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.464448 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 18:20:43.464581 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.464448 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-cd65k\""
Apr 22 18:20:43.464779 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.464766 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zx8mc"
Apr 22 18:20:43.464855 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:43.464838 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zx8mc" podUID="7d04757d-f880-4389-bdfa-265c70b5d789"
Apr 22 18:20:43.466021 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.466006 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-v7r7x"
Apr 22 18:20:43.466064 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.466027 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 18:20:43.466100 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.466084 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 18:20:43.466348 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.466290 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 18:20:43.466440 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.466382 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-48pbs\""
Apr 22 18:20:43.467447 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.467432 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-jgjjg"
Apr 22 18:20:43.468073 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.468058 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 18:20:43.468347 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.468327 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:20:43.468347 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.468346 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-vmtz5\""
Apr 22 18:20:43.468464 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.468385 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 18:20:43.468712 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.468695 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-ll75d"
Apr 22 18:20:43.469814 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.469788 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 18:20:43.469914 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.469882 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-d6wz6\""
Apr 22 18:20:43.469914 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.469891 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 18:20:43.470972 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.470957 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 18:20:43.472125 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.471631 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 18:20:43.472125 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.471664 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 18:20:43.472316 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.472299 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-r6fxt\""
Apr 22 18:20:43.473689 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.473667 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-host-var-lib-kubelet\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh"
Apr 22 18:20:43.473777 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.473713 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-host-run-ovn-kubernetes\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.473777 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.473748 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-sys\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t"
Apr 22 18:20:43.473877 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.473782 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-var-lib-kubelet\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t"
Apr 22 18:20:43.473877 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.473819 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/94e54bd0-5757-4951-b6b5-0ae58070a297-cnibin\") pod \"multus-additional-cni-plugins-fr8p8\" (UID: \"94e54bd0-5757-4951-b6b5-0ae58070a297\") " pod="openshift-multus/multus-additional-cni-plugins-fr8p8"
Apr 22 18:20:43.473877 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.473848 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-os-release\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh"
Apr 22 18:20:43.474075 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.473873 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsv49\" (UniqueName: \"kubernetes.io/projected/65f22fed-3bb7-44e7-912c-3b5733a79f43-kube-api-access-gsv49\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.474075 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.473902 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-host\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t"
Apr 22 18:20:43.474075 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.473925 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d04757d-f880-4389-bdfa-265c70b5d789-metrics-certs\") pod \"network-metrics-daemon-zx8mc\" (UID: \"7d04757d-f880-4389-bdfa-265c70b5d789\") " pod="openshift-multus/network-metrics-daemon-zx8mc"
Apr 22 18:20:43.474075 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.473955 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lkn7\" (UniqueName: \"kubernetes.io/projected/5fecef3b-c5b2-49d1-8d71-e96e5433fb1a-kube-api-access-4lkn7\") pod \"aws-ebs-csi-driver-node-pg5qk\" (UID: \"5fecef3b-c5b2-49d1-8d71-e96e5433fb1a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg5qk"
Apr 22 18:20:43.474075 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.473983 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/94e54bd0-5757-4951-b6b5-0ae58070a297-system-cni-dir\") pod \"multus-additional-cni-plugins-fr8p8\" (UID: \"94e54bd0-5757-4951-b6b5-0ae58070a297\") " pod="openshift-multus/multus-additional-cni-plugins-fr8p8"
Apr 22 18:20:43.474075 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.474014 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9frj\" (UniqueName: \"kubernetes.io/projected/94e54bd0-5757-4951-b6b5-0ae58070a297-kube-api-access-m9frj\") pod \"multus-additional-cni-plugins-fr8p8\" (UID: \"94e54bd0-5757-4951-b6b5-0ae58070a297\") " pod="openshift-multus/multus-additional-cni-plugins-fr8p8"
Apr 22 18:20:43.474371 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.474072 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/4acd2902-b094-4f0a-a902-526fdb5ba7de-agent-certs\") pod \"konnectivity-agent-jgjjg\" (UID: \"4acd2902-b094-4f0a-a902-526fdb5ba7de\") " pod="kube-system/konnectivity-agent-jgjjg"
Apr 22 18:20:43.474371 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.474148 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-host-cni-bin\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.474371 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.474175 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/65f22fed-3bb7-44e7-912c-3b5733a79f43-ovnkube-config\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.474371 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.474208 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/4acd2902-b094-4f0a-a902-526fdb5ba7de-konnectivity-ca\") pod \"konnectivity-agent-jgjjg\" (UID: \"4acd2902-b094-4f0a-a902-526fdb5ba7de\") " pod="kube-system/konnectivity-agent-jgjjg"
Apr 22 18:20:43.474371 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.474279 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-multus-conf-dir\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh"
Apr 22 18:20:43.474371 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.474326 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bc302022-8ada-4b98-9ede-97c644115fcc-multus-daemon-config\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh"
Apr 22 18:20:43.474371 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.474356 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/65f22fed-3bb7-44e7-912c-3b5733a79f43-env-overrides\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.474651 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.474406 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/65f22fed-3bb7-44e7-912c-3b5733a79f43-ovn-node-metrics-cert\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.474651 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.474479 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-etc-sysctl-d\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t"
Apr 22 18:20:43.474651 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.474551 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcb4v\" (UniqueName: \"kubernetes.io/projected/9815cd85-4c2f-43b2-97a8-f65c3f26db10-kube-api-access-fcb4v\") pod \"iptables-alerter-v7r7x\" (UID: \"9815cd85-4c2f-43b2-97a8-f65c3f26db10\") " pod="openshift-network-operator/iptables-alerter-v7r7x"
Apr 22 18:20:43.474651 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.474577 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/94e54bd0-5757-4951-b6b5-0ae58070a297-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fr8p8\" (UID: \"94e54bd0-5757-4951-b6b5-0ae58070a297\") " pod="openshift-multus/multus-additional-cni-plugins-fr8p8"
Apr 22 18:20:43.474651 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.474597 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-system-cni-dir\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh"
Apr 22 18:20:43.474848 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.474667 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-host-run-k8s-cni-cncf-io\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh"
Apr 22 18:20:43.475012 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.474711 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-host-run-netns\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh"
Apr 22 18:20:43.475073 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.475032 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-node-log\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.475073 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.475062 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/94e54bd0-5757-4951-b6b5-0ae58070a297-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fr8p8\" (UID: \"94e54bd0-5757-4951-b6b5-0ae58070a297\") " pod="openshift-multus/multus-additional-cni-plugins-fr8p8"
Apr 22 18:20:43.475200 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.475089 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-host-run-multus-certs\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh"
Apr 22 18:20:43.475200 ip-10-0-128-248 kubenswrapper[2578]: I0422 
18:20:43.475113 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-run\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t"
Apr 22 18:20:43.475200 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.475140 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-run-systemd\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.475200 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.475164 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-etc-modprobe-d\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t"
Apr 22 18:20:43.475464 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.475211 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5fecef3b-c5b2-49d1-8d71-e96e5433fb1a-device-dir\") pod \"aws-ebs-csi-driver-node-pg5qk\" (UID: \"5fecef3b-c5b2-49d1-8d71-e96e5433fb1a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg5qk"
Apr 22 18:20:43.475464 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.475294 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-multus-cni-dir\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh"
Apr 22 18:20:43.475464 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.475319 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-host-slash\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.475464 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.475343 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/65f22fed-3bb7-44e7-912c-3b5733a79f43-ovnkube-script-lib\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.475464 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.475387 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5fecef3b-c5b2-49d1-8d71-e96e5433fb1a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pg5qk\" (UID: \"5fecef3b-c5b2-49d1-8d71-e96e5433fb1a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg5qk"
Apr 22 18:20:43.475464 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.475429 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bc302022-8ada-4b98-9ede-97c644115fcc-cni-binary-copy\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh"
Apr 22 18:20:43.475464 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.475455 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5fecef3b-c5b2-49d1-8d71-e96e5433fb1a-registration-dir\") pod \"aws-ebs-csi-driver-node-pg5qk\" (UID: \"5fecef3b-c5b2-49d1-8d71-e96e5433fb1a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg5qk"
Apr 22 18:20:43.475776 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.475480 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5fecef3b-c5b2-49d1-8d71-e96e5433fb1a-etc-selinux\") pod \"aws-ebs-csi-driver-node-pg5qk\" (UID: \"5fecef3b-c5b2-49d1-8d71-e96e5433fb1a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg5qk"
Apr 22 18:20:43.475776 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.475504 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8468d537-923d-4e8a-a4bc-66ddc1d8ca1d-tmp-dir\") pod \"node-resolver-q979r\" (UID: \"8468d537-923d-4e8a-a4bc-66ddc1d8ca1d\") " pod="openshift-dns/node-resolver-q979r"
Apr 22 18:20:43.475776 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.475528 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/94e54bd0-5757-4951-b6b5-0ae58070a297-cni-binary-copy\") pod \"multus-additional-cni-plugins-fr8p8\" (UID: \"94e54bd0-5757-4951-b6b5-0ae58070a297\") " pod="openshift-multus/multus-additional-cni-plugins-fr8p8"
Apr 22 18:20:43.475776 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.475565 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/94e54bd0-5757-4951-b6b5-0ae58070a297-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fr8p8\" (UID: \"94e54bd0-5757-4951-b6b5-0ae58070a297\") " pod="openshift-multus/multus-additional-cni-plugins-fr8p8"
Apr 22 18:20:43.475776 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.475594 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-var-lib-openvswitch\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.475776 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.475620 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-etc-sysctl-conf\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t"
Apr 22 18:20:43.475776 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.475637 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-etc-systemd\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t"
Apr 22 18:20:43.475776 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.475652 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-host-var-lib-cni-bin\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh"
Apr 22 18:20:43.475776 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.475670 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc8p7\" (UniqueName: 
\"kubernetes.io/projected/5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81-kube-api-access-lc8p7\") pod \"network-check-target-jftnj\" (UID: \"5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81\") " pod="openshift-network-diagnostics/network-check-target-jftnj"
Apr 22 18:20:43.475776 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.475692 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-systemd-units\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.475776 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.475711 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-host-run-netns\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.475776 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.475754 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-run-openvswitch\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.476808 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.475789 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9815cd85-4c2f-43b2-97a8-f65c3f26db10-host-slash\") pod \"iptables-alerter-v7r7x\" (UID: \"9815cd85-4c2f-43b2-97a8-f65c3f26db10\") " pod="openshift-network-operator/iptables-alerter-v7r7x"
Apr 22 18:20:43.476808 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.475824 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm2zl\" (UniqueName: \"kubernetes.io/projected/8468d537-923d-4e8a-a4bc-66ddc1d8ca1d-kube-api-access-wm2zl\") pod \"node-resolver-q979r\" (UID: \"8468d537-923d-4e8a-a4bc-66ddc1d8ca1d\") " pod="openshift-dns/node-resolver-q979r"
Apr 22 18:20:43.476808 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.475857 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf66aaa8-f33b-4fa8-8d81-ede023dd01d9-host\") pod \"node-ca-ll75d\" (UID: \"bf66aaa8-f33b-4fa8-8d81-ede023dd01d9\") " pod="openshift-image-registry/node-ca-ll75d"
Apr 22 18:20:43.476808 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.475880 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-run-ovn\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.476808 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.475903 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-host-cni-netd\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.476808 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.475927 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9815cd85-4c2f-43b2-97a8-f65c3f26db10-iptables-alerter-script\") pod \"iptables-alerter-v7r7x\" (UID: \"9815cd85-4c2f-43b2-97a8-f65c3f26db10\") " pod="openshift-network-operator/iptables-alerter-v7r7x"
Apr 22 18:20:43.476808 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.475951 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bf66aaa8-f33b-4fa8-8d81-ede023dd01d9-serviceca\") pod \"node-ca-ll75d\" (UID: \"bf66aaa8-f33b-4fa8-8d81-ede023dd01d9\") " pod="openshift-image-registry/node-ca-ll75d"
Apr 22 18:20:43.476808 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.475973 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-multus-socket-dir-parent\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh"
Apr 22 18:20:43.476808 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.475994 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-host-var-lib-cni-multus\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh"
Apr 22 18:20:43.476808 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.476017 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-etc-kubernetes\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh"
Apr 22 18:20:43.476808 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.476039 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k86db\" (UniqueName: \"kubernetes.io/projected/bc302022-8ada-4b98-9ede-97c644115fcc-kube-api-access-k86db\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh"
Apr 22 18:20:43.476808 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.476063 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-lib-modules\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t"
Apr 22 18:20:43.476808 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.476088 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5fecef3b-c5b2-49d1-8d71-e96e5433fb1a-socket-dir\") pod \"aws-ebs-csi-driver-node-pg5qk\" (UID: \"5fecef3b-c5b2-49d1-8d71-e96e5433fb1a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg5qk"
Apr 22 18:20:43.476808 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.476112 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8468d537-923d-4e8a-a4bc-66ddc1d8ca1d-hosts-file\") pod \"node-resolver-q979r\" (UID: \"8468d537-923d-4e8a-a4bc-66ddc1d8ca1d\") " pod="openshift-dns/node-resolver-q979r"
Apr 22 18:20:43.476808 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.476137 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-cnibin\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh"
Apr 22 18:20:43.476808 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.476162 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-hostroot\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh" Apr 22 18:20:43.476808 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.476188 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-log-socket\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4" Apr 22 18:20:43.477295 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.476243 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4" Apr 22 18:20:43.477295 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.476275 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-tmp\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t" Apr 22 18:20:43.477295 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.476294 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-etc-openvswitch\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4" Apr 22 
18:20:43.477295 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.476314 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-etc-sysconfig\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t" Apr 22 18:20:43.477295 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.476355 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-etc-kubernetes\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t" Apr 22 18:20:43.477295 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.476382 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtrtg\" (UniqueName: \"kubernetes.io/projected/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-kube-api-access-wtrtg\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t" Apr 22 18:20:43.477295 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.476415 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n564\" (UniqueName: \"kubernetes.io/projected/7d04757d-f880-4389-bdfa-265c70b5d789-kube-api-access-2n564\") pod \"network-metrics-daemon-zx8mc\" (UID: \"7d04757d-f880-4389-bdfa-265c70b5d789\") " pod="openshift-multus/network-metrics-daemon-zx8mc" Apr 22 18:20:43.477295 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.476451 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-host-kubelet\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4" Apr 22 18:20:43.477295 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.476473 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-etc-tuned\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t" Apr 22 18:20:43.477295 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.476490 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5fecef3b-c5b2-49d1-8d71-e96e5433fb1a-sys-fs\") pod \"aws-ebs-csi-driver-node-pg5qk\" (UID: \"5fecef3b-c5b2-49d1-8d71-e96e5433fb1a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg5qk" Apr 22 18:20:43.477295 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.476504 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/94e54bd0-5757-4951-b6b5-0ae58070a297-os-release\") pod \"multus-additional-cni-plugins-fr8p8\" (UID: \"94e54bd0-5757-4951-b6b5-0ae58070a297\") " pod="openshift-multus/multus-additional-cni-plugins-fr8p8" Apr 22 18:20:43.477295 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.476545 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7snq\" (UniqueName: \"kubernetes.io/projected/bf66aaa8-f33b-4fa8-8d81-ede023dd01d9-kube-api-access-d7snq\") pod \"node-ca-ll75d\" (UID: \"bf66aaa8-f33b-4fa8-8d81-ede023dd01d9\") " pod="openshift-image-registry/node-ca-ll75d" Apr 22 18:20:43.480545 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.480525 
2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 18:20:43.507293 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.507275 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-2mqdg" Apr 22 18:20:43.515980 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.515958 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-2mqdg" Apr 22 18:20:43.568266 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.568244 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 18:20:43.576715 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.576694 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-multus-socket-dir-parent\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh" Apr 22 18:20:43.576818 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.576721 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-host-var-lib-cni-multus\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh" Apr 22 18:20:43.576818 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.576737 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-etc-kubernetes\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh" Apr 22 18:20:43.576818 
ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.576755 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k86db\" (UniqueName: \"kubernetes.io/projected/bc302022-8ada-4b98-9ede-97c644115fcc-kube-api-access-k86db\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh" Apr 22 18:20:43.576818 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.576769 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-lib-modules\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t" Apr 22 18:20:43.577026 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.576821 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-etc-kubernetes\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh" Apr 22 18:20:43.577026 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.576829 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-multus-socket-dir-parent\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh" Apr 22 18:20:43.577026 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.576912 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-host-var-lib-cni-multus\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh" Apr 22 18:20:43.577026 ip-10-0-128-248 kubenswrapper[2578]: I0422 
18:20:43.576947 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5fecef3b-c5b2-49d1-8d71-e96e5433fb1a-socket-dir\") pod \"aws-ebs-csi-driver-node-pg5qk\" (UID: \"5fecef3b-c5b2-49d1-8d71-e96e5433fb1a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg5qk" Apr 22 18:20:43.577026 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.576961 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-lib-modules\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t" Apr 22 18:20:43.577026 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577015 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8468d537-923d-4e8a-a4bc-66ddc1d8ca1d-hosts-file\") pod \"node-resolver-q979r\" (UID: \"8468d537-923d-4e8a-a4bc-66ddc1d8ca1d\") " pod="openshift-dns/node-resolver-q979r" Apr 22 18:20:43.577257 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577041 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-cnibin\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh" Apr 22 18:20:43.577257 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577066 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-hostroot\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh" Apr 22 18:20:43.577257 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577071 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8468d537-923d-4e8a-a4bc-66ddc1d8ca1d-hosts-file\") pod \"node-resolver-q979r\" (UID: \"8468d537-923d-4e8a-a4bc-66ddc1d8ca1d\") " pod="openshift-dns/node-resolver-q979r" Apr 22 18:20:43.577257 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577078 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5fecef3b-c5b2-49d1-8d71-e96e5433fb1a-socket-dir\") pod \"aws-ebs-csi-driver-node-pg5qk\" (UID: \"5fecef3b-c5b2-49d1-8d71-e96e5433fb1a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg5qk" Apr 22 18:20:43.577257 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577114 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-log-socket\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4" Apr 22 18:20:43.577257 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577127 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-cnibin\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh" Apr 22 18:20:43.577257 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577142 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4" Apr 22 18:20:43.577257 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577177 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-hostroot\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh" Apr 22 18:20:43.577257 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577185 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-tmp\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t" Apr 22 18:20:43.577257 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577184 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-log-socket\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4" Apr 22 18:20:43.577257 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577238 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4" Apr 22 18:20:43.577639 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577256 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-etc-openvswitch\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4" Apr 22 18:20:43.577639 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577299 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-etc-sysconfig\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t" Apr 22 18:20:43.577639 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577333 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-etc-openvswitch\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4" Apr 22 18:20:43.577639 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577335 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-etc-kubernetes\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t" Apr 22 18:20:43.577639 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577362 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtrtg\" (UniqueName: \"kubernetes.io/projected/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-kube-api-access-wtrtg\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t" Apr 22 18:20:43.577639 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577387 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2n564\" (UniqueName: \"kubernetes.io/projected/7d04757d-f880-4389-bdfa-265c70b5d789-kube-api-access-2n564\") pod \"network-metrics-daemon-zx8mc\" (UID: \"7d04757d-f880-4389-bdfa-265c70b5d789\") " pod="openshift-multus/network-metrics-daemon-zx8mc" Apr 22 18:20:43.577639 ip-10-0-128-248 
kubenswrapper[2578]: I0422 18:20:43.577386 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-etc-kubernetes\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t" Apr 22 18:20:43.577639 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577362 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-etc-sysconfig\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t" Apr 22 18:20:43.577639 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577500 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-host-kubelet\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4" Apr 22 18:20:43.577639 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577524 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-etc-tuned\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t" Apr 22 18:20:43.577639 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577542 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5fecef3b-c5b2-49d1-8d71-e96e5433fb1a-sys-fs\") pod \"aws-ebs-csi-driver-node-pg5qk\" (UID: \"5fecef3b-c5b2-49d1-8d71-e96e5433fb1a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg5qk" Apr 22 18:20:43.577639 ip-10-0-128-248 
kubenswrapper[2578]: I0422 18:20:43.577565 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/94e54bd0-5757-4951-b6b5-0ae58070a297-os-release\") pod \"multus-additional-cni-plugins-fr8p8\" (UID: \"94e54bd0-5757-4951-b6b5-0ae58070a297\") " pod="openshift-multus/multus-additional-cni-plugins-fr8p8" Apr 22 18:20:43.577639 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577590 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 18:20:43.577639 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577624 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-host-kubelet\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4" Apr 22 18:20:43.578292 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577690 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5fecef3b-c5b2-49d1-8d71-e96e5433fb1a-sys-fs\") pod \"aws-ebs-csi-driver-node-pg5qk\" (UID: \"5fecef3b-c5b2-49d1-8d71-e96e5433fb1a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg5qk" Apr 22 18:20:43.578292 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577710 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/94e54bd0-5757-4951-b6b5-0ae58070a297-os-release\") pod \"multus-additional-cni-plugins-fr8p8\" (UID: \"94e54bd0-5757-4951-b6b5-0ae58070a297\") " pod="openshift-multus/multus-additional-cni-plugins-fr8p8" Apr 22 18:20:43.578292 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577593 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-d7snq\" (UniqueName: \"kubernetes.io/projected/bf66aaa8-f33b-4fa8-8d81-ede023dd01d9-kube-api-access-d7snq\") pod \"node-ca-ll75d\" (UID: \"bf66aaa8-f33b-4fa8-8d81-ede023dd01d9\") " pod="openshift-image-registry/node-ca-ll75d" Apr 22 18:20:43.578292 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577762 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-host-var-lib-kubelet\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh" Apr 22 18:20:43.578292 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577786 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-host-run-ovn-kubernetes\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4" Apr 22 18:20:43.578292 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577855 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-sys\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t" Apr 22 18:20:43.578292 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577815 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-host-var-lib-kubelet\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh" Apr 22 18:20:43.578292 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577883 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-var-lib-kubelet\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t" Apr 22 18:20:43.578292 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577885 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-host-run-ovn-kubernetes\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4" Apr 22 18:20:43.578292 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577906 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/94e54bd0-5757-4951-b6b5-0ae58070a297-cnibin\") pod \"multus-additional-cni-plugins-fr8p8\" (UID: \"94e54bd0-5757-4951-b6b5-0ae58070a297\") " pod="openshift-multus/multus-additional-cni-plugins-fr8p8" Apr 22 18:20:43.578292 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577935 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-os-release\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh" Apr 22 18:20:43.578292 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577961 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gsv49\" (UniqueName: \"kubernetes.io/projected/65f22fed-3bb7-44e7-912c-3b5733a79f43-kube-api-access-gsv49\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4" Apr 22 18:20:43.578292 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577966 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-var-lib-kubelet\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t"
Apr 22 18:20:43.578292 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577955 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-sys\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t"
Apr 22 18:20:43.578292 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.577990 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-host\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t"
Apr 22 18:20:43.578292 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.578034 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d04757d-f880-4389-bdfa-265c70b5d789-metrics-certs\") pod \"network-metrics-daemon-zx8mc\" (UID: \"7d04757d-f880-4389-bdfa-265c70b5d789\") " pod="openshift-multus/network-metrics-daemon-zx8mc"
Apr 22 18:20:43.578292 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.578055 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-os-release\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh"
Apr 22 18:20:43.578292 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.578092 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-host\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t"
Apr 22 18:20:43.579166 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.578104 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/94e54bd0-5757-4951-b6b5-0ae58070a297-cnibin\") pod \"multus-additional-cni-plugins-fr8p8\" (UID: \"94e54bd0-5757-4951-b6b5-0ae58070a297\") " pod="openshift-multus/multus-additional-cni-plugins-fr8p8"
Apr 22 18:20:43.579166 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.578084 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4lkn7\" (UniqueName: \"kubernetes.io/projected/5fecef3b-c5b2-49d1-8d71-e96e5433fb1a-kube-api-access-4lkn7\") pod \"aws-ebs-csi-driver-node-pg5qk\" (UID: \"5fecef3b-c5b2-49d1-8d71-e96e5433fb1a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg5qk"
Apr 22 18:20:43.579166 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.578144 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/94e54bd0-5757-4951-b6b5-0ae58070a297-system-cni-dir\") pod \"multus-additional-cni-plugins-fr8p8\" (UID: \"94e54bd0-5757-4951-b6b5-0ae58070a297\") " pod="openshift-multus/multus-additional-cni-plugins-fr8p8"
Apr 22 18:20:43.579166 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.578173 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m9frj\" (UniqueName: \"kubernetes.io/projected/94e54bd0-5757-4951-b6b5-0ae58070a297-kube-api-access-m9frj\") pod \"multus-additional-cni-plugins-fr8p8\" (UID: \"94e54bd0-5757-4951-b6b5-0ae58070a297\") " pod="openshift-multus/multus-additional-cni-plugins-fr8p8"
Apr 22 18:20:43.579166 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.578197 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/4acd2902-b094-4f0a-a902-526fdb5ba7de-agent-certs\") pod \"konnectivity-agent-jgjjg\" (UID: \"4acd2902-b094-4f0a-a902-526fdb5ba7de\") " pod="kube-system/konnectivity-agent-jgjjg"
Apr 22 18:20:43.579166 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:43.578242 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:20:43.579166 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.578250 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-host-cni-bin\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.579166 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.578297 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/65f22fed-3bb7-44e7-912c-3b5733a79f43-ovnkube-config\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.579166 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.578321 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/4acd2902-b094-4f0a-a902-526fdb5ba7de-konnectivity-ca\") pod \"konnectivity-agent-jgjjg\" (UID: \"4acd2902-b094-4f0a-a902-526fdb5ba7de\") " pod="kube-system/konnectivity-agent-jgjjg"
Apr 22 18:20:43.579166 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.578352 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-multus-conf-dir\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh"
Apr 22 18:20:43.579166 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.578418 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bc302022-8ada-4b98-9ede-97c644115fcc-multus-daemon-config\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh"
Apr 22 18:20:43.579166 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:43.578494 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d04757d-f880-4389-bdfa-265c70b5d789-metrics-certs podName:7d04757d-f880-4389-bdfa-265c70b5d789 nodeName:}" failed. No retries permitted until 2026-04-22 18:20:44.078428015 +0000 UTC m=+2.113302121 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7d04757d-f880-4389-bdfa-265c70b5d789-metrics-certs") pod "network-metrics-daemon-zx8mc" (UID: "7d04757d-f880-4389-bdfa-265c70b5d789") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:20:43.579166 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.578589 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/94e54bd0-5757-4951-b6b5-0ae58070a297-system-cni-dir\") pod \"multus-additional-cni-plugins-fr8p8\" (UID: \"94e54bd0-5757-4951-b6b5-0ae58070a297\") " pod="openshift-multus/multus-additional-cni-plugins-fr8p8"
Apr 22 18:20:43.579166 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.578763 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-host-cni-bin\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.579832 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.579280 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-multus-conf-dir\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh"
Apr 22 18:20:43.579832 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.579416 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/4acd2902-b094-4f0a-a902-526fdb5ba7de-konnectivity-ca\") pod \"konnectivity-agent-jgjjg\" (UID: \"4acd2902-b094-4f0a-a902-526fdb5ba7de\") " pod="kube-system/konnectivity-agent-jgjjg"
Apr 22 18:20:43.579832 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.579546 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/65f22fed-3bb7-44e7-912c-3b5733a79f43-env-overrides\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.579832 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.579576 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/65f22fed-3bb7-44e7-912c-3b5733a79f43-ovn-node-metrics-cert\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.579832 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.579698 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-etc-sysctl-d\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t"
Apr 22 18:20:43.579832 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.579724 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fcb4v\" (UniqueName: \"kubernetes.io/projected/9815cd85-4c2f-43b2-97a8-f65c3f26db10-kube-api-access-fcb4v\") pod \"iptables-alerter-v7r7x\" (UID: \"9815cd85-4c2f-43b2-97a8-f65c3f26db10\") " pod="openshift-network-operator/iptables-alerter-v7r7x"
Apr 22 18:20:43.579832 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.579759 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/94e54bd0-5757-4951-b6b5-0ae58070a297-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fr8p8\" (UID: \"94e54bd0-5757-4951-b6b5-0ae58070a297\") " pod="openshift-multus/multus-additional-cni-plugins-fr8p8"
Apr 22 18:20:43.579832 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.579804 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-system-cni-dir\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh"
Apr 22 18:20:43.580205 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.579838 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-host-run-k8s-cni-cncf-io\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh"
Apr 22 18:20:43.580205 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.579871 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-host-run-netns\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh"
Apr 22 18:20:43.580205 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.579906 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-etc-sysctl-d\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t"
Apr 22 18:20:43.580205 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.579956 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-node-log\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.580205 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.580101 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/94e54bd0-5757-4951-b6b5-0ae58070a297-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fr8p8\" (UID: \"94e54bd0-5757-4951-b6b5-0ae58070a297\") " pod="openshift-multus/multus-additional-cni-plugins-fr8p8"
Apr 22 18:20:43.580205 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.580125 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-host-run-multus-certs\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh"
Apr 22 18:20:43.580205 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.580149 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-run\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t"
Apr 22 18:20:43.580205 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.580171 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-run-systemd\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.580751 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.580252 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-host-run-netns\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh"
Apr 22 18:20:43.580751 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.580254 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-etc-modprobe-d\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t"
Apr 22 18:20:43.580751 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.580297 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/65f22fed-3bb7-44e7-912c-3b5733a79f43-env-overrides\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.581516 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.580992 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-etc-modprobe-d\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t"
Apr 22 18:20:43.581516 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.581034 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-system-cni-dir\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh"
Apr 22 18:20:43.581516 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.581034 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-run\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t"
Apr 22 18:20:43.581516 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.581134 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/94e54bd0-5757-4951-b6b5-0ae58070a297-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fr8p8\" (UID: \"94e54bd0-5757-4951-b6b5-0ae58070a297\") " pod="openshift-multus/multus-additional-cni-plugins-fr8p8"
Apr 22 18:20:43.581516 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.581146 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-run-systemd\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.581516 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.581211 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-node-log\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.581516 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.581307 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-host-run-k8s-cni-cncf-io\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh"
Apr 22 18:20:43.581516 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.581325 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/65f22fed-3bb7-44e7-912c-3b5733a79f43-ovnkube-config\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.581516 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.581461 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bc302022-8ada-4b98-9ede-97c644115fcc-multus-daemon-config\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh"
Apr 22 18:20:43.581516 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.581510 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-etc-tuned\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t"
Apr 22 18:20:43.582042 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.581597 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-tmp\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t"
Apr 22 18:20:43.582042 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.581660 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5fecef3b-c5b2-49d1-8d71-e96e5433fb1a-device-dir\") pod \"aws-ebs-csi-driver-node-pg5qk\" (UID: \"5fecef3b-c5b2-49d1-8d71-e96e5433fb1a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg5qk"
Apr 22 18:20:43.582042 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.581697 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-multus-cni-dir\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh"
Apr 22 18:20:43.582042 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.581727 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-host-slash\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.582042 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.581757 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/65f22fed-3bb7-44e7-912c-3b5733a79f43-ovnkube-script-lib\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.582042 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.581764 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-host-run-multus-certs\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh"
Apr 22 18:20:43.582042 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.581788 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5fecef3b-c5b2-49d1-8d71-e96e5433fb1a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pg5qk\" (UID: \"5fecef3b-c5b2-49d1-8d71-e96e5433fb1a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg5qk"
Apr 22 18:20:43.582042 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.581789 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5fecef3b-c5b2-49d1-8d71-e96e5433fb1a-device-dir\") pod \"aws-ebs-csi-driver-node-pg5qk\" (UID: \"5fecef3b-c5b2-49d1-8d71-e96e5433fb1a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg5qk"
Apr 22 18:20:43.582042 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.581815 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/94e54bd0-5757-4951-b6b5-0ae58070a297-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fr8p8\" (UID: \"94e54bd0-5757-4951-b6b5-0ae58070a297\") " pod="openshift-multus/multus-additional-cni-plugins-fr8p8"
Apr 22 18:20:43.582042 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.581858 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5fecef3b-c5b2-49d1-8d71-e96e5433fb1a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pg5qk\" (UID: \"5fecef3b-c5b2-49d1-8d71-e96e5433fb1a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg5qk"
Apr 22 18:20:43.582042 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.581921 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-multus-cni-dir\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh"
Apr 22 18:20:43.582042 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.581975 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-host-slash\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.582042 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.582018 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bc302022-8ada-4b98-9ede-97c644115fcc-cni-binary-copy\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh"
Apr 22 18:20:43.582698 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.582220 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5fecef3b-c5b2-49d1-8d71-e96e5433fb1a-registration-dir\") pod \"aws-ebs-csi-driver-node-pg5qk\" (UID: \"5fecef3b-c5b2-49d1-8d71-e96e5433fb1a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg5qk"
Apr 22 18:20:43.582698 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.582273 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/4acd2902-b094-4f0a-a902-526fdb5ba7de-agent-certs\") pod \"konnectivity-agent-jgjjg\" (UID: \"4acd2902-b094-4f0a-a902-526fdb5ba7de\") " pod="kube-system/konnectivity-agent-jgjjg"
Apr 22 18:20:43.582698 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.582291 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5fecef3b-c5b2-49d1-8d71-e96e5433fb1a-etc-selinux\") pod \"aws-ebs-csi-driver-node-pg5qk\" (UID: \"5fecef3b-c5b2-49d1-8d71-e96e5433fb1a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg5qk"
Apr 22 18:20:43.582698 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.582600 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8468d537-923d-4e8a-a4bc-66ddc1d8ca1d-tmp-dir\") pod \"node-resolver-q979r\" (UID: \"8468d537-923d-4e8a-a4bc-66ddc1d8ca1d\") " pod="openshift-dns/node-resolver-q979r"
Apr 22 18:20:43.582698 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.582674 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/94e54bd0-5757-4951-b6b5-0ae58070a297-cni-binary-copy\") pod \"multus-additional-cni-plugins-fr8p8\" (UID: \"94e54bd0-5757-4951-b6b5-0ae58070a297\") " pod="openshift-multus/multus-additional-cni-plugins-fr8p8"
Apr 22 18:20:43.582950 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.582819 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bc302022-8ada-4b98-9ede-97c644115fcc-cni-binary-copy\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh"
Apr 22 18:20:43.582950 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.582891 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5fecef3b-c5b2-49d1-8d71-e96e5433fb1a-registration-dir\") pod \"aws-ebs-csi-driver-node-pg5qk\" (UID: \"5fecef3b-c5b2-49d1-8d71-e96e5433fb1a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg5qk"
Apr 22 18:20:43.582950 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.582932 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/65f22fed-3bb7-44e7-912c-3b5733a79f43-ovnkube-script-lib\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.582950 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.582946 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/94e54bd0-5757-4951-b6b5-0ae58070a297-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fr8p8\" (UID: \"94e54bd0-5757-4951-b6b5-0ae58070a297\") " pod="openshift-multus/multus-additional-cni-plugins-fr8p8"
Apr 22 18:20:43.583158 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.582985 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-var-lib-openvswitch\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.583158 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.583071 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8468d537-923d-4e8a-a4bc-66ddc1d8ca1d-tmp-dir\") pod \"node-resolver-q979r\" (UID: \"8468d537-923d-4e8a-a4bc-66ddc1d8ca1d\") " pod="openshift-dns/node-resolver-q979r"
Apr 22 18:20:43.583295 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.583178 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-etc-sysctl-conf\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t"
Apr 22 18:20:43.583295 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.583253 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-etc-systemd\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t"
Apr 22 18:20:43.583401 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.583306 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-host-var-lib-cni-bin\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh"
Apr 22 18:20:43.583455 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.583418 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lc8p7\" (UniqueName: \"kubernetes.io/projected/5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81-kube-api-access-lc8p7\") pod \"network-check-target-jftnj\" (UID: \"5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81\") " pod="openshift-network-diagnostics/network-check-target-jftnj"
Apr 22 18:20:43.583550 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.583530 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/94e54bd0-5757-4951-b6b5-0ae58070a297-cni-binary-copy\") pod \"multus-additional-cni-plugins-fr8p8\" (UID: \"94e54bd0-5757-4951-b6b5-0ae58070a297\") " pod="openshift-multus/multus-additional-cni-plugins-fr8p8"
Apr 22 18:20:43.584426 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.583533 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-systemd-units\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.584426 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.583591 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5fecef3b-c5b2-49d1-8d71-e96e5433fb1a-etc-selinux\") pod \"aws-ebs-csi-driver-node-pg5qk\" (UID: \"5fecef3b-c5b2-49d1-8d71-e96e5433fb1a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg5qk"
Apr 22 18:20:43.584426 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.583604 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-host-run-netns\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.584426 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.583572 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-systemd-units\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.584426 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.583659 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-host-run-netns\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.584426 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.583700 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-run-openvswitch\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.584426 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.583739 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-etc-sysctl-conf\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t"
Apr 22 18:20:43.584426 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.583743 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9815cd85-4c2f-43b2-97a8-f65c3f26db10-host-slash\") pod \"iptables-alerter-v7r7x\" (UID: \"9815cd85-4c2f-43b2-97a8-f65c3f26db10\") " pod="openshift-network-operator/iptables-alerter-v7r7x"
Apr 22 18:20:43.584426 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.583780 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wm2zl\" (UniqueName: \"kubernetes.io/projected/8468d537-923d-4e8a-a4bc-66ddc1d8ca1d-kube-api-access-wm2zl\") pod \"node-resolver-q979r\" (UID: \"8468d537-923d-4e8a-a4bc-66ddc1d8ca1d\") " pod="openshift-dns/node-resolver-q979r"
Apr 22 18:20:43.584426 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.583782 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-etc-systemd\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t"
Apr 22 18:20:43.584426 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.584370 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/94e54bd0-5757-4951-b6b5-0ae58070a297-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fr8p8\" (UID: \"94e54bd0-5757-4951-b6b5-0ae58070a297\") " pod="openshift-multus/multus-additional-cni-plugins-fr8p8"
Apr 22 18:20:43.584994 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.584610 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/65f22fed-3bb7-44e7-912c-3b5733a79f43-ovn-node-metrics-cert\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.584994 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.584700 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf66aaa8-f33b-4fa8-8d81-ede023dd01d9-host\") pod \"node-ca-ll75d\" (UID: \"bf66aaa8-f33b-4fa8-8d81-ede023dd01d9\") " pod="openshift-image-registry/node-ca-ll75d"
Apr 22 18:20:43.584994 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.584743 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-run-ovn\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.584994 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.584779 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-host-cni-netd\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.584994 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.584809 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9815cd85-4c2f-43b2-97a8-f65c3f26db10-iptables-alerter-script\") pod \"iptables-alerter-v7r7x\" (UID: \"9815cd85-4c2f-43b2-97a8-f65c3f26db10\") " pod="openshift-network-operator/iptables-alerter-v7r7x"
Apr 22 18:20:43.584994 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.584843 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bf66aaa8-f33b-4fa8-8d81-ede023dd01d9-serviceca\") pod \"node-ca-ll75d\" (UID: \"bf66aaa8-f33b-4fa8-8d81-ede023dd01d9\") " pod="openshift-image-registry/node-ca-ll75d"
Apr 22 18:20:43.584994 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.584952 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf66aaa8-f33b-4fa8-8d81-ede023dd01d9-host\") pod \"node-ca-ll75d\" (UID: \"bf66aaa8-f33b-4fa8-8d81-ede023dd01d9\") " pod="openshift-image-registry/node-ca-ll75d"
Apr 22 18:20:43.585355 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.584998 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-run-ovn\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.585355 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.584944 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-run-openvswitch\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.585355 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.585044 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-host-cni-netd\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.585355 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.585086 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9815cd85-4c2f-43b2-97a8-f65c3f26db10-host-slash\") pod \"iptables-alerter-v7r7x\" (UID: \"9815cd85-4c2f-43b2-97a8-f65c3f26db10\") " pod="openshift-network-operator/iptables-alerter-v7r7x"
Apr 22 18:20:43.585355 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.585137 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bc302022-8ada-4b98-9ede-97c644115fcc-host-var-lib-cni-bin\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh"
Apr 22 18:20:43.585355 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.585191 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/65f22fed-3bb7-44e7-912c-3b5733a79f43-var-lib-openvswitch\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:20:43.585639 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.585561 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bf66aaa8-f33b-4fa8-8d81-ede023dd01d9-serviceca\") pod \"node-ca-ll75d\" (UID: \"bf66aaa8-f33b-4fa8-8d81-ede023dd01d9\") " pod="openshift-image-registry/node-ca-ll75d"
Apr 22 18:20:43.586581 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.586160 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/9815cd85-4c2f-43b2-97a8-f65c3f26db10-iptables-alerter-script\") pod \"iptables-alerter-v7r7x\" (UID: \"9815cd85-4c2f-43b2-97a8-f65c3f26db10\") " pod="openshift-network-operator/iptables-alerter-v7r7x" Apr 22 18:20:43.586932 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.586907 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtrtg\" (UniqueName: \"kubernetes.io/projected/41b3e8f6-4f69-4847-9f45-4c588f6bfd45-kube-api-access-wtrtg\") pod \"tuned-9995t\" (UID: \"41b3e8f6-4f69-4847-9f45-4c588f6bfd45\") " pod="openshift-cluster-node-tuning-operator/tuned-9995t" Apr 22 18:20:43.587368 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.587312 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7snq\" (UniqueName: \"kubernetes.io/projected/bf66aaa8-f33b-4fa8-8d81-ede023dd01d9-kube-api-access-d7snq\") pod \"node-ca-ll75d\" (UID: \"bf66aaa8-f33b-4fa8-8d81-ede023dd01d9\") " pod="openshift-image-registry/node-ca-ll75d" Apr 22 18:20:43.588011 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.587988 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k86db\" (UniqueName: \"kubernetes.io/projected/bc302022-8ada-4b98-9ede-97c644115fcc-kube-api-access-k86db\") pod \"multus-dthlh\" (UID: \"bc302022-8ada-4b98-9ede-97c644115fcc\") " pod="openshift-multus/multus-dthlh" Apr 22 18:20:43.588194 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.588170 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n564\" (UniqueName: \"kubernetes.io/projected/7d04757d-f880-4389-bdfa-265c70b5d789-kube-api-access-2n564\") pod \"network-metrics-daemon-zx8mc\" (UID: \"7d04757d-f880-4389-bdfa-265c70b5d789\") " pod="openshift-multus/network-metrics-daemon-zx8mc" Apr 22 18:20:43.589534 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.589505 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4lkn7\" (UniqueName: \"kubernetes.io/projected/5fecef3b-c5b2-49d1-8d71-e96e5433fb1a-kube-api-access-4lkn7\") pod \"aws-ebs-csi-driver-node-pg5qk\" (UID: \"5fecef3b-c5b2-49d1-8d71-e96e5433fb1a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg5qk" Apr 22 18:20:43.589811 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.589793 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsv49\" (UniqueName: \"kubernetes.io/projected/65f22fed-3bb7-44e7-912c-3b5733a79f43-kube-api-access-gsv49\") pod \"ovnkube-node-fnkv4\" (UID: \"65f22fed-3bb7-44e7-912c-3b5733a79f43\") " pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4" Apr 22 18:20:43.590379 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.590358 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9frj\" (UniqueName: \"kubernetes.io/projected/94e54bd0-5757-4951-b6b5-0ae58070a297-kube-api-access-m9frj\") pod \"multus-additional-cni-plugins-fr8p8\" (UID: \"94e54bd0-5757-4951-b6b5-0ae58070a297\") " pod="openshift-multus/multus-additional-cni-plugins-fr8p8" Apr 22 18:20:43.590745 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:43.590728 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:20:43.590745 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:43.590747 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:20:43.590869 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:43.590756 2578 projected.go:194] Error preparing data for projected volume kube-api-access-lc8p7 for pod openshift-network-diagnostics/network-check-target-jftnj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:20:43.590869 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:43.590810 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81-kube-api-access-lc8p7 podName:5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81 nodeName:}" failed. No retries permitted until 2026-04-22 18:20:44.090794523 +0000 UTC m=+2.125668630 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-lc8p7" (UniqueName: "kubernetes.io/projected/5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81-kube-api-access-lc8p7") pod "network-check-target-jftnj" (UID: "5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:20:43.591668 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.591636 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcb4v\" (UniqueName: \"kubernetes.io/projected/9815cd85-4c2f-43b2-97a8-f65c3f26db10-kube-api-access-fcb4v\") pod \"iptables-alerter-v7r7x\" (UID: \"9815cd85-4c2f-43b2-97a8-f65c3f26db10\") " pod="openshift-network-operator/iptables-alerter-v7r7x" Apr 22 18:20:43.593857 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.593834 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm2zl\" (UniqueName: \"kubernetes.io/projected/8468d537-923d-4e8a-a4bc-66ddc1d8ca1d-kube-api-access-wm2zl\") pod \"node-resolver-q979r\" (UID: \"8468d537-923d-4e8a-a4bc-66ddc1d8ca1d\") " pod="openshift-dns/node-resolver-q979r" Apr 22 18:20:43.624315 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.624261 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-jgjjg" Apr 22 18:20:43.630729 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.630708 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ll75d" Apr 22 18:20:43.693614 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:43.693573 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd579a59ca2c3bf4b3f744c41961ff1e1.slice/crio-f718d6565149a1b398fee85b6c587fa851c3181431dcaa954db709023d33e0eb WatchSource:0}: Error finding container f718d6565149a1b398fee85b6c587fa851c3181431dcaa954db709023d33e0eb: Status 404 returned error can't find the container with id f718d6565149a1b398fee85b6c587fa851c3181431dcaa954db709023d33e0eb Apr 22 18:20:43.693889 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:43.693870 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod933ffdfca6e87b798592801ce6979396.slice/crio-42e1a693da6986ba54528eb0e5ca0bf548ff4f2d427bc357c16a9cfddb390799 WatchSource:0}: Error finding container 42e1a693da6986ba54528eb0e5ca0bf548ff4f2d427bc357c16a9cfddb390799: Status 404 returned error can't find the container with id 42e1a693da6986ba54528eb0e5ca0bf548ff4f2d427bc357c16a9cfddb390799 Apr 22 18:20:43.699135 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.699118 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:20:43.780913 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.780881 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-dthlh" Apr 22 18:20:43.787080 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:43.787053 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc302022_8ada_4b98_9ede_97c644115fcc.slice/crio-e74fbfde287f43590332e30105547d61b2d072e30e96594a86731d4f1d42b86d WatchSource:0}: Error finding container e74fbfde287f43590332e30105547d61b2d072e30e96594a86731d4f1d42b86d: Status 404 returned error can't find the container with id e74fbfde287f43590332e30105547d61b2d072e30e96594a86731d4f1d42b86d Apr 22 18:20:43.802324 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.802305 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4" Apr 22 18:20:43.808617 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:43.808591 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65f22fed_3bb7_44e7_912c_3b5733a79f43.slice/crio-1acdbf1e55ae95d63b38f92340d0c4adf68bb7f329ca45ced6dcd4c741af33cf WatchSource:0}: Error finding container 1acdbf1e55ae95d63b38f92340d0c4adf68bb7f329ca45ced6dcd4c741af33cf: Status 404 returned error can't find the container with id 1acdbf1e55ae95d63b38f92340d0c4adf68bb7f329ca45ced6dcd4c741af33cf Apr 22 18:20:43.819132 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.819112 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg5qk" Apr 22 18:20:43.825099 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:43.825069 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fecef3b_c5b2_49d1_8d71_e96e5433fb1a.slice/crio-db120ccd004014e0c9bce5dd59b3e156bef6a5caed0395a051fc80306c0740ff WatchSource:0}: Error finding container db120ccd004014e0c9bce5dd59b3e156bef6a5caed0395a051fc80306c0740ff: Status 404 returned error can't find the container with id db120ccd004014e0c9bce5dd59b3e156bef6a5caed0395a051fc80306c0740ff Apr 22 18:20:43.839997 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.839971 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9995t" Apr 22 18:20:43.846411 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:43.846381 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41b3e8f6_4f69_4847_9f45_4c588f6bfd45.slice/crio-f18342e3079ed1061ad7f429ec08c4c358121cfd915a8f5db3d69439f798d582 WatchSource:0}: Error finding container f18342e3079ed1061ad7f429ec08c4c358121cfd915a8f5db3d69439f798d582: Status 404 returned error can't find the container with id f18342e3079ed1061ad7f429ec08c4c358121cfd915a8f5db3d69439f798d582 Apr 22 18:20:43.857183 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.857166 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-q979r" Apr 22 18:20:43.863028 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.863012 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fr8p8" Apr 22 18:20:43.865376 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:43.865350 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8468d537_923d_4e8a_a4bc_66ddc1d8ca1d.slice/crio-28ffe166efb8c9777ed0323496809193ee4d436d749938bcbff8ca87dec3c186 WatchSource:0}: Error finding container 28ffe166efb8c9777ed0323496809193ee4d436d749938bcbff8ca87dec3c186: Status 404 returned error can't find the container with id 28ffe166efb8c9777ed0323496809193ee4d436d749938bcbff8ca87dec3c186 Apr 22 18:20:43.869056 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:43.869035 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94e54bd0_5757_4951_b6b5_0ae58070a297.slice/crio-5ee9cb8018b06759ef67ebdfd16205b2e9636e6fd3fffc6f518e37fdcc6f16f9 WatchSource:0}: Error finding container 5ee9cb8018b06759ef67ebdfd16205b2e9636e6fd3fffc6f518e37fdcc6f16f9: Status 404 returned error can't find the container with id 5ee9cb8018b06759ef67ebdfd16205b2e9636e6fd3fffc6f518e37fdcc6f16f9 Apr 22 18:20:43.890656 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:43.890602 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-v7r7x" Apr 22 18:20:43.896764 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:43.896743 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9815cd85_4c2f_43b2_97a8_f65c3f26db10.slice/crio-16bdc362a35dc58e59048d7b5578e2d548b1e950bfb5dd2dcb5916d5d0f5a1f9 WatchSource:0}: Error finding container 16bdc362a35dc58e59048d7b5578e2d548b1e950bfb5dd2dcb5916d5d0f5a1f9: Status 404 returned error can't find the container with id 16bdc362a35dc58e59048d7b5578e2d548b1e950bfb5dd2dcb5916d5d0f5a1f9 Apr 22 18:20:43.965677 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:43.965657 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf66aaa8_f33b_4fa8_8d81_ede023dd01d9.slice/crio-85d7ea094e9d4c221737d7344385273a1a120f04ac952ec703117649ead42104 WatchSource:0}: Error finding container 85d7ea094e9d4c221737d7344385273a1a120f04ac952ec703117649ead42104: Status 404 returned error can't find the container with id 85d7ea094e9d4c221737d7344385273a1a120f04ac952ec703117649ead42104 Apr 22 18:20:44.089027 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:44.088987 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d04757d-f880-4389-bdfa-265c70b5d789-metrics-certs\") pod \"network-metrics-daemon-zx8mc\" (UID: \"7d04757d-f880-4389-bdfa-265c70b5d789\") " pod="openshift-multus/network-metrics-daemon-zx8mc" Apr 22 18:20:44.089186 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:44.089136 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:20:44.089266 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:44.089207 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/7d04757d-f880-4389-bdfa-265c70b5d789-metrics-certs podName:7d04757d-f880-4389-bdfa-265c70b5d789 nodeName:}" failed. No retries permitted until 2026-04-22 18:20:45.089187943 +0000 UTC m=+3.124062034 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7d04757d-f880-4389-bdfa-265c70b5d789-metrics-certs") pod "network-metrics-daemon-zx8mc" (UID: "7d04757d-f880-4389-bdfa-265c70b5d789") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:20:44.189666 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:44.189538 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lc8p7\" (UniqueName: \"kubernetes.io/projected/5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81-kube-api-access-lc8p7\") pod \"network-check-target-jftnj\" (UID: \"5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81\") " pod="openshift-network-diagnostics/network-check-target-jftnj" Apr 22 18:20:44.189821 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:44.189710 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:20:44.189821 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:44.189733 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:20:44.189821 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:44.189746 2578 projected.go:194] Error preparing data for projected volume kube-api-access-lc8p7 for pod openshift-network-diagnostics/network-check-target-jftnj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:20:44.189982 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:44.189809 2578 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81-kube-api-access-lc8p7 podName:5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81 nodeName:}" failed. No retries permitted until 2026-04-22 18:20:45.189784685 +0000 UTC m=+3.224658796 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-lc8p7" (UniqueName: "kubernetes.io/projected/5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81-kube-api-access-lc8p7") pod "network-check-target-jftnj" (UID: "5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:20:44.306613 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:20:44.306407 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4acd2902_b094_4f0a_a902_526fdb5ba7de.slice/crio-6e917eb308bc9588fc5a73ed78849184af1d34ce244c45232c45acad50e6c0a7 WatchSource:0}: Error finding container 6e917eb308bc9588fc5a73ed78849184af1d34ce244c45232c45acad50e6c0a7: Status 404 returned error can't find the container with id 6e917eb308bc9588fc5a73ed78849184af1d34ce244c45232c45acad50e6c0a7 Apr 22 18:20:44.389560 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:44.389010 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:20:44.392349 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:44.392317 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:20:44.518185 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:44.518089 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:15:43 +0000 UTC" deadline="2027-09-16 21:56:22.515343408 +0000 UTC" Apr 22 18:20:44.518185 
ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:44.518139 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12291h35m37.99721024s" Apr 22 18:20:44.655731 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:44.655664 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-v7r7x" event={"ID":"9815cd85-4c2f-43b2-97a8-f65c3f26db10","Type":"ContainerStarted","Data":"16bdc362a35dc58e59048d7b5578e2d548b1e950bfb5dd2dcb5916d5d0f5a1f9"} Apr 22 18:20:44.663747 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:44.663714 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-q979r" event={"ID":"8468d537-923d-4e8a-a4bc-66ddc1d8ca1d","Type":"ContainerStarted","Data":"28ffe166efb8c9777ed0323496809193ee4d436d749938bcbff8ca87dec3c186"} Apr 22 18:20:44.671712 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:44.671683 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg5qk" event={"ID":"5fecef3b-c5b2-49d1-8d71-e96e5433fb1a","Type":"ContainerStarted","Data":"db120ccd004014e0c9bce5dd59b3e156bef6a5caed0395a051fc80306c0740ff"} Apr 22 18:20:44.680019 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:44.679992 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4" event={"ID":"65f22fed-3bb7-44e7-912c-3b5733a79f43","Type":"ContainerStarted","Data":"1acdbf1e55ae95d63b38f92340d0c4adf68bb7f329ca45ced6dcd4c741af33cf"} Apr 22 18:20:44.695522 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:44.695315 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-248.ec2.internal" event={"ID":"d579a59ca2c3bf4b3f744c41961ff1e1","Type":"ContainerStarted","Data":"f718d6565149a1b398fee85b6c587fa851c3181431dcaa954db709023d33e0eb"} Apr 22 18:20:44.711818 ip-10-0-128-248 kubenswrapper[2578]: I0422 
18:20:44.711780 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-jgjjg" event={"ID":"4acd2902-b094-4f0a-a902-526fdb5ba7de","Type":"ContainerStarted","Data":"6e917eb308bc9588fc5a73ed78849184af1d34ce244c45232c45acad50e6c0a7"} Apr 22 18:20:44.730923 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:44.730889 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ll75d" event={"ID":"bf66aaa8-f33b-4fa8-8d81-ede023dd01d9","Type":"ContainerStarted","Data":"85d7ea094e9d4c221737d7344385273a1a120f04ac952ec703117649ead42104"} Apr 22 18:20:44.733839 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:44.733789 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fr8p8" event={"ID":"94e54bd0-5757-4951-b6b5-0ae58070a297","Type":"ContainerStarted","Data":"5ee9cb8018b06759ef67ebdfd16205b2e9636e6fd3fffc6f518e37fdcc6f16f9"} Apr 22 18:20:44.757189 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:44.757162 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9995t" event={"ID":"41b3e8f6-4f69-4847-9f45-4c588f6bfd45","Type":"ContainerStarted","Data":"f18342e3079ed1061ad7f429ec08c4c358121cfd915a8f5db3d69439f798d582"} Apr 22 18:20:44.764049 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:44.764021 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dthlh" event={"ID":"bc302022-8ada-4b98-9ede-97c644115fcc","Type":"ContainerStarted","Data":"e74fbfde287f43590332e30105547d61b2d072e30e96594a86731d4f1d42b86d"} Apr 22 18:20:44.780309 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:44.780190 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal" event={"ID":"933ffdfca6e87b798592801ce6979396","Type":"ContainerStarted","Data":"42e1a693da6986ba54528eb0e5ca0bf548ff4f2d427bc357c16a9cfddb390799"} Apr 
22 18:20:44.816476 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:44.816447 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:20:45.096639 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:45.096551 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d04757d-f880-4389-bdfa-265c70b5d789-metrics-certs\") pod \"network-metrics-daemon-zx8mc\" (UID: \"7d04757d-f880-4389-bdfa-265c70b5d789\") " pod="openshift-multus/network-metrics-daemon-zx8mc" Apr 22 18:20:45.096849 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:45.096699 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:20:45.096849 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:45.096766 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d04757d-f880-4389-bdfa-265c70b5d789-metrics-certs podName:7d04757d-f880-4389-bdfa-265c70b5d789 nodeName:}" failed. No retries permitted until 2026-04-22 18:20:47.096747581 +0000 UTC m=+5.131621688 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7d04757d-f880-4389-bdfa-265c70b5d789-metrics-certs") pod "network-metrics-daemon-zx8mc" (UID: "7d04757d-f880-4389-bdfa-265c70b5d789") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:20:45.197960 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:45.197920 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lc8p7\" (UniqueName: \"kubernetes.io/projected/5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81-kube-api-access-lc8p7\") pod \"network-check-target-jftnj\" (UID: \"5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81\") " pod="openshift-network-diagnostics/network-check-target-jftnj" Apr 22 18:20:45.205192 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:45.202559 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:20:45.205192 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:45.202593 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:20:45.205192 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:45.202615 2578 projected.go:194] Error preparing data for projected volume kube-api-access-lc8p7 for pod openshift-network-diagnostics/network-check-target-jftnj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:20:45.205192 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:45.202692 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81-kube-api-access-lc8p7 podName:5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:20:47.202663879 +0000 UTC m=+5.237537983 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-lc8p7" (UniqueName: "kubernetes.io/projected/5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81-kube-api-access-lc8p7") pod "network-check-target-jftnj" (UID: "5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:20:45.519403 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:45.519279 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:15:43 +0000 UTC" deadline="2027-11-30 23:02:07.471002455 +0000 UTC" Apr 22 18:20:45.519403 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:45.519321 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14092h41m21.951685741s" Apr 22 18:20:45.633450 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:45.633419 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zx8mc" Apr 22 18:20:45.633694 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:45.633591 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zx8mc" podUID="7d04757d-f880-4389-bdfa-265c70b5d789" Apr 22 18:20:45.633794 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:45.633772 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jftnj" Apr 22 18:20:45.633874 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:45.633855 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jftnj" podUID="5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81" Apr 22 18:20:47.114211 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:47.114168 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d04757d-f880-4389-bdfa-265c70b5d789-metrics-certs\") pod \"network-metrics-daemon-zx8mc\" (UID: \"7d04757d-f880-4389-bdfa-265c70b5d789\") " pod="openshift-multus/network-metrics-daemon-zx8mc" Apr 22 18:20:47.114670 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:47.114343 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:20:47.114670 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:47.114397 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d04757d-f880-4389-bdfa-265c70b5d789-metrics-certs podName:7d04757d-f880-4389-bdfa-265c70b5d789 nodeName:}" failed. No retries permitted until 2026-04-22 18:20:51.114381883 +0000 UTC m=+9.149255970 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7d04757d-f880-4389-bdfa-265c70b5d789-metrics-certs") pod "network-metrics-daemon-zx8mc" (UID: "7d04757d-f880-4389-bdfa-265c70b5d789") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:20:47.215570 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:47.215499 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lc8p7\" (UniqueName: \"kubernetes.io/projected/5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81-kube-api-access-lc8p7\") pod \"network-check-target-jftnj\" (UID: \"5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81\") " pod="openshift-network-diagnostics/network-check-target-jftnj" Apr 22 18:20:47.215733 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:47.215691 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:20:47.215733 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:47.215718 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:20:47.215733 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:47.215732 2578 projected.go:194] Error preparing data for projected volume kube-api-access-lc8p7 for pod openshift-network-diagnostics/network-check-target-jftnj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:20:47.215881 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:47.215797 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81-kube-api-access-lc8p7 podName:5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:20:51.21577593 +0000 UTC m=+9.250650041 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-lc8p7" (UniqueName: "kubernetes.io/projected/5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81-kube-api-access-lc8p7") pod "network-check-target-jftnj" (UID: "5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:20:47.633586 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:47.633556 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zx8mc" Apr 22 18:20:47.633778 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:47.633688 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zx8mc" podUID="7d04757d-f880-4389-bdfa-265c70b5d789" Apr 22 18:20:47.634066 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:47.634045 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jftnj" Apr 22 18:20:47.634160 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:47.634141 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jftnj" podUID="5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81" Apr 22 18:20:49.633590 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:49.633470 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zx8mc" Apr 22 18:20:49.633590 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:49.633510 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jftnj" Apr 22 18:20:49.634091 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:49.633636 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zx8mc" podUID="7d04757d-f880-4389-bdfa-265c70b5d789" Apr 22 18:20:49.634091 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:49.633771 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jftnj" podUID="5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81" Apr 22 18:20:51.148767 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:51.148178 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d04757d-f880-4389-bdfa-265c70b5d789-metrics-certs\") pod \"network-metrics-daemon-zx8mc\" (UID: \"7d04757d-f880-4389-bdfa-265c70b5d789\") " pod="openshift-multus/network-metrics-daemon-zx8mc" Apr 22 18:20:51.148767 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:51.148316 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:20:51.148767 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:51.148385 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d04757d-f880-4389-bdfa-265c70b5d789-metrics-certs podName:7d04757d-f880-4389-bdfa-265c70b5d789 nodeName:}" failed. No retries permitted until 2026-04-22 18:20:59.148364971 +0000 UTC m=+17.183239059 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7d04757d-f880-4389-bdfa-265c70b5d789-metrics-certs") pod "network-metrics-daemon-zx8mc" (UID: "7d04757d-f880-4389-bdfa-265c70b5d789") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:20:51.249306 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:51.249272 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lc8p7\" (UniqueName: \"kubernetes.io/projected/5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81-kube-api-access-lc8p7\") pod \"network-check-target-jftnj\" (UID: \"5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81\") " pod="openshift-network-diagnostics/network-check-target-jftnj" Apr 22 18:20:51.249479 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:51.249431 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:20:51.249479 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:51.249449 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:20:51.249479 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:51.249460 2578 projected.go:194] Error preparing data for projected volume kube-api-access-lc8p7 for pod openshift-network-diagnostics/network-check-target-jftnj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:20:51.249629 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:51.249516 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81-kube-api-access-lc8p7 podName:5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:20:59.249495849 +0000 UTC m=+17.284369955 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-lc8p7" (UniqueName: "kubernetes.io/projected/5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81-kube-api-access-lc8p7") pod "network-check-target-jftnj" (UID: "5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:20:51.633063 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:51.632973 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zx8mc" Apr 22 18:20:51.633264 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:51.632973 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jftnj" Apr 22 18:20:51.633264 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:51.633116 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zx8mc" podUID="7d04757d-f880-4389-bdfa-265c70b5d789" Apr 22 18:20:51.633264 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:51.633215 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jftnj" podUID="5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81" Apr 22 18:20:51.691941 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:51.691908 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-gtjmt"] Apr 22 18:20:51.694943 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:51.694922 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtjmt" Apr 22 18:20:51.695074 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:51.695000 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gtjmt" podUID="bb007119-ecfd-4e25-867d-4c7cfb97c10b" Apr 22 18:20:51.753537 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:51.753371 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bb007119-ecfd-4e25-867d-4c7cfb97c10b-original-pull-secret\") pod \"global-pull-secret-syncer-gtjmt\" (UID: \"bb007119-ecfd-4e25-867d-4c7cfb97c10b\") " pod="kube-system/global-pull-secret-syncer-gtjmt" Apr 22 18:20:51.753537 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:51.753423 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/bb007119-ecfd-4e25-867d-4c7cfb97c10b-dbus\") pod \"global-pull-secret-syncer-gtjmt\" (UID: \"bb007119-ecfd-4e25-867d-4c7cfb97c10b\") " pod="kube-system/global-pull-secret-syncer-gtjmt" Apr 22 18:20:51.753537 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:51.753493 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/bb007119-ecfd-4e25-867d-4c7cfb97c10b-kubelet-config\") pod \"global-pull-secret-syncer-gtjmt\" (UID: \"bb007119-ecfd-4e25-867d-4c7cfb97c10b\") " pod="kube-system/global-pull-secret-syncer-gtjmt" Apr 22 18:20:51.854676 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:51.854640 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/bb007119-ecfd-4e25-867d-4c7cfb97c10b-kubelet-config\") pod \"global-pull-secret-syncer-gtjmt\" (UID: \"bb007119-ecfd-4e25-867d-4c7cfb97c10b\") " pod="kube-system/global-pull-secret-syncer-gtjmt" Apr 22 18:20:51.854856 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:51.854702 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bb007119-ecfd-4e25-867d-4c7cfb97c10b-original-pull-secret\") pod \"global-pull-secret-syncer-gtjmt\" (UID: \"bb007119-ecfd-4e25-867d-4c7cfb97c10b\") " pod="kube-system/global-pull-secret-syncer-gtjmt" Apr 22 18:20:51.854856 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:51.854724 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/bb007119-ecfd-4e25-867d-4c7cfb97c10b-dbus\") pod \"global-pull-secret-syncer-gtjmt\" (UID: \"bb007119-ecfd-4e25-867d-4c7cfb97c10b\") " pod="kube-system/global-pull-secret-syncer-gtjmt" Apr 22 18:20:51.854967 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:51.854879 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/bb007119-ecfd-4e25-867d-4c7cfb97c10b-dbus\") pod \"global-pull-secret-syncer-gtjmt\" (UID: \"bb007119-ecfd-4e25-867d-4c7cfb97c10b\") " pod="kube-system/global-pull-secret-syncer-gtjmt" Apr 22 18:20:51.854967 ip-10-0-128-248 
kubenswrapper[2578]: I0422 18:20:51.854921 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/bb007119-ecfd-4e25-867d-4c7cfb97c10b-kubelet-config\") pod \"global-pull-secret-syncer-gtjmt\" (UID: \"bb007119-ecfd-4e25-867d-4c7cfb97c10b\") " pod="kube-system/global-pull-secret-syncer-gtjmt" Apr 22 18:20:51.855069 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:51.855002 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:20:51.855069 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:51.855043 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb007119-ecfd-4e25-867d-4c7cfb97c10b-original-pull-secret podName:bb007119-ecfd-4e25-867d-4c7cfb97c10b nodeName:}" failed. No retries permitted until 2026-04-22 18:20:52.355029673 +0000 UTC m=+10.389903760 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bb007119-ecfd-4e25-867d-4c7cfb97c10b-original-pull-secret") pod "global-pull-secret-syncer-gtjmt" (UID: "bb007119-ecfd-4e25-867d-4c7cfb97c10b") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:20:52.358826 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:52.358790 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bb007119-ecfd-4e25-867d-4c7cfb97c10b-original-pull-secret\") pod \"global-pull-secret-syncer-gtjmt\" (UID: \"bb007119-ecfd-4e25-867d-4c7cfb97c10b\") " pod="kube-system/global-pull-secret-syncer-gtjmt" Apr 22 18:20:52.359252 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:52.358944 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:20:52.359252 ip-10-0-128-248 kubenswrapper[2578]: E0422 
18:20:52.358999 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb007119-ecfd-4e25-867d-4c7cfb97c10b-original-pull-secret podName:bb007119-ecfd-4e25-867d-4c7cfb97c10b nodeName:}" failed. No retries permitted until 2026-04-22 18:20:53.358981653 +0000 UTC m=+11.393855746 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bb007119-ecfd-4e25-867d-4c7cfb97c10b-original-pull-secret") pod "global-pull-secret-syncer-gtjmt" (UID: "bb007119-ecfd-4e25-867d-4c7cfb97c10b") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:20:53.367132 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:53.367092 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bb007119-ecfd-4e25-867d-4c7cfb97c10b-original-pull-secret\") pod \"global-pull-secret-syncer-gtjmt\" (UID: \"bb007119-ecfd-4e25-867d-4c7cfb97c10b\") " pod="kube-system/global-pull-secret-syncer-gtjmt" Apr 22 18:20:53.367621 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:53.367284 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:20:53.367621 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:53.367366 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb007119-ecfd-4e25-867d-4c7cfb97c10b-original-pull-secret podName:bb007119-ecfd-4e25-867d-4c7cfb97c10b nodeName:}" failed. No retries permitted until 2026-04-22 18:20:55.367345838 +0000 UTC m=+13.402219939 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bb007119-ecfd-4e25-867d-4c7cfb97c10b-original-pull-secret") pod "global-pull-secret-syncer-gtjmt" (UID: "bb007119-ecfd-4e25-867d-4c7cfb97c10b") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:20:53.633584 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:53.633503 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jftnj" Apr 22 18:20:53.633584 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:53.633570 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zx8mc" Apr 22 18:20:53.633816 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:53.633687 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zx8mc" podUID="7d04757d-f880-4389-bdfa-265c70b5d789" Apr 22 18:20:53.633980 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:53.633938 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtjmt" Apr 22 18:20:53.634103 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:53.634050 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-gtjmt" podUID="bb007119-ecfd-4e25-867d-4c7cfb97c10b" Apr 22 18:20:53.634173 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:53.634119 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jftnj" podUID="5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81" Apr 22 18:20:55.386211 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:55.386165 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bb007119-ecfd-4e25-867d-4c7cfb97c10b-original-pull-secret\") pod \"global-pull-secret-syncer-gtjmt\" (UID: \"bb007119-ecfd-4e25-867d-4c7cfb97c10b\") " pod="kube-system/global-pull-secret-syncer-gtjmt" Apr 22 18:20:55.386682 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:55.386328 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:20:55.386682 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:55.386404 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb007119-ecfd-4e25-867d-4c7cfb97c10b-original-pull-secret podName:bb007119-ecfd-4e25-867d-4c7cfb97c10b nodeName:}" failed. No retries permitted until 2026-04-22 18:20:59.386385512 +0000 UTC m=+17.421259613 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bb007119-ecfd-4e25-867d-4c7cfb97c10b-original-pull-secret") pod "global-pull-secret-syncer-gtjmt" (UID: "bb007119-ecfd-4e25-867d-4c7cfb97c10b") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:20:55.632610 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:55.632573 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtjmt" Apr 22 18:20:55.632801 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:55.632573 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jftnj" Apr 22 18:20:55.632801 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:55.632694 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gtjmt" podUID="bb007119-ecfd-4e25-867d-4c7cfb97c10b" Apr 22 18:20:55.632911 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:55.632815 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jftnj" podUID="5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81" Apr 22 18:20:55.632911 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:55.632575 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zx8mc" Apr 22 18:20:55.632997 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:55.632934 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zx8mc" podUID="7d04757d-f880-4389-bdfa-265c70b5d789" Apr 22 18:20:57.633470 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:57.633438 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtjmt" Apr 22 18:20:57.633875 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:57.633474 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jftnj" Apr 22 18:20:57.633875 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:57.633574 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gtjmt" podUID="bb007119-ecfd-4e25-867d-4c7cfb97c10b" Apr 22 18:20:57.633875 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:57.633623 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jftnj" podUID="5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81" Apr 22 18:20:57.633875 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:57.633663 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zx8mc" Apr 22 18:20:57.633875 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:57.633777 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zx8mc" podUID="7d04757d-f880-4389-bdfa-265c70b5d789" Apr 22 18:20:59.216625 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:59.216581 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d04757d-f880-4389-bdfa-265c70b5d789-metrics-certs\") pod \"network-metrics-daemon-zx8mc\" (UID: \"7d04757d-f880-4389-bdfa-265c70b5d789\") " pod="openshift-multus/network-metrics-daemon-zx8mc" Apr 22 18:20:59.217080 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:59.216742 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:20:59.217080 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:59.216816 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d04757d-f880-4389-bdfa-265c70b5d789-metrics-certs podName:7d04757d-f880-4389-bdfa-265c70b5d789 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:15.216794212 +0000 UTC m=+33.251668302 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7d04757d-f880-4389-bdfa-265c70b5d789-metrics-certs") pod "network-metrics-daemon-zx8mc" (UID: "7d04757d-f880-4389-bdfa-265c70b5d789") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:20:59.317181 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:59.317141 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lc8p7\" (UniqueName: \"kubernetes.io/projected/5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81-kube-api-access-lc8p7\") pod \"network-check-target-jftnj\" (UID: \"5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81\") " pod="openshift-network-diagnostics/network-check-target-jftnj" Apr 22 18:20:59.317359 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:59.317336 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:20:59.317403 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:59.317362 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:20:59.317403 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:59.317374 2578 projected.go:194] Error preparing data for projected volume kube-api-access-lc8p7 for pod openshift-network-diagnostics/network-check-target-jftnj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:20:59.317483 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:59.317427 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81-kube-api-access-lc8p7 podName:5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:21:15.317412232 +0000 UTC m=+33.352286319 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-lc8p7" (UniqueName: "kubernetes.io/projected/5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81-kube-api-access-lc8p7") pod "network-check-target-jftnj" (UID: "5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:20:59.417904 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:59.417857 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bb007119-ecfd-4e25-867d-4c7cfb97c10b-original-pull-secret\") pod \"global-pull-secret-syncer-gtjmt\" (UID: \"bb007119-ecfd-4e25-867d-4c7cfb97c10b\") " pod="kube-system/global-pull-secret-syncer-gtjmt"
Apr 22 18:20:59.418070 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:59.417987 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:20:59.418070 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:59.418061 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb007119-ecfd-4e25-867d-4c7cfb97c10b-original-pull-secret podName:bb007119-ecfd-4e25-867d-4c7cfb97c10b nodeName:}" failed. No retries permitted until 2026-04-22 18:21:07.418042978 +0000 UTC m=+25.452917070 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bb007119-ecfd-4e25-867d-4c7cfb97c10b-original-pull-secret") pod "global-pull-secret-syncer-gtjmt" (UID: "bb007119-ecfd-4e25-867d-4c7cfb97c10b") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:20:59.633259 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:59.633156 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zx8mc"
Apr 22 18:20:59.633438 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:59.633156 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jftnj"
Apr 22 18:20:59.633438 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:59.633312 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zx8mc" podUID="7d04757d-f880-4389-bdfa-265c70b5d789"
Apr 22 18:20:59.633438 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:59.633333 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jftnj" podUID="5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81"
Apr 22 18:20:59.633438 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:20:59.633155 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtjmt"
Apr 22 18:20:59.633630 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:20:59.633467 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gtjmt" podUID="bb007119-ecfd-4e25-867d-4c7cfb97c10b"
Apr 22 18:21:01.633122 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:01.633084 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtjmt"
Apr 22 18:21:01.633624 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:01.633190 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gtjmt" podUID="bb007119-ecfd-4e25-867d-4c7cfb97c10b"
Apr 22 18:21:01.633624 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:01.633209 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zx8mc"
Apr 22 18:21:01.633624 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:01.633247 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jftnj"
Apr 22 18:21:01.633624 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:01.633329 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zx8mc" podUID="7d04757d-f880-4389-bdfa-265c70b5d789"
Apr 22 18:21:01.633624 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:01.633404 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jftnj" podUID="5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81"
Apr 22 18:21:02.822695 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:02.822432 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-248.ec2.internal" event={"ID":"d579a59ca2c3bf4b3f744c41961ff1e1","Type":"ContainerStarted","Data":"afa5c27ccb1954bff44ac510d62b5a6bfeb4787b13745d644984d3b95f77b81e"}
Apr 22 18:21:02.823860 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:02.823838 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9995t" event={"ID":"41b3e8f6-4f69-4847-9f45-4c588f6bfd45","Type":"ContainerStarted","Data":"d7f6ab1090dcb6ff6e19050ab48594324a84d5c5f85703356738c6bab82de850"}
Apr 22 18:21:02.829258 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:02.829213 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dthlh" event={"ID":"bc302022-8ada-4b98-9ede-97c644115fcc","Type":"ContainerStarted","Data":"dab800f25d2efa7c38d778e0a0ab1066012f4cc6198ccf41a68562676605212c"}
Apr 22 18:21:02.839976 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:02.839932 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4" event={"ID":"65f22fed-3bb7-44e7-912c-3b5733a79f43","Type":"ContainerStarted","Data":"e8b794e16acc5165ae2d3854be6da1890c6cb6bcc0bc9cce1df9045821609e89"}
Apr 22 18:21:02.839976 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:02.839966 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4" event={"ID":"65f22fed-3bb7-44e7-912c-3b5733a79f43","Type":"ContainerStarted","Data":"02a694812dad6576a7ff35449c1bfe8901b91788fc7453230aea87fefabe56d6"}
Apr 22 18:21:02.839976 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:02.839979 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4" event={"ID":"65f22fed-3bb7-44e7-912c-3b5733a79f43","Type":"ContainerStarted","Data":"dbc5a8f14ac180c662e2c094b374aaedeca7654cd63825e39e6a0c0b2f69f5a8"}
Apr 22 18:21:02.847160 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:02.847115 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-248.ec2.internal" podStartSLOduration=20.847103204 podStartE2EDuration="20.847103204s" podCreationTimestamp="2026-04-22 18:20:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:21:02.84663561 +0000 UTC m=+20.881509719" watchObservedRunningTime="2026-04-22 18:21:02.847103204 +0000 UTC m=+20.881977314"
Apr 22 18:21:02.866856 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:02.866814 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-dthlh" podStartSLOduration=2.479912797 podStartE2EDuration="20.866802805s" podCreationTimestamp="2026-04-22 18:20:42 +0000 UTC" firstStartedPulling="2026-04-22 18:20:43.788662023 +0000 UTC m=+1.823536111" lastFinishedPulling="2026-04-22 18:21:02.175552031 +0000 UTC m=+20.210426119" observedRunningTime="2026-04-22 18:21:02.866593934 +0000 UTC m=+20.901468046" watchObservedRunningTime="2026-04-22 18:21:02.866802805 +0000 UTC m=+20.901676938"
Apr 22 18:21:03.633601 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:03.633567 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtjmt"
Apr 22 18:21:03.633601 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:03.633581 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jftnj"
Apr 22 18:21:03.633601 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:03.633567 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zx8mc"
Apr 22 18:21:03.633890 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:03.633695 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gtjmt" podUID="bb007119-ecfd-4e25-867d-4c7cfb97c10b"
Apr 22 18:21:03.633890 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:03.633811 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jftnj" podUID="5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81"
Apr 22 18:21:03.633972 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:03.633921 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zx8mc" podUID="7d04757d-f880-4389-bdfa-265c70b5d789"
Apr 22 18:21:03.843619 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:03.843496 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-jgjjg" event={"ID":"4acd2902-b094-4f0a-a902-526fdb5ba7de","Type":"ContainerStarted","Data":"bba0bf852541fbd3d7fe44eecbbec21255cc3899f55d5e5548979e99366e0134"}
Apr 22 18:21:03.845749 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:03.845713 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ll75d" event={"ID":"bf66aaa8-f33b-4fa8-8d81-ede023dd01d9","Type":"ContainerStarted","Data":"4232f9799498992d0a20670f835a126beaaa63352762033052bcee8e9604aa9a"}
Apr 22 18:21:03.847165 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:03.847139 2578 generic.go:358] "Generic (PLEG): container finished" podID="94e54bd0-5757-4951-b6b5-0ae58070a297" containerID="67417620ab932b960c4df7425267246015cae6af5bbb1a4c228e914ca1328fae" exitCode=0
Apr 22 18:21:03.847302 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:03.847212 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fr8p8" event={"ID":"94e54bd0-5757-4951-b6b5-0ae58070a297","Type":"ContainerDied","Data":"67417620ab932b960c4df7425267246015cae6af5bbb1a4c228e914ca1328fae"}
Apr 22 18:21:03.849356 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:03.849329 2578 generic.go:358] "Generic (PLEG): container finished" podID="933ffdfca6e87b798592801ce6979396" containerID="09c80c3f960e15c4f71e523aa1f5eda5457515be9e29cd3eb152c9a3f19ad394" exitCode=0
Apr 22 18:21:03.849459 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:03.849403 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal" event={"ID":"933ffdfca6e87b798592801ce6979396","Type":"ContainerDied","Data":"09c80c3f960e15c4f71e523aa1f5eda5457515be9e29cd3eb152c9a3f19ad394"}
Apr 22 18:21:03.851253 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:03.850829 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-v7r7x" event={"ID":"9815cd85-4c2f-43b2-97a8-f65c3f26db10","Type":"ContainerStarted","Data":"6af995054c4f55b5a05a4451f9f5eecfa245eedc5d1876f449b6f739f2a1cc00"}
Apr 22 18:21:03.852254 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:03.852213 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-q979r" event={"ID":"8468d537-923d-4e8a-a4bc-66ddc1d8ca1d","Type":"ContainerStarted","Data":"218993a9ec6a9f9ac1e91f2deb37dda77b03b5169b84d13a94075f95a9b45680"}
Apr 22 18:21:03.853659 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:03.853639 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg5qk" event={"ID":"5fecef3b-c5b2-49d1-8d71-e96e5433fb1a","Type":"ContainerStarted","Data":"ab70fff1d4e4f12935121790203f6e28725a9945998687becdfd5231db26fe07"}
Apr 22 18:21:03.856581 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:03.856549 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4" event={"ID":"65f22fed-3bb7-44e7-912c-3b5733a79f43","Type":"ContainerStarted","Data":"2ad7747b310e134efbd8f2ab7080c669eb12f902839c5b23c9d75ce3f9389b7a"}
Apr 22 18:21:03.856581 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:03.856580 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4" event={"ID":"65f22fed-3bb7-44e7-912c-3b5733a79f43","Type":"ContainerStarted","Data":"5b59a0bbdf1f294d53bb2b8d8ba6d0152d792cc3e02f702f7161f7768b05fb33"}
Apr 22 18:21:03.856713 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:03.856594 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4" event={"ID":"65f22fed-3bb7-44e7-912c-3b5733a79f43","Type":"ContainerStarted","Data":"d2fc785a98fa424f88fb26a17a5928bb415792b8ccf1d8d21df16b20dbe4dd7a"}
Apr 22 18:21:03.861080 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:03.861039 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-9995t" podStartSLOduration=3.652418813 podStartE2EDuration="21.861024878s" podCreationTimestamp="2026-04-22 18:20:42 +0000 UTC" firstStartedPulling="2026-04-22 18:20:43.847787558 +0000 UTC m=+1.882661647" lastFinishedPulling="2026-04-22 18:21:02.056393613 +0000 UTC m=+20.091267712" observedRunningTime="2026-04-22 18:21:02.895823517 +0000 UTC m=+20.930697628" watchObservedRunningTime="2026-04-22 18:21:03.861024878 +0000 UTC m=+21.895898989"
Apr 22 18:21:03.861515 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:03.861483 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-jgjjg" podStartSLOduration=4.112881934 podStartE2EDuration="21.861474523s" podCreationTimestamp="2026-04-22 18:20:42 +0000 UTC" firstStartedPulling="2026-04-22 18:20:44.307794521 +0000 UTC m=+2.342668615" lastFinishedPulling="2026-04-22 18:21:02.0563871 +0000 UTC m=+20.091261204" observedRunningTime="2026-04-22 18:21:03.860786405 +0000 UTC m=+21.895660514" watchObservedRunningTime="2026-04-22 18:21:03.861474523 +0000 UTC m=+21.896348634"
Apr 22 18:21:03.876721 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:03.876675 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-q979r" podStartSLOduration=3.576643241 podStartE2EDuration="21.876662568s" podCreationTimestamp="2026-04-22 18:20:42 +0000 UTC" firstStartedPulling="2026-04-22 18:20:43.867126132 +0000 UTC m=+1.902000220" lastFinishedPulling="2026-04-22 18:21:02.167145458 +0000 UTC m=+20.202019547" observedRunningTime="2026-04-22 18:21:03.876009409 +0000 UTC m=+21.910883521" watchObservedRunningTime="2026-04-22 18:21:03.876662568 +0000 UTC m=+21.911536680"
Apr 22 18:21:03.947806 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:03.947753 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-v7r7x" podStartSLOduration=3.668868072 podStartE2EDuration="21.947734632s" podCreationTimestamp="2026-04-22 18:20:42 +0000 UTC" firstStartedPulling="2026-04-22 18:20:43.89820664 +0000 UTC m=+1.933080731" lastFinishedPulling="2026-04-22 18:21:02.177073202 +0000 UTC m=+20.211947291" observedRunningTime="2026-04-22 18:21:03.9469692 +0000 UTC m=+21.981843310" watchObservedRunningTime="2026-04-22 18:21:03.947734632 +0000 UTC m=+21.982608743"
Apr 22 18:21:04.113615 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:04.113591 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 22 18:21:04.549975 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:04.549790 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T18:21:04.113610699Z","UUID":"506a03c2-9c95-4bbb-8a63-46ed6a7ba868","Handler":null,"Name":"","Endpoint":""}
Apr 22 18:21:04.551743 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:04.551722 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 22 18:21:04.551860 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:04.551754 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 22 18:21:04.860953 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:04.860876 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal" event={"ID":"933ffdfca6e87b798592801ce6979396","Type":"ContainerStarted","Data":"e26444e18e3f3def1a2cc3c98030c418b1572070f5f2fdf3dd4c029cca0176e9"}
Apr 22 18:21:04.863250 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:04.862771 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg5qk" event={"ID":"5fecef3b-c5b2-49d1-8d71-e96e5433fb1a","Type":"ContainerStarted","Data":"6e747a96f64db228ea03513cad4ffffa680cd9b93b632465c2bd1cdbf4160c63"}
Apr 22 18:21:04.880911 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:04.880865 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-ll75d" podStartSLOduration=4.800506803 podStartE2EDuration="22.880848946s" podCreationTimestamp="2026-04-22 18:20:42 +0000 UTC" firstStartedPulling="2026-04-22 18:20:43.96727653 +0000 UTC m=+2.002150618" lastFinishedPulling="2026-04-22 18:21:02.047618668 +0000 UTC m=+20.082492761" observedRunningTime="2026-04-22 18:21:03.977306956 +0000 UTC m=+22.012181065" watchObservedRunningTime="2026-04-22 18:21:04.880848946 +0000 UTC m=+22.915723057"
Apr 22 18:21:04.881507 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:04.881476 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-248.ec2.internal" podStartSLOduration=22.881467115 podStartE2EDuration="22.881467115s" podCreationTimestamp="2026-04-22 18:20:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:21:04.880990967 +0000 UTC m=+22.915865063" watchObservedRunningTime="2026-04-22 18:21:04.881467115 +0000 UTC m=+22.916341224"
Apr 22 18:21:05.632807 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:05.632769 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtjmt"
Apr 22 18:21:05.632807 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:05.632792 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jftnj"
Apr 22 18:21:05.633092 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:05.632882 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zx8mc"
Apr 22 18:21:05.633092 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:05.632900 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gtjmt" podUID="bb007119-ecfd-4e25-867d-4c7cfb97c10b"
Apr 22 18:21:05.633092 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:05.633014 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zx8mc" podUID="7d04757d-f880-4389-bdfa-265c70b5d789"
Apr 22 18:21:05.633270 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:05.633092 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jftnj" podUID="5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81"
Apr 22 18:21:05.866869 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:05.866824 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg5qk" event={"ID":"5fecef3b-c5b2-49d1-8d71-e96e5433fb1a","Type":"ContainerStarted","Data":"e7e7ee8c2e1ad18876772a8f8e2620d9c1e1040c2c58523fcb4f857b4ed7106d"}
Apr 22 18:21:05.870065 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:05.870034 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4" event={"ID":"65f22fed-3bb7-44e7-912c-3b5733a79f43","Type":"ContainerStarted","Data":"0aa4062e252300154f69f0c3182d601e97b5414753cd7822cd852ecae0e7ce81"}
Apr 22 18:21:05.892551 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:05.892446 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pg5qk" podStartSLOduration=2.783301477 podStartE2EDuration="23.892431554s" podCreationTimestamp="2026-04-22 18:20:42 +0000 UTC" firstStartedPulling="2026-04-22 18:20:43.826549996 +0000 UTC m=+1.861424084" lastFinishedPulling="2026-04-22 18:21:04.935680064 +0000 UTC m=+22.970554161" observedRunningTime="2026-04-22 18:21:05.891896844 +0000 UTC m=+23.926770953" watchObservedRunningTime="2026-04-22 18:21:05.892431554 +0000 UTC m=+23.927305664"
Apr 22 18:21:06.925914 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:06.925876 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-jgjjg"
Apr 22 18:21:06.926579 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:06.926557 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-jgjjg"
Apr 22 18:21:07.475235 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:07.475198 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bb007119-ecfd-4e25-867d-4c7cfb97c10b-original-pull-secret\") pod \"global-pull-secret-syncer-gtjmt\" (UID: \"bb007119-ecfd-4e25-867d-4c7cfb97c10b\") " pod="kube-system/global-pull-secret-syncer-gtjmt"
Apr 22 18:21:07.475411 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:07.475338 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:21:07.475411 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:07.475401 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb007119-ecfd-4e25-867d-4c7cfb97c10b-original-pull-secret podName:bb007119-ecfd-4e25-867d-4c7cfb97c10b nodeName:}" failed. No retries permitted until 2026-04-22 18:21:23.475384131 +0000 UTC m=+41.510258239 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bb007119-ecfd-4e25-867d-4c7cfb97c10b-original-pull-secret") pod "global-pull-secret-syncer-gtjmt" (UID: "bb007119-ecfd-4e25-867d-4c7cfb97c10b") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:21:07.632752 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:07.632723 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jftnj"
Apr 22 18:21:07.632752 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:07.632735 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zx8mc"
Apr 22 18:21:07.632979 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:07.632822 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtjmt"
Apr 22 18:21:07.632979 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:07.632828 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jftnj" podUID="5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81"
Apr 22 18:21:07.632979 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:07.632915 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zx8mc" podUID="7d04757d-f880-4389-bdfa-265c70b5d789"
Apr 22 18:21:07.633087 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:07.632990 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gtjmt" podUID="bb007119-ecfd-4e25-867d-4c7cfb97c10b"
Apr 22 18:21:07.873927 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:07.873885 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-jgjjg"
Apr 22 18:21:07.874497 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:07.874475 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-jgjjg"
Apr 22 18:21:08.878895 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:08.878679 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4" event={"ID":"65f22fed-3bb7-44e7-912c-3b5733a79f43","Type":"ContainerStarted","Data":"3bade9e9e0b5c48c479d7c986a4427c1f9f5141da8b354e51bcf7f33fa299cea"}
Apr 22 18:21:08.879668 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:08.878905 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:21:08.879668 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:08.878924 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:21:08.879668 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:08.878935 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:21:08.880245 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:08.880194 2578 generic.go:358] "Generic (PLEG): container finished" podID="94e54bd0-5757-4951-b6b5-0ae58070a297" containerID="de180ce0c04fadda2e23172afb9608cfe24d4d6b002d2e9b383386f806557807" exitCode=0
Apr 22 18:21:08.880363 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:08.880330 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fr8p8" event={"ID":"94e54bd0-5757-4951-b6b5-0ae58070a297","Type":"ContainerDied","Data":"de180ce0c04fadda2e23172afb9608cfe24d4d6b002d2e9b383386f806557807"}
Apr 22 18:21:08.894156 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:08.894132 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:21:08.894364 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:08.894350 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:21:08.910015 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:08.909977 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4" podStartSLOduration=8.213943051 podStartE2EDuration="26.909964876s" podCreationTimestamp="2026-04-22 18:20:42 +0000 UTC" firstStartedPulling="2026-04-22 18:20:43.809991787 +0000 UTC m=+1.844865875" lastFinishedPulling="2026-04-22 18:21:02.506013609 +0000 UTC m=+20.540887700" observedRunningTime="2026-04-22 18:21:08.909506137 +0000 UTC m=+26.944380244" watchObservedRunningTime="2026-04-22 18:21:08.909964876 +0000 UTC m=+26.944838987"
Apr 22 18:21:09.633291 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:09.632879 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtjmt"
Apr 22 18:21:09.633291 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:09.632885 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jftnj"
Apr 22 18:21:09.633291 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:09.632996 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gtjmt" podUID="bb007119-ecfd-4e25-867d-4c7cfb97c10b"
Apr 22 18:21:09.633291 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:09.633093 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jftnj" podUID="5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81"
Apr 22 18:21:09.633291 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:09.632886 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zx8mc"
Apr 22 18:21:09.633291 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:09.633210 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zx8mc" podUID="7d04757d-f880-4389-bdfa-265c70b5d789"
Apr 22 18:21:09.885329 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:09.885014 2578 generic.go:358] "Generic (PLEG): container finished" podID="94e54bd0-5757-4951-b6b5-0ae58070a297" containerID="fe074d2a06da1461a68976b6086210a357ace9a62a5028e081eb707f2a9dbd1b" exitCode=0
Apr 22 18:21:09.885861 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:09.885105 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fr8p8" event={"ID":"94e54bd0-5757-4951-b6b5-0ae58070a297","Type":"ContainerDied","Data":"fe074d2a06da1461a68976b6086210a357ace9a62a5028e081eb707f2a9dbd1b"}
Apr 22 18:21:10.161071 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:10.160972 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zx8mc"]
Apr 22 18:21:10.161071 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:10.161068 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zx8mc"
Apr 22 18:21:10.161315 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:10.161160 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zx8mc" podUID="7d04757d-f880-4389-bdfa-265c70b5d789"
Apr 22 18:21:10.162770 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:10.162744 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jftnj"]
Apr 22 18:21:10.162876 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:10.162834 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jftnj"
Apr 22 18:21:10.162932 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:10.162916 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jftnj" podUID="5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81"
Apr 22 18:21:10.163458 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:10.163438 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gtjmt"]
Apr 22 18:21:10.163528 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:10.163511 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtjmt"
Apr 22 18:21:10.163633 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:10.163612 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gtjmt" podUID="bb007119-ecfd-4e25-867d-4c7cfb97c10b"
Apr 22 18:21:10.889144 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:10.889111 2578 generic.go:358] "Generic (PLEG): container finished" podID="94e54bd0-5757-4951-b6b5-0ae58070a297" containerID="96d8b1fc562d436050a5c17e6a75e6b442457158f95ca184db94371bdc202f51" exitCode=0
Apr 22 18:21:10.889575 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:10.889205 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fr8p8" event={"ID":"94e54bd0-5757-4951-b6b5-0ae58070a297","Type":"ContainerDied","Data":"96d8b1fc562d436050a5c17e6a75e6b442457158f95ca184db94371bdc202f51"}
Apr 22 18:21:11.633100 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:11.633027 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jftnj"
Apr 22 18:21:11.633100 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:11.633073 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtjmt"
Apr 22 18:21:11.633335 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:11.633156 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jftnj" podUID="5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81"
Apr 22 18:21:11.633335 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:11.633270 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zx8mc"
Apr 22 18:21:11.633442 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:11.633389 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zx8mc" podUID="7d04757d-f880-4389-bdfa-265c70b5d789"
Apr 22 18:21:11.633442 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:11.633273 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gtjmt" podUID="bb007119-ecfd-4e25-867d-4c7cfb97c10b"
Apr 22 18:21:13.632699 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:13.632669 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zx8mc"
Apr 22 18:21:13.633258 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:13.632708 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jftnj"
Apr 22 18:21:13.633258 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:13.632671 2578 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtjmt" Apr 22 18:21:13.633258 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:13.632814 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zx8mc" podUID="7d04757d-f880-4389-bdfa-265c70b5d789" Apr 22 18:21:13.633258 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:13.632935 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gtjmt" podUID="bb007119-ecfd-4e25-867d-4c7cfb97c10b" Apr 22 18:21:13.633258 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:13.633020 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jftnj" podUID="5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81" Apr 22 18:21:15.233997 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.233971 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-248.ec2.internal" event="NodeReady" Apr 22 18:21:15.234459 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.234115 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 18:21:15.237305 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.237282 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d04757d-f880-4389-bdfa-265c70b5d789-metrics-certs\") pod \"network-metrics-daemon-zx8mc\" (UID: \"7d04757d-f880-4389-bdfa-265c70b5d789\") " pod="openshift-multus/network-metrics-daemon-zx8mc" Apr 22 18:21:15.237403 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:15.237395 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:21:15.237461 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:15.237446 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d04757d-f880-4389-bdfa-265c70b5d789-metrics-certs podName:7d04757d-f880-4389-bdfa-265c70b5d789 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:47.237428911 +0000 UTC m=+65.272302999 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7d04757d-f880-4389-bdfa-265c70b5d789-metrics-certs") pod "network-metrics-daemon-zx8mc" (UID: "7d04757d-f880-4389-bdfa-265c70b5d789") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:21:15.298290 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.298257 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-947mz"] Apr 22 18:21:15.327559 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.327477 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-fhk6g"] Apr 22 18:21:15.337823 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.337794 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lc8p7\" (UniqueName: \"kubernetes.io/projected/5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81-kube-api-access-lc8p7\") pod \"network-check-target-jftnj\" (UID: \"5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81\") " pod="openshift-network-diagnostics/network-check-target-jftnj" Apr 22 18:21:15.337979 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:15.337963 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:21:15.338035 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:15.337985 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:21:15.338035 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:15.337996 2578 projected.go:194] Error preparing data for projected volume kube-api-access-lc8p7 for pod openshift-network-diagnostics/network-check-target-jftnj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] 
Apr 22 18:21:15.338116 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:15.338052 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81-kube-api-access-lc8p7 podName:5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:47.338033786 +0000 UTC m=+65.372907873 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-lc8p7" (UniqueName: "kubernetes.io/projected/5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81-kube-api-access-lc8p7") pod "network-check-target-jftnj" (UID: "5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:21:15.339317 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.339293 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-947mz"] Apr 22 18:21:15.339317 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.339321 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fhk6g"] Apr 22 18:21:15.339474 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.339362 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-947mz" Apr 22 18:21:15.339474 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.339393 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fhk6g" Apr 22 18:21:15.344068 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.344039 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 18:21:15.344165 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.344073 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 18:21:15.344340 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.344324 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 18:21:15.344417 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.344385 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-9d4f9\"" Apr 22 18:21:15.344512 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.344494 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9pkrg\"" Apr 22 18:21:15.345321 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.345302 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 18:21:15.345412 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.345323 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 18:21:15.438158 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.438125 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1-config-volume\") pod \"dns-default-947mz\" (UID: \"52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1\") " pod="openshift-dns/dns-default-947mz" Apr 22 18:21:15.438371 ip-10-0-128-248 
kubenswrapper[2578]: I0422 18:21:15.438185 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1-metrics-tls\") pod \"dns-default-947mz\" (UID: \"52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1\") " pod="openshift-dns/dns-default-947mz" Apr 22 18:21:15.438371 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.438281 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6p8h\" (UniqueName: \"kubernetes.io/projected/52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1-kube-api-access-s6p8h\") pod \"dns-default-947mz\" (UID: \"52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1\") " pod="openshift-dns/dns-default-947mz" Apr 22 18:21:15.438371 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.438343 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1-tmp-dir\") pod \"dns-default-947mz\" (UID: \"52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1\") " pod="openshift-dns/dns-default-947mz" Apr 22 18:21:15.438488 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.438380 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7bb5cb6-67f2-4c61-ad95-b31a5972059f-cert\") pod \"ingress-canary-fhk6g\" (UID: \"d7bb5cb6-67f2-4c61-ad95-b31a5972059f\") " pod="openshift-ingress-canary/ingress-canary-fhk6g" Apr 22 18:21:15.438488 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.438467 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtp99\" (UniqueName: \"kubernetes.io/projected/d7bb5cb6-67f2-4c61-ad95-b31a5972059f-kube-api-access-xtp99\") pod \"ingress-canary-fhk6g\" (UID: \"d7bb5cb6-67f2-4c61-ad95-b31a5972059f\") " 
pod="openshift-ingress-canary/ingress-canary-fhk6g" Apr 22 18:21:15.539640 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.539600 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7bb5cb6-67f2-4c61-ad95-b31a5972059f-cert\") pod \"ingress-canary-fhk6g\" (UID: \"d7bb5cb6-67f2-4c61-ad95-b31a5972059f\") " pod="openshift-ingress-canary/ingress-canary-fhk6g" Apr 22 18:21:15.539808 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.539693 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xtp99\" (UniqueName: \"kubernetes.io/projected/d7bb5cb6-67f2-4c61-ad95-b31a5972059f-kube-api-access-xtp99\") pod \"ingress-canary-fhk6g\" (UID: \"d7bb5cb6-67f2-4c61-ad95-b31a5972059f\") " pod="openshift-ingress-canary/ingress-canary-fhk6g" Apr 22 18:21:15.539808 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.539717 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1-config-volume\") pod \"dns-default-947mz\" (UID: \"52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1\") " pod="openshift-dns/dns-default-947mz" Apr 22 18:21:15.539808 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.539754 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1-metrics-tls\") pod \"dns-default-947mz\" (UID: \"52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1\") " pod="openshift-dns/dns-default-947mz" Apr 22 18:21:15.539808 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.539778 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s6p8h\" (UniqueName: \"kubernetes.io/projected/52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1-kube-api-access-s6p8h\") pod \"dns-default-947mz\" (UID: \"52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1\") " 
pod="openshift-dns/dns-default-947mz" Apr 22 18:21:15.539957 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.539812 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1-tmp-dir\") pod \"dns-default-947mz\" (UID: \"52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1\") " pod="openshift-dns/dns-default-947mz" Apr 22 18:21:15.540103 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.540082 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1-tmp-dir\") pod \"dns-default-947mz\" (UID: \"52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1\") " pod="openshift-dns/dns-default-947mz" Apr 22 18:21:15.540213 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:15.540194 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:21:15.540409 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:15.540384 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:21:15.540527 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:15.540452 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1-metrics-tls podName:52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:16.040432768 +0000 UTC m=+34.075306856 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1-metrics-tls") pod "dns-default-947mz" (UID: "52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1") : secret "dns-default-metrics-tls" not found Apr 22 18:21:15.540731 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.540706 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1-config-volume\") pod \"dns-default-947mz\" (UID: \"52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1\") " pod="openshift-dns/dns-default-947mz" Apr 22 18:21:15.540930 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:15.540905 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7bb5cb6-67f2-4c61-ad95-b31a5972059f-cert podName:d7bb5cb6-67f2-4c61-ad95-b31a5972059f nodeName:}" failed. No retries permitted until 2026-04-22 18:21:16.040887582 +0000 UTC m=+34.075761684 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d7bb5cb6-67f2-4c61-ad95-b31a5972059f-cert") pod "ingress-canary-fhk6g" (UID: "d7bb5cb6-67f2-4c61-ad95-b31a5972059f") : secret "canary-serving-cert" not found Apr 22 18:21:15.556317 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.555943 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6p8h\" (UniqueName: \"kubernetes.io/projected/52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1-kube-api-access-s6p8h\") pod \"dns-default-947mz\" (UID: \"52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1\") " pod="openshift-dns/dns-default-947mz" Apr 22 18:21:15.556862 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.556836 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtp99\" (UniqueName: \"kubernetes.io/projected/d7bb5cb6-67f2-4c61-ad95-b31a5972059f-kube-api-access-xtp99\") pod \"ingress-canary-fhk6g\" (UID: \"d7bb5cb6-67f2-4c61-ad95-b31a5972059f\") " pod="openshift-ingress-canary/ingress-canary-fhk6g" Apr 22 18:21:15.632785 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.632562 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtjmt" Apr 22 18:21:15.632936 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.632563 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zx8mc" Apr 22 18:21:15.632936 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.632562 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jftnj" Apr 22 18:21:15.635939 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.635916 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 18:21:15.636069 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.635975 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 18:21:15.636129 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.636116 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-fll8t\"" Apr 22 18:21:15.636209 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.636191 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 18:21:15.636479 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.636463 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-cm497\"" Apr 22 18:21:15.636575 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:15.636480 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 18:21:16.044176 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:16.044136 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1-metrics-tls\") pod \"dns-default-947mz\" (UID: \"52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1\") " pod="openshift-dns/dns-default-947mz" Apr 22 18:21:16.044401 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:16.044201 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/d7bb5cb6-67f2-4c61-ad95-b31a5972059f-cert\") pod \"ingress-canary-fhk6g\" (UID: \"d7bb5cb6-67f2-4c61-ad95-b31a5972059f\") " pod="openshift-ingress-canary/ingress-canary-fhk6g" Apr 22 18:21:16.044401 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:16.044325 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:21:16.044401 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:16.044340 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:21:16.044521 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:16.044407 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1-metrics-tls podName:52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:17.044386767 +0000 UTC m=+35.079260856 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1-metrics-tls") pod "dns-default-947mz" (UID: "52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1") : secret "dns-default-metrics-tls" not found Apr 22 18:21:16.044521 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:16.044432 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7bb5cb6-67f2-4c61-ad95-b31a5972059f-cert podName:d7bb5cb6-67f2-4c61-ad95-b31a5972059f nodeName:}" failed. No retries permitted until 2026-04-22 18:21:17.044418439 +0000 UTC m=+35.079292531 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d7bb5cb6-67f2-4c61-ad95-b31a5972059f-cert") pod "ingress-canary-fhk6g" (UID: "d7bb5cb6-67f2-4c61-ad95-b31a5972059f") : secret "canary-serving-cert" not found Apr 22 18:21:17.051502 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:17.051472 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1-metrics-tls\") pod \"dns-default-947mz\" (UID: \"52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1\") " pod="openshift-dns/dns-default-947mz" Apr 22 18:21:17.051872 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:17.051516 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7bb5cb6-67f2-4c61-ad95-b31a5972059f-cert\") pod \"ingress-canary-fhk6g\" (UID: \"d7bb5cb6-67f2-4c61-ad95-b31a5972059f\") " pod="openshift-ingress-canary/ingress-canary-fhk6g" Apr 22 18:21:17.051872 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:17.051610 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:21:17.051872 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:17.051619 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:21:17.051872 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:17.051658 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7bb5cb6-67f2-4c61-ad95-b31a5972059f-cert podName:d7bb5cb6-67f2-4c61-ad95-b31a5972059f nodeName:}" failed. No retries permitted until 2026-04-22 18:21:19.051645117 +0000 UTC m=+37.086519204 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d7bb5cb6-67f2-4c61-ad95-b31a5972059f-cert") pod "ingress-canary-fhk6g" (UID: "d7bb5cb6-67f2-4c61-ad95-b31a5972059f") : secret "canary-serving-cert" not found Apr 22 18:21:17.051872 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:17.051671 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1-metrics-tls podName:52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:19.051664669 +0000 UTC m=+37.086538757 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1-metrics-tls") pod "dns-default-947mz" (UID: "52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1") : secret "dns-default-metrics-tls" not found Apr 22 18:21:17.905006 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:17.904975 2578 generic.go:358] "Generic (PLEG): container finished" podID="94e54bd0-5757-4951-b6b5-0ae58070a297" containerID="0505357582a4e9cc3c86a52de0a8900f28548079866cd895a22cb5c4758c8380" exitCode=0 Apr 22 18:21:17.905162 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:17.905020 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fr8p8" event={"ID":"94e54bd0-5757-4951-b6b5-0ae58070a297","Type":"ContainerDied","Data":"0505357582a4e9cc3c86a52de0a8900f28548079866cd895a22cb5c4758c8380"} Apr 22 18:21:18.908920 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:18.908890 2578 generic.go:358] "Generic (PLEG): container finished" podID="94e54bd0-5757-4951-b6b5-0ae58070a297" containerID="d849d22aba8748b96908c3e1f1159cb79f184d2ba588daa0994929860013e73c" exitCode=0 Apr 22 18:21:18.909409 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:18.908941 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fr8p8" 
event={"ID":"94e54bd0-5757-4951-b6b5-0ae58070a297","Type":"ContainerDied","Data":"d849d22aba8748b96908c3e1f1159cb79f184d2ba588daa0994929860013e73c"} Apr 22 18:21:19.068433 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:19.068406 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1-metrics-tls\") pod \"dns-default-947mz\" (UID: \"52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1\") " pod="openshift-dns/dns-default-947mz" Apr 22 18:21:19.068560 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:19.068539 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:21:19.068604 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:19.068598 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1-metrics-tls podName:52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:23.068583434 +0000 UTC m=+41.103457526 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1-metrics-tls") pod "dns-default-947mz" (UID: "52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1") : secret "dns-default-metrics-tls" not found Apr 22 18:21:19.068660 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:19.068617 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7bb5cb6-67f2-4c61-ad95-b31a5972059f-cert\") pod \"ingress-canary-fhk6g\" (UID: \"d7bb5cb6-67f2-4c61-ad95-b31a5972059f\") " pod="openshift-ingress-canary/ingress-canary-fhk6g" Apr 22 18:21:19.068724 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:19.068709 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:21:19.068776 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:19.068742 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7bb5cb6-67f2-4c61-ad95-b31a5972059f-cert podName:d7bb5cb6-67f2-4c61-ad95-b31a5972059f nodeName:}" failed. No retries permitted until 2026-04-22 18:21:23.068735225 +0000 UTC m=+41.103609316 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d7bb5cb6-67f2-4c61-ad95-b31a5972059f-cert") pod "ingress-canary-fhk6g" (UID: "d7bb5cb6-67f2-4c61-ad95-b31a5972059f") : secret "canary-serving-cert" not found
Apr 22 18:21:19.914611 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:19.914578 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fr8p8" event={"ID":"94e54bd0-5757-4951-b6b5-0ae58070a297","Type":"ContainerStarted","Data":"ad10bf971947b6f29cd715ba6ffb7c3413017f7f71650bfc823a972bb09a866e"}
Apr 22 18:21:19.956834 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:19.956744 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-fr8p8" podStartSLOduration=4.912482601 podStartE2EDuration="37.956729424s" podCreationTimestamp="2026-04-22 18:20:42 +0000 UTC" firstStartedPulling="2026-04-22 18:20:43.870405493 +0000 UTC m=+1.905279586" lastFinishedPulling="2026-04-22 18:21:16.914652321 +0000 UTC m=+34.949526409" observedRunningTime="2026-04-22 18:21:19.95513722 +0000 UTC m=+37.990011330" watchObservedRunningTime="2026-04-22 18:21:19.956729424 +0000 UTC m=+37.991603552"
Apr 22 18:21:23.096889 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:23.096850 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7bb5cb6-67f2-4c61-ad95-b31a5972059f-cert\") pod \"ingress-canary-fhk6g\" (UID: \"d7bb5cb6-67f2-4c61-ad95-b31a5972059f\") " pod="openshift-ingress-canary/ingress-canary-fhk6g"
Apr 22 18:21:23.097346 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:23.096943 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1-metrics-tls\") pod \"dns-default-947mz\" (UID: \"52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1\") " pod="openshift-dns/dns-default-947mz"
Apr 22 18:21:23.097346 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:23.097000 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:21:23.097346 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:23.097033 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:21:23.097346 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:23.097061 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7bb5cb6-67f2-4c61-ad95-b31a5972059f-cert podName:d7bb5cb6-67f2-4c61-ad95-b31a5972059f nodeName:}" failed. No retries permitted until 2026-04-22 18:21:31.097046521 +0000 UTC m=+49.131920609 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d7bb5cb6-67f2-4c61-ad95-b31a5972059f-cert") pod "ingress-canary-fhk6g" (UID: "d7bb5cb6-67f2-4c61-ad95-b31a5972059f") : secret "canary-serving-cert" not found
Apr 22 18:21:23.097346 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:23.097077 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1-metrics-tls podName:52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:31.097066398 +0000 UTC m=+49.131940486 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1-metrics-tls") pod "dns-default-947mz" (UID: "52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1") : secret "dns-default-metrics-tls" not found
Apr 22 18:21:23.500478 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:23.500445 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bb007119-ecfd-4e25-867d-4c7cfb97c10b-original-pull-secret\") pod \"global-pull-secret-syncer-gtjmt\" (UID: \"bb007119-ecfd-4e25-867d-4c7cfb97c10b\") " pod="kube-system/global-pull-secret-syncer-gtjmt"
Apr 22 18:21:23.513711 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:23.513685 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bb007119-ecfd-4e25-867d-4c7cfb97c10b-original-pull-secret\") pod \"global-pull-secret-syncer-gtjmt\" (UID: \"bb007119-ecfd-4e25-867d-4c7cfb97c10b\") " pod="kube-system/global-pull-secret-syncer-gtjmt"
Apr 22 18:21:23.744340 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:23.744303 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtjmt"
Apr 22 18:21:23.908571 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:23.908544 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gtjmt"]
Apr 22 18:21:23.911736 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:21:23.911707 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb007119_ecfd_4e25_867d_4c7cfb97c10b.slice/crio-3f036667860eb9b013d4f2c92d626066afcaa412667085f1d30ccf27aab73a76 WatchSource:0}: Error finding container 3f036667860eb9b013d4f2c92d626066afcaa412667085f1d30ccf27aab73a76: Status 404 returned error can't find the container with id 3f036667860eb9b013d4f2c92d626066afcaa412667085f1d30ccf27aab73a76
Apr 22 18:21:23.926987 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:23.926957 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gtjmt" event={"ID":"bb007119-ecfd-4e25-867d-4c7cfb97c10b","Type":"ContainerStarted","Data":"3f036667860eb9b013d4f2c92d626066afcaa412667085f1d30ccf27aab73a76"}
Apr 22 18:21:28.938007 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:28.937965 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gtjmt" event={"ID":"bb007119-ecfd-4e25-867d-4c7cfb97c10b","Type":"ContainerStarted","Data":"8cd4ce8af363f2ea733570caab67a8bab0172e54370fca3603ae2c07f0d27a08"}
Apr 22 18:21:31.152919 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:31.152884 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7bb5cb6-67f2-4c61-ad95-b31a5972059f-cert\") pod \"ingress-canary-fhk6g\" (UID: \"d7bb5cb6-67f2-4c61-ad95-b31a5972059f\") " pod="openshift-ingress-canary/ingress-canary-fhk6g"
Apr 22 18:21:31.153304 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:31.152955 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1-metrics-tls\") pod \"dns-default-947mz\" (UID: \"52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1\") " pod="openshift-dns/dns-default-947mz"
Apr 22 18:21:31.153304 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:31.153039 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:21:31.153304 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:31.153047 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:21:31.153304 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:31.153093 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1-metrics-tls podName:52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:47.153080092 +0000 UTC m=+65.187954180 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1-metrics-tls") pod "dns-default-947mz" (UID: "52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1") : secret "dns-default-metrics-tls" not found
Apr 22 18:21:31.153304 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:31.153112 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7bb5cb6-67f2-4c61-ad95-b31a5972059f-cert podName:d7bb5cb6-67f2-4c61-ad95-b31a5972059f nodeName:}" failed. No retries permitted until 2026-04-22 18:21:47.153099093 +0000 UTC m=+65.187973181 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d7bb5cb6-67f2-4c61-ad95-b31a5972059f-cert") pod "ingress-canary-fhk6g" (UID: "d7bb5cb6-67f2-4c61-ad95-b31a5972059f") : secret "canary-serving-cert" not found
Apr 22 18:21:40.905694 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:40.905666 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fnkv4"
Apr 22 18:21:40.938747 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:40.938695 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-gtjmt" podStartSLOduration=45.968401614 podStartE2EDuration="49.938679425s" podCreationTimestamp="2026-04-22 18:20:51 +0000 UTC" firstStartedPulling="2026-04-22 18:21:23.913407232 +0000 UTC m=+41.948281335" lastFinishedPulling="2026-04-22 18:21:27.883685056 +0000 UTC m=+45.918559146" observedRunningTime="2026-04-22 18:21:28.954397721 +0000 UTC m=+46.989271843" watchObservedRunningTime="2026-04-22 18:21:40.938679425 +0000 UTC m=+58.973553516"
Apr 22 18:21:47.165392 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:47.165352 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1-metrics-tls\") pod \"dns-default-947mz\" (UID: \"52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1\") " pod="openshift-dns/dns-default-947mz"
Apr 22 18:21:47.165773 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:47.165404 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7bb5cb6-67f2-4c61-ad95-b31a5972059f-cert\") pod \"ingress-canary-fhk6g\" (UID: \"d7bb5cb6-67f2-4c61-ad95-b31a5972059f\") " pod="openshift-ingress-canary/ingress-canary-fhk6g"
Apr 22 18:21:47.165773 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:47.165496 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:21:47.165773 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:47.165501 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:21:47.165773 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:47.165545 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7bb5cb6-67f2-4c61-ad95-b31a5972059f-cert podName:d7bb5cb6-67f2-4c61-ad95-b31a5972059f nodeName:}" failed. No retries permitted until 2026-04-22 18:22:19.165532719 +0000 UTC m=+97.200406806 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d7bb5cb6-67f2-4c61-ad95-b31a5972059f-cert") pod "ingress-canary-fhk6g" (UID: "d7bb5cb6-67f2-4c61-ad95-b31a5972059f") : secret "canary-serving-cert" not found
Apr 22 18:21:47.165773 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:47.165558 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1-metrics-tls podName:52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1 nodeName:}" failed. No retries permitted until 2026-04-22 18:22:19.165551922 +0000 UTC m=+97.200426011 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1-metrics-tls") pod "dns-default-947mz" (UID: "52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1") : secret "dns-default-metrics-tls" not found
Apr 22 18:21:47.265936 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:47.265905 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d04757d-f880-4389-bdfa-265c70b5d789-metrics-certs\") pod \"network-metrics-daemon-zx8mc\" (UID: \"7d04757d-f880-4389-bdfa-265c70b5d789\") " pod="openshift-multus/network-metrics-daemon-zx8mc"
Apr 22 18:21:47.268753 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:47.268734 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 18:21:47.277147 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:47.277133 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 18:21:47.277195 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:21:47.277189 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d04757d-f880-4389-bdfa-265c70b5d789-metrics-certs podName:7d04757d-f880-4389-bdfa-265c70b5d789 nodeName:}" failed. No retries permitted until 2026-04-22 18:22:51.277170617 +0000 UTC m=+129.312044704 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7d04757d-f880-4389-bdfa-265c70b5d789-metrics-certs") pod "network-metrics-daemon-zx8mc" (UID: "7d04757d-f880-4389-bdfa-265c70b5d789") : secret "metrics-daemon-secret" not found
Apr 22 18:21:47.366776 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:47.366741 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lc8p7\" (UniqueName: \"kubernetes.io/projected/5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81-kube-api-access-lc8p7\") pod \"network-check-target-jftnj\" (UID: \"5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81\") " pod="openshift-network-diagnostics/network-check-target-jftnj"
Apr 22 18:21:47.369471 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:47.369450 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 18:21:47.379968 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:47.379951 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 18:21:47.390705 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:47.390680 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc8p7\" (UniqueName: \"kubernetes.io/projected/5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81-kube-api-access-lc8p7\") pod \"network-check-target-jftnj\" (UID: \"5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81\") " pod="openshift-network-diagnostics/network-check-target-jftnj"
Apr 22 18:21:47.467906 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:47.467841 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-cm497\""
Apr 22 18:21:47.476190 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:47.476172 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jftnj"
Apr 22 18:21:47.613475 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:47.613434 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jftnj"]
Apr 22 18:21:47.616680 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:21:47.616649 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c9f6b6a_f8b1_47bc_b5a6_effd27fe2e81.slice/crio-02ef88dbb9f77dcc40372de3e360275266a2948246b753702083c96f15f1a0f0 WatchSource:0}: Error finding container 02ef88dbb9f77dcc40372de3e360275266a2948246b753702083c96f15f1a0f0: Status 404 returned error can't find the container with id 02ef88dbb9f77dcc40372de3e360275266a2948246b753702083c96f15f1a0f0
Apr 22 18:21:47.977916 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:47.977881 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jftnj" event={"ID":"5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81","Type":"ContainerStarted","Data":"02ef88dbb9f77dcc40372de3e360275266a2948246b753702083c96f15f1a0f0"}
Apr 22 18:21:49.356064 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.356031 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-65c4c6c7c8-frccb"]
Apr 22 18:21:49.383800 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.382756 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg"]
Apr 22 18:21:49.397469 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.397426 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-65c4c6c7c8-frccb"]
Apr 22 18:21:49.397469 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.397463 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg"]
Apr 22 18:21:49.397689 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.397538 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-65c4c6c7c8-frccb"
Apr 22 18:21:49.397689 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.397561 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg"
Apr 22 18:21:49.400316 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.400282 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 22 18:21:49.400452 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.400386 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 22 18:21:49.401526 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.401501 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 22 18:21:49.401526 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.401510 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 22 18:21:49.401695 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.401536 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 22 18:21:49.401695 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.401570 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 22 18:21:49.401695 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.401513 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 22 18:21:49.401822 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.401758 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 22 18:21:49.485701 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.485664 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/173ff773-b08c-4806-8a1b-a7154e528f5e-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg\" (UID: \"173ff773-b08c-4806-8a1b-a7154e528f5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg"
Apr 22 18:21:49.485701 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.485704 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc5q2\" (UniqueName: \"kubernetes.io/projected/c71315fd-6856-4e7e-8c21-b0d9725475e3-kube-api-access-vc5q2\") pod \"klusterlet-addon-workmgr-65c4c6c7c8-frccb\" (UID: \"c71315fd-6856-4e7e-8c21-b0d9725475e3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-65c4c6c7c8-frccb"
Apr 22 18:21:49.485914 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.485725 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/173ff773-b08c-4806-8a1b-a7154e528f5e-ca\") pod \"cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg\" (UID: \"173ff773-b08c-4806-8a1b-a7154e528f5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg"
Apr 22 18:21:49.485914 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.485787 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c71315fd-6856-4e7e-8c21-b0d9725475e3-tmp\") pod \"klusterlet-addon-workmgr-65c4c6c7c8-frccb\" (UID: \"c71315fd-6856-4e7e-8c21-b0d9725475e3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-65c4c6c7c8-frccb"
Apr 22 18:21:49.485914 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.485843 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/173ff773-b08c-4806-8a1b-a7154e528f5e-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg\" (UID: \"173ff773-b08c-4806-8a1b-a7154e528f5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg"
Apr 22 18:21:49.485914 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.485875 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/173ff773-b08c-4806-8a1b-a7154e528f5e-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg\" (UID: \"173ff773-b08c-4806-8a1b-a7154e528f5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg"
Apr 22 18:21:49.485914 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.485898 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/c71315fd-6856-4e7e-8c21-b0d9725475e3-klusterlet-config\") pod \"klusterlet-addon-workmgr-65c4c6c7c8-frccb\" (UID: \"c71315fd-6856-4e7e-8c21-b0d9725475e3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-65c4c6c7c8-frccb"
Apr 22 18:21:49.486138 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.485944 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/173ff773-b08c-4806-8a1b-a7154e528f5e-hub\") pod \"cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg\" (UID: \"173ff773-b08c-4806-8a1b-a7154e528f5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg"
Apr 22 18:21:49.486138 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.485968 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpgbp\" (UniqueName: \"kubernetes.io/projected/173ff773-b08c-4806-8a1b-a7154e528f5e-kube-api-access-lpgbp\") pod \"cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg\" (UID: \"173ff773-b08c-4806-8a1b-a7154e528f5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg"
Apr 22 18:21:49.587129 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.587094 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vc5q2\" (UniqueName: \"kubernetes.io/projected/c71315fd-6856-4e7e-8c21-b0d9725475e3-kube-api-access-vc5q2\") pod \"klusterlet-addon-workmgr-65c4c6c7c8-frccb\" (UID: \"c71315fd-6856-4e7e-8c21-b0d9725475e3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-65c4c6c7c8-frccb"
Apr 22 18:21:49.587129 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.587135 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/173ff773-b08c-4806-8a1b-a7154e528f5e-ca\") pod \"cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg\" (UID: \"173ff773-b08c-4806-8a1b-a7154e528f5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg"
Apr 22 18:21:49.587754 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.587151 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c71315fd-6856-4e7e-8c21-b0d9725475e3-tmp\") pod \"klusterlet-addon-workmgr-65c4c6c7c8-frccb\" (UID: \"c71315fd-6856-4e7e-8c21-b0d9725475e3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-65c4c6c7c8-frccb"
Apr 22 18:21:49.587754 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.587275 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/173ff773-b08c-4806-8a1b-a7154e528f5e-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg\" (UID: \"173ff773-b08c-4806-8a1b-a7154e528f5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg"
Apr 22 18:21:49.587754 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.587319 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/173ff773-b08c-4806-8a1b-a7154e528f5e-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg\" (UID: \"173ff773-b08c-4806-8a1b-a7154e528f5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg"
Apr 22 18:21:49.587754 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.587347 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/c71315fd-6856-4e7e-8c21-b0d9725475e3-klusterlet-config\") pod \"klusterlet-addon-workmgr-65c4c6c7c8-frccb\" (UID: \"c71315fd-6856-4e7e-8c21-b0d9725475e3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-65c4c6c7c8-frccb"
Apr 22 18:21:49.587754 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.587394 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/173ff773-b08c-4806-8a1b-a7154e528f5e-hub\") pod \"cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg\" (UID: \"173ff773-b08c-4806-8a1b-a7154e528f5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg"
Apr 22 18:21:49.587754 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.587420 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lpgbp\" (UniqueName: \"kubernetes.io/projected/173ff773-b08c-4806-8a1b-a7154e528f5e-kube-api-access-lpgbp\") pod \"cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg\" (UID: \"173ff773-b08c-4806-8a1b-a7154e528f5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg"
Apr 22 18:21:49.587754 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.587492 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/173ff773-b08c-4806-8a1b-a7154e528f5e-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg\" (UID: \"173ff773-b08c-4806-8a1b-a7154e528f5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg"
Apr 22 18:21:49.587754 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.587541 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c71315fd-6856-4e7e-8c21-b0d9725475e3-tmp\") pod \"klusterlet-addon-workmgr-65c4c6c7c8-frccb\" (UID: \"c71315fd-6856-4e7e-8c21-b0d9725475e3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-65c4c6c7c8-frccb"
Apr 22 18:21:49.588402 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.588167 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/173ff773-b08c-4806-8a1b-a7154e528f5e-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg\" (UID: \"173ff773-b08c-4806-8a1b-a7154e528f5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg"
Apr 22 18:21:49.590544 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.590493 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/173ff773-b08c-4806-8a1b-a7154e528f5e-ca\") pod \"cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg\" (UID: \"173ff773-b08c-4806-8a1b-a7154e528f5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg"
Apr 22 18:21:49.590646 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.590565 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/173ff773-b08c-4806-8a1b-a7154e528f5e-hub\") pod \"cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg\" (UID: \"173ff773-b08c-4806-8a1b-a7154e528f5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg"
Apr 22 18:21:49.590646 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.590589 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/c71315fd-6856-4e7e-8c21-b0d9725475e3-klusterlet-config\") pod \"klusterlet-addon-workmgr-65c4c6c7c8-frccb\" (UID: \"c71315fd-6856-4e7e-8c21-b0d9725475e3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-65c4c6c7c8-frccb"
Apr 22 18:21:49.590785 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.590759 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/173ff773-b08c-4806-8a1b-a7154e528f5e-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg\" (UID: \"173ff773-b08c-4806-8a1b-a7154e528f5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg"
Apr 22 18:21:49.590852 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.590838 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/173ff773-b08c-4806-8a1b-a7154e528f5e-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg\" (UID: \"173ff773-b08c-4806-8a1b-a7154e528f5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg"
Apr 22 18:21:49.596300 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.596258 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc5q2\" (UniqueName: \"kubernetes.io/projected/c71315fd-6856-4e7e-8c21-b0d9725475e3-kube-api-access-vc5q2\") pod \"klusterlet-addon-workmgr-65c4c6c7c8-frccb\" (UID: \"c71315fd-6856-4e7e-8c21-b0d9725475e3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-65c4c6c7c8-frccb"
Apr 22 18:21:49.596410 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.596367 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpgbp\" (UniqueName: \"kubernetes.io/projected/173ff773-b08c-4806-8a1b-a7154e528f5e-kube-api-access-lpgbp\") pod \"cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg\" (UID: \"173ff773-b08c-4806-8a1b-a7154e528f5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg"
Apr 22 18:21:49.712179 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.712093 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-65c4c6c7c8-frccb"
Apr 22 18:21:49.733192 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.732828 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg"
Apr 22 18:21:49.857952 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.857922 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-65c4c6c7c8-frccb"]
Apr 22 18:21:49.861571 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:21:49.861544 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc71315fd_6856_4e7e_8c21_b0d9725475e3.slice/crio-2afbe5546f23aa6417bd1ba733bfcc44ed4a8ce4699054dd345922a1ae064ca9 WatchSource:0}: Error finding container 2afbe5546f23aa6417bd1ba733bfcc44ed4a8ce4699054dd345922a1ae064ca9: Status 404 returned error can't find the container with id 2afbe5546f23aa6417bd1ba733bfcc44ed4a8ce4699054dd345922a1ae064ca9
Apr 22 18:21:49.877638 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.877612 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg"]
Apr 22 18:21:49.880418 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:21:49.880393 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod173ff773_b08c_4806_8a1b_a7154e528f5e.slice/crio-6662c73b086f9dd495f11a01beebe57d2f94b018a4300b8a70fefaad72db54d0 WatchSource:0}: Error finding container 6662c73b086f9dd495f11a01beebe57d2f94b018a4300b8a70fefaad72db54d0: Status 404 returned error can't find the container with id 6662c73b086f9dd495f11a01beebe57d2f94b018a4300b8a70fefaad72db54d0
Apr 22 18:21:49.981927 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.981836 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg" event={"ID":"173ff773-b08c-4806-8a1b-a7154e528f5e","Type":"ContainerStarted","Data":"6662c73b086f9dd495f11a01beebe57d2f94b018a4300b8a70fefaad72db54d0"}
Apr 22 18:21:49.982803 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:49.982782 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-65c4c6c7c8-frccb" event={"ID":"c71315fd-6856-4e7e-8c21-b0d9725475e3","Type":"ContainerStarted","Data":"2afbe5546f23aa6417bd1ba733bfcc44ed4a8ce4699054dd345922a1ae064ca9"}
Apr 22 18:21:51.990511 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:51.990476 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jftnj" event={"ID":"5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81","Type":"ContainerStarted","Data":"5b9d058352b992327a73b8fa2fed11eebb9910296569d6911f343a4eca848032"}
Apr 22 18:21:51.990930 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:51.990705 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-jftnj"
Apr 22 18:21:52.012614 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:52.011830 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-jftnj" podStartSLOduration=66.598836199 podStartE2EDuration="1m10.01180682s" podCreationTimestamp="2026-04-22 18:20:42 +0000 UTC" firstStartedPulling="2026-04-22 18:21:47.618779718 +0000 UTC m=+65.653653813" lastFinishedPulling="2026-04-22 18:21:51.031750341 +0000 UTC m=+69.066624434" observedRunningTime="2026-04-22 18:21:52.010416448 +0000 UTC m=+70.045290559" watchObservedRunningTime="2026-04-22 18:21:52.01180682 +0000 UTC m=+70.046680960"
Apr 22 18:21:53.996324 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:53.996284 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg" event={"ID":"173ff773-b08c-4806-8a1b-a7154e528f5e","Type":"ContainerStarted","Data":"09d50568d53ba57250f1761bc088dfe778f3553d338285aba66047be3141ab92"}
Apr 22 18:21:56.002063 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:56.002018 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-65c4c6c7c8-frccb" event={"ID":"c71315fd-6856-4e7e-8c21-b0d9725475e3","Type":"ContainerStarted","Data":"f711b35893a8fd6e363d75be50d8582daa0ca5c7f1f83404718a62a2ac0308e0"}
Apr 22 18:21:56.002532 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:56.002301 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-65c4c6c7c8-frccb"
Apr 22 18:21:56.004255 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:56.004217 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-65c4c6c7c8-frccb"
Apr 22 18:21:56.023274 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:56.023201 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-65c4c6c7c8-frccb" podStartSLOduration=1.908634772 podStartE2EDuration="7.023187366s" podCreationTimestamp="2026-04-22 18:21:49 +0000 UTC" firstStartedPulling="2026-04-22 18:21:49.86341688 +0000 UTC m=+67.898290967" lastFinishedPulling="2026-04-22 18:21:54.977969469 +0000 UTC m=+73.012843561" observedRunningTime="2026-04-22 18:21:56.02129057 +0000 UTC m=+74.056164682" watchObservedRunningTime="2026-04-22 18:21:56.023187366 +0000 UTC m=+74.058061476"
Apr 22 18:21:57.008868 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:57.008825 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg" event={"ID":"173ff773-b08c-4806-8a1b-a7154e528f5e","Type":"ContainerStarted","Data":"b9d659f5453a5da554b9f6b3c581d6d635b1f7172e0043c77dbb5775d9ea9744"}
Apr 22 18:21:57.008868 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:57.008867 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg" event={"ID":"173ff773-b08c-4806-8a1b-a7154e528f5e","Type":"ContainerStarted","Data":"ce0eac10cb92f3ec00c3a5b4e31941550a7dfb4137aed455209790e919e50a8f"}
Apr 22 18:21:57.031141 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:21:57.031094 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg" podStartSLOduration=1.449937395 podStartE2EDuration="8.031081265s" podCreationTimestamp="2026-04-22 18:21:49 +0000 UTC" firstStartedPulling="2026-04-22 18:21:49.882049074 +0000 UTC m=+67.916923161" lastFinishedPulling="2026-04-22 18:21:56.46319294 +0000 UTC m=+74.498067031" observedRunningTime="2026-04-22 18:21:57.029709661 +0000 UTC m=+75.064583772" watchObservedRunningTime="2026-04-22 18:21:57.031081265 +0000 UTC m=+75.065955374"
Apr 22 18:22:19.211490 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:22:19.211452 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7bb5cb6-67f2-4c61-ad95-b31a5972059f-cert\") pod \"ingress-canary-fhk6g\" (UID: \"d7bb5cb6-67f2-4c61-ad95-b31a5972059f\") " pod="openshift-ingress-canary/ingress-canary-fhk6g"
Apr 22 18:22:19.211954 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:22:19.211515 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1-metrics-tls\") pod \"dns-default-947mz\" (UID: \"52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1\") " pod="openshift-dns/dns-default-947mz"
Apr 22 18:22:19.211954
ip-10-0-128-248 kubenswrapper[2578]: E0422 18:22:19.211596 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:22:19.211954 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:22:19.211599 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:22:19.211954 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:22:19.211652 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1-metrics-tls podName:52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1 nodeName:}" failed. No retries permitted until 2026-04-22 18:23:23.211636596 +0000 UTC m=+161.246510684 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1-metrics-tls") pod "dns-default-947mz" (UID: "52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1") : secret "dns-default-metrics-tls" not found Apr 22 18:22:19.211954 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:22:19.211675 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7bb5cb6-67f2-4c61-ad95-b31a5972059f-cert podName:d7bb5cb6-67f2-4c61-ad95-b31a5972059f nodeName:}" failed. No retries permitted until 2026-04-22 18:23:23.211661368 +0000 UTC m=+161.246535459 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d7bb5cb6-67f2-4c61-ad95-b31a5972059f-cert") pod "ingress-canary-fhk6g" (UID: "d7bb5cb6-67f2-4c61-ad95-b31a5972059f") : secret "canary-serving-cert" not found Apr 22 18:22:22.995562 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:22:22.995525 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-jftnj" Apr 22 18:22:51.337903 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:22:51.337857 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d04757d-f880-4389-bdfa-265c70b5d789-metrics-certs\") pod \"network-metrics-daemon-zx8mc\" (UID: \"7d04757d-f880-4389-bdfa-265c70b5d789\") " pod="openshift-multus/network-metrics-daemon-zx8mc" Apr 22 18:22:51.338495 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:22:51.338018 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 18:22:51.338495 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:22:51.338116 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d04757d-f880-4389-bdfa-265c70b5d789-metrics-certs podName:7d04757d-f880-4389-bdfa-265c70b5d789 nodeName:}" failed. No retries permitted until 2026-04-22 18:24:53.338096172 +0000 UTC m=+251.372970277 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7d04757d-f880-4389-bdfa-265c70b5d789-metrics-certs") pod "network-metrics-daemon-zx8mc" (UID: "7d04757d-f880-4389-bdfa-265c70b5d789") : secret "metrics-daemon-secret" not found Apr 22 18:22:57.131979 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:22:57.131949 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-q979r_8468d537-923d-4e8a-a4bc-66ddc1d8ca1d/dns-node-resolver/0.log" Apr 22 18:22:58.131815 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:22:58.131785 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ll75d_bf66aaa8-f33b-4fa8-8d81-ede023dd01d9/node-ca/0.log" Apr 22 18:23:18.169607 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:18.169576 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-pxhs6"] Apr 22 18:23:18.172513 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:18.172494 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-pxhs6" Apr 22 18:23:18.178206 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:18.178181 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 18:23:18.179244 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:18.179206 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 18:23:18.179356 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:18.179302 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-4b6kd\"" Apr 22 18:23:18.179356 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:18.179319 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 18:23:18.179356 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:18.179324 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 18:23:18.189123 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:18.189102 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-pxhs6"] Apr 22 18:23:18.212364 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:18.212093 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/19276b6f-c2d1-44bd-8496-af548e90bab2-data-volume\") pod \"insights-runtime-extractor-pxhs6\" (UID: \"19276b6f-c2d1-44bd-8496-af548e90bab2\") " pod="openshift-insights/insights-runtime-extractor-pxhs6" Apr 22 18:23:18.212364 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:18.212164 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" 
(UniqueName: \"kubernetes.io/host-path/19276b6f-c2d1-44bd-8496-af548e90bab2-crio-socket\") pod \"insights-runtime-extractor-pxhs6\" (UID: \"19276b6f-c2d1-44bd-8496-af548e90bab2\") " pod="openshift-insights/insights-runtime-extractor-pxhs6" Apr 22 18:23:18.212364 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:18.212222 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/19276b6f-c2d1-44bd-8496-af548e90bab2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pxhs6\" (UID: \"19276b6f-c2d1-44bd-8496-af548e90bab2\") " pod="openshift-insights/insights-runtime-extractor-pxhs6" Apr 22 18:23:18.212364 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:18.212284 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzq4m\" (UniqueName: \"kubernetes.io/projected/19276b6f-c2d1-44bd-8496-af548e90bab2-kube-api-access-fzq4m\") pod \"insights-runtime-extractor-pxhs6\" (UID: \"19276b6f-c2d1-44bd-8496-af548e90bab2\") " pod="openshift-insights/insights-runtime-extractor-pxhs6" Apr 22 18:23:18.212364 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:18.212350 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/19276b6f-c2d1-44bd-8496-af548e90bab2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pxhs6\" (UID: \"19276b6f-c2d1-44bd-8496-af548e90bab2\") " pod="openshift-insights/insights-runtime-extractor-pxhs6" Apr 22 18:23:18.214689 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:18.214664 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-5q55v"] Apr 22 18:23:18.217404 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:18.217390 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-5q55v" Apr 22 18:23:18.221707 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:18.221690 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 18:23:18.222364 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:18.222344 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-kvfcx\"" Apr 22 18:23:18.222446 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:18.222391 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 18:23:18.235613 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:18.235585 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-5q55v"] Apr 22 18:23:18.312943 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:18.312910 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/19276b6f-c2d1-44bd-8496-af548e90bab2-data-volume\") pod \"insights-runtime-extractor-pxhs6\" (UID: \"19276b6f-c2d1-44bd-8496-af548e90bab2\") " pod="openshift-insights/insights-runtime-extractor-pxhs6" Apr 22 18:23:18.313118 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:18.312965 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/19276b6f-c2d1-44bd-8496-af548e90bab2-crio-socket\") pod \"insights-runtime-extractor-pxhs6\" (UID: \"19276b6f-c2d1-44bd-8496-af548e90bab2\") " pod="openshift-insights/insights-runtime-extractor-pxhs6" Apr 22 18:23:18.313118 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:18.312997 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/19276b6f-c2d1-44bd-8496-af548e90bab2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pxhs6\" (UID: \"19276b6f-c2d1-44bd-8496-af548e90bab2\") " pod="openshift-insights/insights-runtime-extractor-pxhs6" Apr 22 18:23:18.313118 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:18.313034 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fzq4m\" (UniqueName: \"kubernetes.io/projected/19276b6f-c2d1-44bd-8496-af548e90bab2-kube-api-access-fzq4m\") pod \"insights-runtime-extractor-pxhs6\" (UID: \"19276b6f-c2d1-44bd-8496-af548e90bab2\") " pod="openshift-insights/insights-runtime-extractor-pxhs6" Apr 22 18:23:18.313318 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:18.313118 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/19276b6f-c2d1-44bd-8496-af548e90bab2-crio-socket\") pod \"insights-runtime-extractor-pxhs6\" (UID: \"19276b6f-c2d1-44bd-8496-af548e90bab2\") " pod="openshift-insights/insights-runtime-extractor-pxhs6" Apr 22 18:23:18.313318 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:18.313178 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrkd5\" (UniqueName: \"kubernetes.io/projected/2e9d320b-195a-45d5-8b33-1c038e52b361-kube-api-access-nrkd5\") pod \"downloads-6bcc868b7-5q55v\" (UID: \"2e9d320b-195a-45d5-8b33-1c038e52b361\") " pod="openshift-console/downloads-6bcc868b7-5q55v" Apr 22 18:23:18.313318 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:18.313278 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/19276b6f-c2d1-44bd-8496-af548e90bab2-data-volume\") pod \"insights-runtime-extractor-pxhs6\" (UID: \"19276b6f-c2d1-44bd-8496-af548e90bab2\") " pod="openshift-insights/insights-runtime-extractor-pxhs6" Apr 22 18:23:18.313318 ip-10-0-128-248 
kubenswrapper[2578]: I0422 18:23:18.313289 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/19276b6f-c2d1-44bd-8496-af548e90bab2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pxhs6\" (UID: \"19276b6f-c2d1-44bd-8496-af548e90bab2\") " pod="openshift-insights/insights-runtime-extractor-pxhs6" Apr 22 18:23:18.313715 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:18.313634 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/19276b6f-c2d1-44bd-8496-af548e90bab2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pxhs6\" (UID: \"19276b6f-c2d1-44bd-8496-af548e90bab2\") " pod="openshift-insights/insights-runtime-extractor-pxhs6" Apr 22 18:23:18.315455 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:18.315433 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/19276b6f-c2d1-44bd-8496-af548e90bab2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pxhs6\" (UID: \"19276b6f-c2d1-44bd-8496-af548e90bab2\") " pod="openshift-insights/insights-runtime-extractor-pxhs6" Apr 22 18:23:18.327015 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:18.326985 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzq4m\" (UniqueName: \"kubernetes.io/projected/19276b6f-c2d1-44bd-8496-af548e90bab2-kube-api-access-fzq4m\") pod \"insights-runtime-extractor-pxhs6\" (UID: \"19276b6f-c2d1-44bd-8496-af548e90bab2\") " pod="openshift-insights/insights-runtime-extractor-pxhs6" Apr 22 18:23:18.351915 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:23:18.351887 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-947mz" 
podUID="52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1" Apr 22 18:23:18.358040 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:23:18.358014 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-fhk6g" podUID="d7bb5cb6-67f2-4c61-ad95-b31a5972059f" Apr 22 18:23:18.414046 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:18.414006 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nrkd5\" (UniqueName: \"kubernetes.io/projected/2e9d320b-195a-45d5-8b33-1c038e52b361-kube-api-access-nrkd5\") pod \"downloads-6bcc868b7-5q55v\" (UID: \"2e9d320b-195a-45d5-8b33-1c038e52b361\") " pod="openshift-console/downloads-6bcc868b7-5q55v" Apr 22 18:23:18.423314 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:18.423263 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrkd5\" (UniqueName: \"kubernetes.io/projected/2e9d320b-195a-45d5-8b33-1c038e52b361-kube-api-access-nrkd5\") pod \"downloads-6bcc868b7-5q55v\" (UID: \"2e9d320b-195a-45d5-8b33-1c038e52b361\") " pod="openshift-console/downloads-6bcc868b7-5q55v" Apr 22 18:23:18.481379 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:18.481359 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-pxhs6" Apr 22 18:23:18.526890 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:18.526359 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-5q55v" Apr 22 18:23:18.608365 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:18.608327 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-pxhs6"] Apr 22 18:23:18.612294 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:23:18.612261 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19276b6f_c2d1_44bd_8496_af548e90bab2.slice/crio-5380263ab629919feafc7ec7f38639742386754577407817c989d4029d2706a2 WatchSource:0}: Error finding container 5380263ab629919feafc7ec7f38639742386754577407817c989d4029d2706a2: Status 404 returned error can't find the container with id 5380263ab629919feafc7ec7f38639742386754577407817c989d4029d2706a2 Apr 22 18:23:18.652284 ip-10-0-128-248 kubenswrapper[2578]: E0422 18:23:18.652256 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-zx8mc" podUID="7d04757d-f880-4389-bdfa-265c70b5d789" Apr 22 18:23:18.654244 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:18.654202 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-5q55v"] Apr 22 18:23:18.658008 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:23:18.657981 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e9d320b_195a_45d5_8b33_1c038e52b361.slice/crio-6001cd55bfa589bebac77e10b9a30ed874016bf656f9520d2bb8f0c0414a015d WatchSource:0}: Error finding container 6001cd55bfa589bebac77e10b9a30ed874016bf656f9520d2bb8f0c0414a015d: Status 404 returned error can't find the container with id 6001cd55bfa589bebac77e10b9a30ed874016bf656f9520d2bb8f0c0414a015d Apr 22 18:23:19.198584 ip-10-0-128-248 kubenswrapper[2578]: I0422 
18:23:19.198542 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pxhs6" event={"ID":"19276b6f-c2d1-44bd-8496-af548e90bab2","Type":"ContainerStarted","Data":"e6b3fa2fc382c23f3b6455e817942c5cf453310a8a32d47377cab71ce905352e"} Apr 22 18:23:19.199036 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:19.198594 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pxhs6" event={"ID":"19276b6f-c2d1-44bd-8496-af548e90bab2","Type":"ContainerStarted","Data":"5380263ab629919feafc7ec7f38639742386754577407817c989d4029d2706a2"} Apr 22 18:23:19.200117 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:19.200086 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-5q55v" event={"ID":"2e9d320b-195a-45d5-8b33-1c038e52b361","Type":"ContainerStarted","Data":"6001cd55bfa589bebac77e10b9a30ed874016bf656f9520d2bb8f0c0414a015d"} Apr 22 18:23:19.200266 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:19.200149 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-947mz" Apr 22 18:23:20.208171 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:20.208134 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pxhs6" event={"ID":"19276b6f-c2d1-44bd-8496-af548e90bab2","Type":"ContainerStarted","Data":"4bee572126a57e1fb629904efda8ebaa4f81276477da959bacf13b44f5d6e415"} Apr 22 18:23:21.213083 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:21.212993 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pxhs6" event={"ID":"19276b6f-c2d1-44bd-8496-af548e90bab2","Type":"ContainerStarted","Data":"ab5570c484ff8960085a5b9f0e0872cb89e23fc8d62c36df4f0710e53fd4b95b"} Apr 22 18:23:21.236246 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:21.236185 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-pxhs6" podStartSLOduration=1.055692693 podStartE2EDuration="3.236172308s" podCreationTimestamp="2026-04-22 18:23:18 +0000 UTC" firstStartedPulling="2026-04-22 18:23:18.669370271 +0000 UTC m=+156.704244359" lastFinishedPulling="2026-04-22 18:23:20.849849886 +0000 UTC m=+158.884723974" observedRunningTime="2026-04-22 18:23:21.234724557 +0000 UTC m=+159.269598668" watchObservedRunningTime="2026-04-22 18:23:21.236172308 +0000 UTC m=+159.271046428" Apr 22 18:23:23.255782 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:23.255745 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1-metrics-tls\") pod \"dns-default-947mz\" (UID: \"52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1\") " pod="openshift-dns/dns-default-947mz" Apr 22 18:23:23.256208 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:23.255809 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/d7bb5cb6-67f2-4c61-ad95-b31a5972059f-cert\") pod \"ingress-canary-fhk6g\" (UID: \"d7bb5cb6-67f2-4c61-ad95-b31a5972059f\") " pod="openshift-ingress-canary/ingress-canary-fhk6g" Apr 22 18:23:23.258523 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:23.258498 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1-metrics-tls\") pod \"dns-default-947mz\" (UID: \"52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1\") " pod="openshift-dns/dns-default-947mz" Apr 22 18:23:23.258633 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:23.258565 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7bb5cb6-67f2-4c61-ad95-b31a5972059f-cert\") pod \"ingress-canary-fhk6g\" (UID: \"d7bb5cb6-67f2-4c61-ad95-b31a5972059f\") " pod="openshift-ingress-canary/ingress-canary-fhk6g" Apr 22 18:23:23.404318 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:23.404281 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-9d4f9\"" Apr 22 18:23:23.411732 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:23.411698 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-947mz" Apr 22 18:23:23.548207 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:23.548132 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-947mz"] Apr 22 18:23:23.551972 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:23:23.551938 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52d7d6c3_fe6a_45b8_8b04_1cbb6c3f98b1.slice/crio-dcbd77f8acfe3cc9993bbabba24e6d639a5c2a7815b0e717f19bd88b4d86e825 WatchSource:0}: Error finding container dcbd77f8acfe3cc9993bbabba24e6d639a5c2a7815b0e717f19bd88b4d86e825: Status 404 returned error can't find the container with id dcbd77f8acfe3cc9993bbabba24e6d639a5c2a7815b0e717f19bd88b4d86e825 Apr 22 18:23:24.227533 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:24.227487 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-947mz" event={"ID":"52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1","Type":"ContainerStarted","Data":"dcbd77f8acfe3cc9993bbabba24e6d639a5c2a7815b0e717f19bd88b4d86e825"} Apr 22 18:23:25.232381 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:25.232344 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-947mz" event={"ID":"52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1","Type":"ContainerStarted","Data":"782594225a9e37beb58461e36e78f4ac7e1b6f50ee89d87436b2ab2fa1257bd7"} Apr 22 18:23:25.232381 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:25.232382 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-947mz" event={"ID":"52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1","Type":"ContainerStarted","Data":"24cbe71bdcc7fd2934f60f3dd12ab438189893a987955a5064f898da7817add4"} Apr 22 18:23:25.232868 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:25.232484 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-947mz" Apr 22 18:23:25.267152 
ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:25.267093 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-947mz" podStartSLOduration=128.919890637 podStartE2EDuration="2m10.267075289s" podCreationTimestamp="2026-04-22 18:21:15 +0000 UTC" firstStartedPulling="2026-04-22 18:23:23.553864921 +0000 UTC m=+161.588739025" lastFinishedPulling="2026-04-22 18:23:24.90104959 +0000 UTC m=+162.935923677" observedRunningTime="2026-04-22 18:23:25.266677625 +0000 UTC m=+163.301551748" watchObservedRunningTime="2026-04-22 18:23:25.267075289 +0000 UTC m=+163.301949400" Apr 22 18:23:29.633338 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:29.633256 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zx8mc" Apr 22 18:23:29.734629 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:29.734558 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg" podUID="173ff773-b08c-4806-8a1b-a7154e528f5e" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 18:23:33.632850 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:33.632808 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fhk6g" Apr 22 18:23:33.635944 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:33.635916 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9pkrg\"" Apr 22 18:23:33.643508 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:33.643480 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fhk6g" Apr 22 18:23:33.801516 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:33.801482 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-7w8bc"] Apr 22 18:23:33.809572 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:33.809544 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7w8bc" Apr 22 18:23:33.812503 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:33.812480 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 22 18:23:33.812630 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:33.812609 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 22 18:23:33.812630 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:33.812615 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-gzrlz\"" Apr 22 18:23:33.813502 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:33.813476 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 18:23:33.813645 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:33.813622 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 18:23:33.814929 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:33.814903 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 18:23:33.830779 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:33.830754 2578 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/node-exporter-jccws"] Apr 22 18:23:33.837646 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:33.837624 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-jccws" Apr 22 18:23:33.841129 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:33.841109 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-7268c\"" Apr 22 18:23:33.846533 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:33.844154 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 18:23:33.846533 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:33.844488 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 18:23:33.846533 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:33.844676 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 18:23:33.910918 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:33.910844 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-7w8bc"] Apr 22 18:23:33.953127 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:33.953084 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/fab23829-7f89-4abc-ad9e-f76bcecd64d5-node-exporter-tls\") pod \"node-exporter-jccws\" (UID: \"fab23829-7f89-4abc-ad9e-f76bcecd64d5\") " pod="openshift-monitoring/node-exporter-jccws" Apr 22 18:23:33.953310 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:33.953143 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46jss\" (UniqueName: 
\"kubernetes.io/projected/fa0f2387-2074-4f15-b590-4e01da2ae06f-kube-api-access-46jss\") pod \"openshift-state-metrics-9d44df66c-7w8bc\" (UID: \"fa0f2387-2074-4f15-b590-4e01da2ae06f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7w8bc" Apr 22 18:23:33.953310 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:33.953171 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/fab23829-7f89-4abc-ad9e-f76bcecd64d5-root\") pod \"node-exporter-jccws\" (UID: \"fab23829-7f89-4abc-ad9e-f76bcecd64d5\") " pod="openshift-monitoring/node-exporter-jccws" Apr 22 18:23:33.953310 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:33.953247 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fab23829-7f89-4abc-ad9e-f76bcecd64d5-sys\") pod \"node-exporter-jccws\" (UID: \"fab23829-7f89-4abc-ad9e-f76bcecd64d5\") " pod="openshift-monitoring/node-exporter-jccws" Apr 22 18:23:33.953310 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:33.953291 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fab23829-7f89-4abc-ad9e-f76bcecd64d5-metrics-client-ca\") pod \"node-exporter-jccws\" (UID: \"fab23829-7f89-4abc-ad9e-f76bcecd64d5\") " pod="openshift-monitoring/node-exporter-jccws" Apr 22 18:23:33.953484 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:33.953319 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/fab23829-7f89-4abc-ad9e-f76bcecd64d5-node-exporter-wtmp\") pod \"node-exporter-jccws\" (UID: \"fab23829-7f89-4abc-ad9e-f76bcecd64d5\") " pod="openshift-monitoring/node-exporter-jccws" Apr 22 18:23:33.953484 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:33.953354 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fa0f2387-2074-4f15-b590-4e01da2ae06f-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-7w8bc\" (UID: \"fa0f2387-2074-4f15-b590-4e01da2ae06f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7w8bc" Apr 22 18:23:33.953484 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:33.953385 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/fab23829-7f89-4abc-ad9e-f76bcecd64d5-node-exporter-accelerators-collector-config\") pod \"node-exporter-jccws\" (UID: \"fab23829-7f89-4abc-ad9e-f76bcecd64d5\") " pod="openshift-monitoring/node-exporter-jccws" Apr 22 18:23:33.953484 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:33.953455 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fa0f2387-2074-4f15-b590-4e01da2ae06f-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-7w8bc\" (UID: \"fa0f2387-2074-4f15-b590-4e01da2ae06f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7w8bc" Apr 22 18:23:33.953676 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:33.953484 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/fab23829-7f89-4abc-ad9e-f76bcecd64d5-node-exporter-textfile\") pod \"node-exporter-jccws\" (UID: \"fab23829-7f89-4abc-ad9e-f76bcecd64d5\") " pod="openshift-monitoring/node-exporter-jccws" Apr 22 18:23:33.953676 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:33.953534 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-j9x6c\" (UniqueName: \"kubernetes.io/projected/fab23829-7f89-4abc-ad9e-f76bcecd64d5-kube-api-access-j9x6c\") pod \"node-exporter-jccws\" (UID: \"fab23829-7f89-4abc-ad9e-f76bcecd64d5\") " pod="openshift-monitoring/node-exporter-jccws" Apr 22 18:23:33.953676 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:33.953565 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fab23829-7f89-4abc-ad9e-f76bcecd64d5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jccws\" (UID: \"fab23829-7f89-4abc-ad9e-f76bcecd64d5\") " pod="openshift-monitoring/node-exporter-jccws" Apr 22 18:23:33.953676 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:33.953604 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fa0f2387-2074-4f15-b590-4e01da2ae06f-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-7w8bc\" (UID: \"fa0f2387-2074-4f15-b590-4e01da2ae06f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7w8bc" Apr 22 18:23:34.054380 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:34.054346 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/fab23829-7f89-4abc-ad9e-f76bcecd64d5-node-exporter-accelerators-collector-config\") pod \"node-exporter-jccws\" (UID: \"fab23829-7f89-4abc-ad9e-f76bcecd64d5\") " pod="openshift-monitoring/node-exporter-jccws" Apr 22 18:23:34.054588 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:34.054393 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fa0f2387-2074-4f15-b590-4e01da2ae06f-openshift-state-metrics-kube-rbac-proxy-config\") pod 
\"openshift-state-metrics-9d44df66c-7w8bc\" (UID: \"fa0f2387-2074-4f15-b590-4e01da2ae06f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7w8bc" Apr 22 18:23:34.054588 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:34.054421 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/fab23829-7f89-4abc-ad9e-f76bcecd64d5-node-exporter-textfile\") pod \"node-exporter-jccws\" (UID: \"fab23829-7f89-4abc-ad9e-f76bcecd64d5\") " pod="openshift-monitoring/node-exporter-jccws" Apr 22 18:23:34.054588 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:34.054553 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j9x6c\" (UniqueName: \"kubernetes.io/projected/fab23829-7f89-4abc-ad9e-f76bcecd64d5-kube-api-access-j9x6c\") pod \"node-exporter-jccws\" (UID: \"fab23829-7f89-4abc-ad9e-f76bcecd64d5\") " pod="openshift-monitoring/node-exporter-jccws" Apr 22 18:23:34.054745 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:34.054610 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fab23829-7f89-4abc-ad9e-f76bcecd64d5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jccws\" (UID: \"fab23829-7f89-4abc-ad9e-f76bcecd64d5\") " pod="openshift-monitoring/node-exporter-jccws" Apr 22 18:23:34.054745 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:34.054657 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fa0f2387-2074-4f15-b590-4e01da2ae06f-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-7w8bc\" (UID: \"fa0f2387-2074-4f15-b590-4e01da2ae06f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7w8bc" Apr 22 18:23:34.054745 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:34.054705 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/fab23829-7f89-4abc-ad9e-f76bcecd64d5-node-exporter-tls\") pod \"node-exporter-jccws\" (UID: \"fab23829-7f89-4abc-ad9e-f76bcecd64d5\") " pod="openshift-monitoring/node-exporter-jccws" Apr 22 18:23:34.054745 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:34.054725 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-46jss\" (UniqueName: \"kubernetes.io/projected/fa0f2387-2074-4f15-b590-4e01da2ae06f-kube-api-access-46jss\") pod \"openshift-state-metrics-9d44df66c-7w8bc\" (UID: \"fa0f2387-2074-4f15-b590-4e01da2ae06f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7w8bc" Apr 22 18:23:34.054745 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:34.054744 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/fab23829-7f89-4abc-ad9e-f76bcecd64d5-root\") pod \"node-exporter-jccws\" (UID: \"fab23829-7f89-4abc-ad9e-f76bcecd64d5\") " pod="openshift-monitoring/node-exporter-jccws" Apr 22 18:23:34.055002 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:34.054764 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fab23829-7f89-4abc-ad9e-f76bcecd64d5-sys\") pod \"node-exporter-jccws\" (UID: \"fab23829-7f89-4abc-ad9e-f76bcecd64d5\") " pod="openshift-monitoring/node-exporter-jccws" Apr 22 18:23:34.055002 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:34.054783 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fab23829-7f89-4abc-ad9e-f76bcecd64d5-metrics-client-ca\") pod \"node-exporter-jccws\" (UID: \"fab23829-7f89-4abc-ad9e-f76bcecd64d5\") " pod="openshift-monitoring/node-exporter-jccws" Apr 22 18:23:34.055002 ip-10-0-128-248 kubenswrapper[2578]: 
I0422 18:23:34.054811 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/fab23829-7f89-4abc-ad9e-f76bcecd64d5-node-exporter-wtmp\") pod \"node-exporter-jccws\" (UID: \"fab23829-7f89-4abc-ad9e-f76bcecd64d5\") " pod="openshift-monitoring/node-exporter-jccws" Apr 22 18:23:34.055002 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:34.054817 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/fab23829-7f89-4abc-ad9e-f76bcecd64d5-node-exporter-textfile\") pod \"node-exporter-jccws\" (UID: \"fab23829-7f89-4abc-ad9e-f76bcecd64d5\") " pod="openshift-monitoring/node-exporter-jccws" Apr 22 18:23:34.055002 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:34.054847 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fa0f2387-2074-4f15-b590-4e01da2ae06f-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-7w8bc\" (UID: \"fa0f2387-2074-4f15-b590-4e01da2ae06f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7w8bc" Apr 22 18:23:34.055002 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:34.054886 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fab23829-7f89-4abc-ad9e-f76bcecd64d5-sys\") pod \"node-exporter-jccws\" (UID: \"fab23829-7f89-4abc-ad9e-f76bcecd64d5\") " pod="openshift-monitoring/node-exporter-jccws" Apr 22 18:23:34.055002 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:34.054943 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/fab23829-7f89-4abc-ad9e-f76bcecd64d5-root\") pod \"node-exporter-jccws\" (UID: \"fab23829-7f89-4abc-ad9e-f76bcecd64d5\") " pod="openshift-monitoring/node-exporter-jccws" Apr 22 18:23:34.055427 ip-10-0-128-248 
kubenswrapper[2578]: I0422 18:23:34.055056 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/fab23829-7f89-4abc-ad9e-f76bcecd64d5-node-exporter-accelerators-collector-config\") pod \"node-exporter-jccws\" (UID: \"fab23829-7f89-4abc-ad9e-f76bcecd64d5\") " pod="openshift-monitoring/node-exporter-jccws" Apr 22 18:23:34.055427 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:34.055211 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/fab23829-7f89-4abc-ad9e-f76bcecd64d5-node-exporter-wtmp\") pod \"node-exporter-jccws\" (UID: \"fab23829-7f89-4abc-ad9e-f76bcecd64d5\") " pod="openshift-monitoring/node-exporter-jccws" Apr 22 18:23:34.055543 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:34.055443 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fab23829-7f89-4abc-ad9e-f76bcecd64d5-metrics-client-ca\") pod \"node-exporter-jccws\" (UID: \"fab23829-7f89-4abc-ad9e-f76bcecd64d5\") " pod="openshift-monitoring/node-exporter-jccws" Apr 22 18:23:34.055849 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:34.055794 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fa0f2387-2074-4f15-b590-4e01da2ae06f-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-7w8bc\" (UID: \"fa0f2387-2074-4f15-b590-4e01da2ae06f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7w8bc" Apr 22 18:23:34.057703 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:34.057668 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fab23829-7f89-4abc-ad9e-f76bcecd64d5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jccws\" (UID: 
\"fab23829-7f89-4abc-ad9e-f76bcecd64d5\") " pod="openshift-monitoring/node-exporter-jccws" Apr 22 18:23:34.058221 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:34.058197 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fa0f2387-2074-4f15-b590-4e01da2ae06f-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-7w8bc\" (UID: \"fa0f2387-2074-4f15-b590-4e01da2ae06f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7w8bc" Apr 22 18:23:34.058357 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:34.058300 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fa0f2387-2074-4f15-b590-4e01da2ae06f-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-7w8bc\" (UID: \"fa0f2387-2074-4f15-b590-4e01da2ae06f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7w8bc" Apr 22 18:23:34.062668 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:34.062641 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/fab23829-7f89-4abc-ad9e-f76bcecd64d5-node-exporter-tls\") pod \"node-exporter-jccws\" (UID: \"fab23829-7f89-4abc-ad9e-f76bcecd64d5\") " pod="openshift-monitoring/node-exporter-jccws" Apr 22 18:23:34.064659 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:34.064634 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9x6c\" (UniqueName: \"kubernetes.io/projected/fab23829-7f89-4abc-ad9e-f76bcecd64d5-kube-api-access-j9x6c\") pod \"node-exporter-jccws\" (UID: \"fab23829-7f89-4abc-ad9e-f76bcecd64d5\") " pod="openshift-monitoring/node-exporter-jccws" Apr 22 18:23:34.065179 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:34.065157 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-46jss\" (UniqueName: \"kubernetes.io/projected/fa0f2387-2074-4f15-b590-4e01da2ae06f-kube-api-access-46jss\") pod \"openshift-state-metrics-9d44df66c-7w8bc\" (UID: \"fa0f2387-2074-4f15-b590-4e01da2ae06f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7w8bc" Apr 22 18:23:34.121034 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:34.120992 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7w8bc" Apr 22 18:23:34.153069 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:34.153040 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-jccws" Apr 22 18:23:34.183916 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:23:34.183884 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfab23829_7f89_4abc_ad9e_f76bcecd64d5.slice/crio-2abbfc0dd5ce9bd9768d7978fe1b79d26a612e9c444b90028b8e62d8a85e8dd2 WatchSource:0}: Error finding container 2abbfc0dd5ce9bd9768d7978fe1b79d26a612e9c444b90028b8e62d8a85e8dd2: Status 404 returned error can't find the container with id 2abbfc0dd5ce9bd9768d7978fe1b79d26a612e9c444b90028b8e62d8a85e8dd2 Apr 22 18:23:34.267589 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:34.267526 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jccws" event={"ID":"fab23829-7f89-4abc-ad9e-f76bcecd64d5","Type":"ContainerStarted","Data":"2abbfc0dd5ce9bd9768d7978fe1b79d26a612e9c444b90028b8e62d8a85e8dd2"} Apr 22 18:23:34.333636 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:34.333602 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fhk6g"] Apr 22 18:23:34.348543 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:23:34.348514 2578 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7bb5cb6_67f2_4c61_ad95_b31a5972059f.slice/crio-d22fa33c57ecf9d63836396231e25cba056078aee935a87340271633772a58c8 WatchSource:0}: Error finding container d22fa33c57ecf9d63836396231e25cba056078aee935a87340271633772a58c8: Status 404 returned error can't find the container with id d22fa33c57ecf9d63836396231e25cba056078aee935a87340271633772a58c8 Apr 22 18:23:34.349315 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:23:34.349292 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa0f2387_2074_4f15_b590_4e01da2ae06f.slice/crio-e38ea1aa358c8b5f28f5c9fcf1180c28cbd00590f97731d0a93d625b5398276c WatchSource:0}: Error finding container e38ea1aa358c8b5f28f5c9fcf1180c28cbd00590f97731d0a93d625b5398276c: Status 404 returned error can't find the container with id e38ea1aa358c8b5f28f5c9fcf1180c28cbd00590f97731d0a93d625b5398276c Apr 22 18:23:34.349537 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:34.349518 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-7w8bc"] Apr 22 18:23:35.238558 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:35.238529 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-947mz" Apr 22 18:23:35.274401 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:35.274332 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-5q55v" event={"ID":"2e9d320b-195a-45d5-8b33-1c038e52b361","Type":"ContainerStarted","Data":"8e3274ac2ac0df9315332f4d2283f16ac04c31221ab8208afdb1874dcf1deb87"} Apr 22 18:23:35.274580 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:35.274416 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-5q55v" Apr 22 18:23:35.276359 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:35.275697 
2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fhk6g" event={"ID":"d7bb5cb6-67f2-4c61-ad95-b31a5972059f","Type":"ContainerStarted","Data":"d22fa33c57ecf9d63836396231e25cba056078aee935a87340271633772a58c8"} Apr 22 18:23:35.278614 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:35.278587 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7w8bc" event={"ID":"fa0f2387-2074-4f15-b590-4e01da2ae06f","Type":"ContainerStarted","Data":"cd01a04ef4177859e00ba49fbe19e3f8c1b1232a9971f327076c56705e636f2c"} Apr 22 18:23:35.278737 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:35.278623 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7w8bc" event={"ID":"fa0f2387-2074-4f15-b590-4e01da2ae06f","Type":"ContainerStarted","Data":"82a1f1dc94784b7c180c8382f441ce8c876ff41e45174e0f32107493366b20f4"} Apr 22 18:23:35.278737 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:35.278637 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7w8bc" event={"ID":"fa0f2387-2074-4f15-b590-4e01da2ae06f","Type":"ContainerStarted","Data":"e38ea1aa358c8b5f28f5c9fcf1180c28cbd00590f97731d0a93d625b5398276c"} Apr 22 18:23:35.287853 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:35.287829 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-5q55v" Apr 22 18:23:35.292374 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:35.292176 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-5q55v" podStartSLOduration=1.517257743 podStartE2EDuration="17.292158281s" podCreationTimestamp="2026-04-22 18:23:18 +0000 UTC" firstStartedPulling="2026-04-22 18:23:18.659716971 +0000 UTC m=+156.694591076" lastFinishedPulling="2026-04-22 18:23:34.434617511 +0000 UTC 
m=+172.469491614" observedRunningTime="2026-04-22 18:23:35.291606974 +0000 UTC m=+173.326481095" watchObservedRunningTime="2026-04-22 18:23:35.292158281 +0000 UTC m=+173.327032392" Apr 22 18:23:36.293032 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:36.292959 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jccws" event={"ID":"fab23829-7f89-4abc-ad9e-f76bcecd64d5","Type":"ContainerStarted","Data":"27d0977ca28beda619e66b6fcd47a8b87b7fc61d949e9b35edb496a4a379aca7"} Apr 22 18:23:37.296698 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:37.296664 2578 generic.go:358] "Generic (PLEG): container finished" podID="fab23829-7f89-4abc-ad9e-f76bcecd64d5" containerID="27d0977ca28beda619e66b6fcd47a8b87b7fc61d949e9b35edb496a4a379aca7" exitCode=0 Apr 22 18:23:37.297127 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:37.296762 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jccws" event={"ID":"fab23829-7f89-4abc-ad9e-f76bcecd64d5","Type":"ContainerDied","Data":"27d0977ca28beda619e66b6fcd47a8b87b7fc61d949e9b35edb496a4a379aca7"} Apr 22 18:23:37.299220 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:37.299190 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fhk6g" event={"ID":"d7bb5cb6-67f2-4c61-ad95-b31a5972059f","Type":"ContainerStarted","Data":"a3735368393ac4b196f0531fa5fdd6d8250769e5d9c87aa6e55ca20343e4543a"} Apr 22 18:23:37.302793 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:37.302721 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7w8bc" event={"ID":"fa0f2387-2074-4f15-b590-4e01da2ae06f","Type":"ContainerStarted","Data":"8337fa91d6da842aaa930aef5d5ab43a6f16b23192118783cce1cfd977cd3367"} Apr 22 18:23:37.339139 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:37.338414 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7w8bc" podStartSLOduration=1.836584153 podStartE2EDuration="4.338396069s" podCreationTimestamp="2026-04-22 18:23:33 +0000 UTC" firstStartedPulling="2026-04-22 18:23:34.66184957 +0000 UTC m=+172.696723658" lastFinishedPulling="2026-04-22 18:23:37.163661486 +0000 UTC m=+175.198535574" observedRunningTime="2026-04-22 18:23:37.337543203 +0000 UTC m=+175.372417313" watchObservedRunningTime="2026-04-22 18:23:37.338396069 +0000 UTC m=+175.373270180" Apr 22 18:23:37.354579 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:37.354527 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-fhk6g" podStartSLOduration=139.574370953 podStartE2EDuration="2m22.35450814s" podCreationTimestamp="2026-04-22 18:21:15 +0000 UTC" firstStartedPulling="2026-04-22 18:23:34.381395691 +0000 UTC m=+172.416269780" lastFinishedPulling="2026-04-22 18:23:37.161532865 +0000 UTC m=+175.196406967" observedRunningTime="2026-04-22 18:23:37.35342776 +0000 UTC m=+175.388301870" watchObservedRunningTime="2026-04-22 18:23:37.35450814 +0000 UTC m=+175.389382252" Apr 22 18:23:38.306865 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:38.306823 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jccws" event={"ID":"fab23829-7f89-4abc-ad9e-f76bcecd64d5","Type":"ContainerStarted","Data":"6c9128e51ec9b434b7fba12bc584d8ee81f295c58166c388d68dcd13d47fb423"} Apr 22 18:23:38.306865 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:38.306872 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jccws" event={"ID":"fab23829-7f89-4abc-ad9e-f76bcecd64d5","Type":"ContainerStarted","Data":"58db9f5f093e9752d557431a0bc59474f6f67cd326e7fdca52750c8bd527e214"} Apr 22 18:23:38.332431 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:38.332378 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/node-exporter-jccws" podStartSLOduration=4.107701558 podStartE2EDuration="5.332362995s" podCreationTimestamp="2026-04-22 18:23:33 +0000 UTC" firstStartedPulling="2026-04-22 18:23:34.186160872 +0000 UTC m=+172.221034960" lastFinishedPulling="2026-04-22 18:23:35.410822295 +0000 UTC m=+173.445696397" observedRunningTime="2026-04-22 18:23:38.330293767 +0000 UTC m=+176.365167879" watchObservedRunningTime="2026-04-22 18:23:38.332362995 +0000 UTC m=+176.367237104" Apr 22 18:23:39.733899 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:39.733862 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg" podUID="173ff773-b08c-4806-8a1b-a7154e528f5e" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 18:23:40.036183 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.036143 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:23:40.058161 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.058127 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:23:40.058378 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.058354 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.061010 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.060987 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-hgk95\"" Apr 22 18:23:40.061135 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.061118 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 18:23:40.061212 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.061078 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 18:23:40.061316 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.061298 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 18:23:40.061361 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.061341 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 18:23:40.061499 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.060995 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 18:23:40.061680 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.061660 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 18:23:40.062006 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.061978 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 18:23:40.062626 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.062605 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 18:23:40.062726 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.062658 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 18:23:40.062829 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.062814 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 18:23:40.062867 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.062834 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 18:23:40.063488 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.063471 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-d5t4sqmd2pu1d\"" Apr 22 18:23:40.063596 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.063508 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 18:23:40.064887 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.064865 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 18:23:40.219860 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.219825 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.219860 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.219867 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-config\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.220109 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.219907 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.220109 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.219936 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.220109 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.219957 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.220109 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.219984 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-config-out\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.220109 ip-10-0-128-248 kubenswrapper[2578]: 
I0422 18:23:40.220052 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.220381 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.220118 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.220381 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.220157 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.220381 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.220259 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.220381 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.220294 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96cqn\" (UniqueName: \"kubernetes.io/projected/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-kube-api-access-96cqn\") pod 
\"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.220381 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.220329 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-web-config\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.220381 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.220358 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.220651 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.220469 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.220651 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.220499 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.220651 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.220540 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.220651 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.220569 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.220651 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.220615 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.321902 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.321821 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.322073 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.321902 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: 
\"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.322073 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.321928 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.322073 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.321959 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.322073 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.321976 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.322073 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.322022 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.322355 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.322076 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.322355 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.322103 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-config\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.322355 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.322132 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.322355 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.322152 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.322355 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.322169 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.322355 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.322187 2578 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-config-out\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.322355 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.322206 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.322355 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.322221 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.322355 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.322259 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.322355 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.322278 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.322355 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.322298 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-96cqn\" (UniqueName: \"kubernetes.io/projected/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-kube-api-access-96cqn\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.322355 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.322305 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.322355 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.322331 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-web-config\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.323918 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.323890 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.324092 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.323979 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.325244 ip-10-0-128-248 kubenswrapper[2578]: 
I0422 18:23:40.325202 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.326526 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.325473 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-web-config\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.326526 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.325492 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.326526 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.325808 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-config\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.326526 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.325847 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.326526 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.326113 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.326526 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.326458 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.327342 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.327289 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.328321 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.328148 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.328425 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.328367 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-config-out\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.328484 ip-10-0-128-248 
kubenswrapper[2578]: I0422 18:23:40.328434 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.329021 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.328976 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.329292 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.329271 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.330632 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.330598 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.334815 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.334796 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-96cqn\" (UniqueName: \"kubernetes.io/projected/71759ecc-afc4-4d79-bbb6-1e7abb060ea4-kube-api-access-96cqn\") pod \"prometheus-k8s-0\" (UID: \"71759ecc-afc4-4d79-bbb6-1e7abb060ea4\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.370857 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.370828 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:23:40.516457 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:40.516412 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:23:40.521531 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:23:40.521502 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71759ecc_afc4_4d79_bbb6_1e7abb060ea4.slice/crio-dd640d61bd8a711c38b0848981e5b73bc84851ebc79c589ff06a619aa3df1142 WatchSource:0}: Error finding container dd640d61bd8a711c38b0848981e5b73bc84851ebc79c589ff06a619aa3df1142: Status 404 returned error can't find the container with id dd640d61bd8a711c38b0848981e5b73bc84851ebc79c589ff06a619aa3df1142 Apr 22 18:23:41.326150 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:41.326107 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"71759ecc-afc4-4d79-bbb6-1e7abb060ea4","Type":"ContainerStarted","Data":"dd640d61bd8a711c38b0848981e5b73bc84851ebc79c589ff06a619aa3df1142"} Apr 22 18:23:43.334989 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:43.334944 2578 generic.go:358] "Generic (PLEG): container finished" podID="71759ecc-afc4-4d79-bbb6-1e7abb060ea4" containerID="8a9865148bd37dcff7693ab1fa714d27508b711935613d34d6b1554c42118835" exitCode=0 Apr 22 18:23:43.335449 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:43.335020 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"71759ecc-afc4-4d79-bbb6-1e7abb060ea4","Type":"ContainerDied","Data":"8a9865148bd37dcff7693ab1fa714d27508b711935613d34d6b1554c42118835"} Apr 22 18:23:47.346871 ip-10-0-128-248 kubenswrapper[2578]: I0422 
18:23:47.346797 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"71759ecc-afc4-4d79-bbb6-1e7abb060ea4","Type":"ContainerStarted","Data":"c89658e010ec8c66724a5ee7020a995c2eaf16c2ec43549c0a5e4a69e4c1b2f0"} Apr 22 18:23:47.346871 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:47.346831 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"71759ecc-afc4-4d79-bbb6-1e7abb060ea4","Type":"ContainerStarted","Data":"ac451d9888b0bdc1e3fbd1b8d936503cadcb235d792c0e116150543871676ef9"} Apr 22 18:23:49.734213 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:49.734159 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg" podUID="173ff773-b08c-4806-8a1b-a7154e528f5e" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 18:23:49.734701 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:49.734274 2578 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg" Apr 22 18:23:49.734842 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:49.734806 2578 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"b9d659f5453a5da554b9f6b3c581d6d635b1f7172e0043c77dbb5775d9ea9744"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 22 18:23:49.734903 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:49.734883 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg" podUID="173ff773-b08c-4806-8a1b-a7154e528f5e" containerName="service-proxy" 
containerID="cri-o://b9d659f5453a5da554b9f6b3c581d6d635b1f7172e0043c77dbb5775d9ea9744" gracePeriod=30 Apr 22 18:23:50.356659 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:50.356627 2578 generic.go:358] "Generic (PLEG): container finished" podID="173ff773-b08c-4806-8a1b-a7154e528f5e" containerID="b9d659f5453a5da554b9f6b3c581d6d635b1f7172e0043c77dbb5775d9ea9744" exitCode=2 Apr 22 18:23:50.356760 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:50.356699 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg" event={"ID":"173ff773-b08c-4806-8a1b-a7154e528f5e","Type":"ContainerDied","Data":"b9d659f5453a5da554b9f6b3c581d6d635b1f7172e0043c77dbb5775d9ea9744"} Apr 22 18:23:50.356760 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:50.356747 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f8ddfdcdf-rzvmg" event={"ID":"173ff773-b08c-4806-8a1b-a7154e528f5e","Type":"ContainerStarted","Data":"9c02e5f890e6439e652755f9586ecede7fbc383a6e98280b0034992d600f4edd"} Apr 22 18:23:51.362077 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:51.362038 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"71759ecc-afc4-4d79-bbb6-1e7abb060ea4","Type":"ContainerStarted","Data":"1e235035baa8c30744b3767fc687c60847ddb513e60bb5c9b36ccaf1d2eee382"} Apr 22 18:23:51.362077 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:51.362077 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"71759ecc-afc4-4d79-bbb6-1e7abb060ea4","Type":"ContainerStarted","Data":"078fbc163a902bdf725c4e927a9a84fd0e93830ca817d7ada56ed31efcdc03db"} Apr 22 18:23:51.362077 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:51.362086 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"71759ecc-afc4-4d79-bbb6-1e7abb060ea4","Type":"ContainerStarted","Data":"1d6612b7ce27623417b89cdbad9ee7f376ae7356140e188090462c7ecb5edfee"} Apr 22 18:23:51.362681 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:51.362095 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"71759ecc-afc4-4d79-bbb6-1e7abb060ea4","Type":"ContainerStarted","Data":"716e7746d0350eb1521dab09ce8cf1133e2ce5d058da38257b6a5360e15bb889"} Apr 22 18:23:51.397906 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:51.397857 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.656207499 podStartE2EDuration="11.397842728s" podCreationTimestamp="2026-04-22 18:23:40 +0000 UTC" firstStartedPulling="2026-04-22 18:23:40.524171586 +0000 UTC m=+178.559045674" lastFinishedPulling="2026-04-22 18:23:50.265806604 +0000 UTC m=+188.300680903" observedRunningTime="2026-04-22 18:23:51.396420977 +0000 UTC m=+189.431295092" watchObservedRunningTime="2026-04-22 18:23:51.397842728 +0000 UTC m=+189.432716838" Apr 22 18:23:55.371494 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:23:55.371458 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:24:40.371280 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:24:40.371211 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:24:40.391380 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:24:40.391356 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:24:40.511804 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:24:40.511777 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:24:53.426777 ip-10-0-128-248 kubenswrapper[2578]: I0422 
18:24:53.426739 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d04757d-f880-4389-bdfa-265c70b5d789-metrics-certs\") pod \"network-metrics-daemon-zx8mc\" (UID: \"7d04757d-f880-4389-bdfa-265c70b5d789\") " pod="openshift-multus/network-metrics-daemon-zx8mc" Apr 22 18:24:53.429194 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:24:53.429176 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d04757d-f880-4389-bdfa-265c70b5d789-metrics-certs\") pod \"network-metrics-daemon-zx8mc\" (UID: \"7d04757d-f880-4389-bdfa-265c70b5d789\") " pod="openshift-multus/network-metrics-daemon-zx8mc" Apr 22 18:24:53.636652 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:24:53.636620 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-fll8t\"" Apr 22 18:24:53.644012 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:24:53.643991 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zx8mc" Apr 22 18:24:53.763022 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:24:53.762990 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zx8mc"] Apr 22 18:24:53.766051 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:24:53.766017 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d04757d_f880_4389_bdfa_265c70b5d789.slice/crio-0725e25cc301de984d9730357817da8165962419cb1b8fd41c6b247abd17883f WatchSource:0}: Error finding container 0725e25cc301de984d9730357817da8165962419cb1b8fd41c6b247abd17883f: Status 404 returned error can't find the container with id 0725e25cc301de984d9730357817da8165962419cb1b8fd41c6b247abd17883f Apr 22 18:24:54.537092 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:24:54.537051 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zx8mc" event={"ID":"7d04757d-f880-4389-bdfa-265c70b5d789","Type":"ContainerStarted","Data":"0725e25cc301de984d9730357817da8165962419cb1b8fd41c6b247abd17883f"} Apr 22 18:24:55.541086 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:24:55.541054 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zx8mc" event={"ID":"7d04757d-f880-4389-bdfa-265c70b5d789","Type":"ContainerStarted","Data":"552a6d184c44cdf1671be30e242c0144837e1db1b20399c92630d454e48f288f"} Apr 22 18:24:55.541086 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:24:55.541087 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zx8mc" event={"ID":"7d04757d-f880-4389-bdfa-265c70b5d789","Type":"ContainerStarted","Data":"90cb54c90fc903c57327cf94d95cb835da0a50f5a69e7df97bee2ed3e7272799"} Apr 22 18:24:55.558858 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:24:55.558810 2578 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-zx8mc" podStartSLOduration=252.674120526 podStartE2EDuration="4m13.558795997s" podCreationTimestamp="2026-04-22 18:20:42 +0000 UTC" firstStartedPulling="2026-04-22 18:24:53.767749858 +0000 UTC m=+251.802623946" lastFinishedPulling="2026-04-22 18:24:54.652425326 +0000 UTC m=+252.687299417" observedRunningTime="2026-04-22 18:24:55.556851325 +0000 UTC m=+253.591725436" watchObservedRunningTime="2026-04-22 18:24:55.558795997 +0000 UTC m=+253.593670107" Apr 22 18:25:42.502100 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:25:42.502077 2578 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 18:26:02.260747 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:02.260720 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-gtjmt_bb007119-ecfd-4e25-867d-4c7cfb97c10b/global-pull-secret-syncer/0.log" Apr 22 18:26:02.421646 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:02.421617 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-jgjjg_4acd2902-b094-4f0a-a902-526fdb5ba7de/konnectivity-agent/0.log" Apr 22 18:26:02.447106 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:02.447074 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-248.ec2.internal_d579a59ca2c3bf4b3f744c41961ff1e1/haproxy/0.log" Apr 22 18:26:06.188324 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:06.188296 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jccws_fab23829-7f89-4abc-ad9e-f76bcecd64d5/node-exporter/0.log" Apr 22 18:26:06.216894 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:06.216869 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jccws_fab23829-7f89-4abc-ad9e-f76bcecd64d5/kube-rbac-proxy/0.log" Apr 22 18:26:06.261461 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:06.261436 2578 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jccws_fab23829-7f89-4abc-ad9e-f76bcecd64d5/init-textfile/0.log" Apr 22 18:26:06.445196 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:06.445113 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-7w8bc_fa0f2387-2074-4f15-b590-4e01da2ae06f/kube-rbac-proxy-main/0.log" Apr 22 18:26:06.478597 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:06.478574 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-7w8bc_fa0f2387-2074-4f15-b590-4e01da2ae06f/kube-rbac-proxy-self/0.log" Apr 22 18:26:06.507861 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:06.507830 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-7w8bc_fa0f2387-2074-4f15-b590-4e01da2ae06f/openshift-state-metrics/0.log" Apr 22 18:26:06.558451 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:06.558423 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_71759ecc-afc4-4d79-bbb6-1e7abb060ea4/prometheus/0.log" Apr 22 18:26:06.588245 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:06.588199 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_71759ecc-afc4-4d79-bbb6-1e7abb060ea4/config-reloader/0.log" Apr 22 18:26:06.619154 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:06.619079 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_71759ecc-afc4-4d79-bbb6-1e7abb060ea4/thanos-sidecar/0.log" Apr 22 18:26:06.645979 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:06.645959 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_71759ecc-afc4-4d79-bbb6-1e7abb060ea4/kube-rbac-proxy-web/0.log" Apr 22 18:26:06.670290 ip-10-0-128-248 
kubenswrapper[2578]: I0422 18:26:06.670270 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_71759ecc-afc4-4d79-bbb6-1e7abb060ea4/kube-rbac-proxy/0.log" Apr 22 18:26:06.698102 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:06.698041 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_71759ecc-afc4-4d79-bbb6-1e7abb060ea4/kube-rbac-proxy-thanos/0.log" Apr 22 18:26:06.723734 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:06.723714 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_71759ecc-afc4-4d79-bbb6-1e7abb060ea4/init-config-reloader/0.log" Apr 22 18:26:08.491876 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:08.491841 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5v9l6/perf-node-gather-daemonset-fpdm4"] Apr 22 18:26:08.495205 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:08.495184 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-fpdm4" Apr 22 18:26:08.497931 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:08.497901 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5v9l6\"/\"openshift-service-ca.crt\"" Apr 22 18:26:08.498985 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:08.498962 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5v9l6\"/\"kube-root-ca.crt\"" Apr 22 18:26:08.499093 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:08.498962 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-5v9l6\"/\"default-dockercfg-p2l25\"" Apr 22 18:26:08.503875 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:08.503842 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5v9l6/perf-node-gather-daemonset-fpdm4"] Apr 22 18:26:08.597125 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:08.597095 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b58d6efa-5e8f-4c92-9fac-716a583c90c7-sys\") pod \"perf-node-gather-daemonset-fpdm4\" (UID: \"b58d6efa-5e8f-4c92-9fac-716a583c90c7\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-fpdm4" Apr 22 18:26:08.597358 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:08.597140 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b58d6efa-5e8f-4c92-9fac-716a583c90c7-podres\") pod \"perf-node-gather-daemonset-fpdm4\" (UID: \"b58d6efa-5e8f-4c92-9fac-716a583c90c7\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-fpdm4" Apr 22 18:26:08.597358 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:08.597170 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b58d6efa-5e8f-4c92-9fac-716a583c90c7-proc\") pod \"perf-node-gather-daemonset-fpdm4\" (UID: \"b58d6efa-5e8f-4c92-9fac-716a583c90c7\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-fpdm4" Apr 22 18:26:08.597358 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:08.597218 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs9sb\" (UniqueName: \"kubernetes.io/projected/b58d6efa-5e8f-4c92-9fac-716a583c90c7-kube-api-access-rs9sb\") pod \"perf-node-gather-daemonset-fpdm4\" (UID: \"b58d6efa-5e8f-4c92-9fac-716a583c90c7\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-fpdm4" Apr 22 18:26:08.597358 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:08.597282 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b58d6efa-5e8f-4c92-9fac-716a583c90c7-lib-modules\") pod \"perf-node-gather-daemonset-fpdm4\" (UID: \"b58d6efa-5e8f-4c92-9fac-716a583c90c7\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-fpdm4" Apr 22 18:26:08.698695 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:08.698661 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b58d6efa-5e8f-4c92-9fac-716a583c90c7-sys\") pod \"perf-node-gather-daemonset-fpdm4\" (UID: \"b58d6efa-5e8f-4c92-9fac-716a583c90c7\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-fpdm4" Apr 22 18:26:08.698861 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:08.698706 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b58d6efa-5e8f-4c92-9fac-716a583c90c7-podres\") pod \"perf-node-gather-daemonset-fpdm4\" (UID: \"b58d6efa-5e8f-4c92-9fac-716a583c90c7\") " 
pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-fpdm4" Apr 22 18:26:08.698861 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:08.698729 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b58d6efa-5e8f-4c92-9fac-716a583c90c7-proc\") pod \"perf-node-gather-daemonset-fpdm4\" (UID: \"b58d6efa-5e8f-4c92-9fac-716a583c90c7\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-fpdm4" Apr 22 18:26:08.698861 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:08.698758 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rs9sb\" (UniqueName: \"kubernetes.io/projected/b58d6efa-5e8f-4c92-9fac-716a583c90c7-kube-api-access-rs9sb\") pod \"perf-node-gather-daemonset-fpdm4\" (UID: \"b58d6efa-5e8f-4c92-9fac-716a583c90c7\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-fpdm4" Apr 22 18:26:08.698861 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:08.698781 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b58d6efa-5e8f-4c92-9fac-716a583c90c7-lib-modules\") pod \"perf-node-gather-daemonset-fpdm4\" (UID: \"b58d6efa-5e8f-4c92-9fac-716a583c90c7\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-fpdm4" Apr 22 18:26:08.698861 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:08.698809 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b58d6efa-5e8f-4c92-9fac-716a583c90c7-sys\") pod \"perf-node-gather-daemonset-fpdm4\" (UID: \"b58d6efa-5e8f-4c92-9fac-716a583c90c7\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-fpdm4" Apr 22 18:26:08.698861 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:08.698837 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: 
\"kubernetes.io/host-path/b58d6efa-5e8f-4c92-9fac-716a583c90c7-proc\") pod \"perf-node-gather-daemonset-fpdm4\" (UID: \"b58d6efa-5e8f-4c92-9fac-716a583c90c7\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-fpdm4" Apr 22 18:26:08.699072 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:08.698877 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b58d6efa-5e8f-4c92-9fac-716a583c90c7-podres\") pod \"perf-node-gather-daemonset-fpdm4\" (UID: \"b58d6efa-5e8f-4c92-9fac-716a583c90c7\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-fpdm4" Apr 22 18:26:08.699072 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:08.698908 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b58d6efa-5e8f-4c92-9fac-716a583c90c7-lib-modules\") pod \"perf-node-gather-daemonset-fpdm4\" (UID: \"b58d6efa-5e8f-4c92-9fac-716a583c90c7\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-fpdm4" Apr 22 18:26:08.707455 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:08.707424 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs9sb\" (UniqueName: \"kubernetes.io/projected/b58d6efa-5e8f-4c92-9fac-716a583c90c7-kube-api-access-rs9sb\") pod \"perf-node-gather-daemonset-fpdm4\" (UID: \"b58d6efa-5e8f-4c92-9fac-716a583c90c7\") " pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-fpdm4" Apr 22 18:26:08.786374 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:08.786254 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-5q55v_2e9d320b-195a-45d5-8b33-1c038e52b361/download-server/0.log" Apr 22 18:26:08.807384 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:08.807361 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-fpdm4" Apr 22 18:26:08.930934 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:08.930903 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5v9l6/perf-node-gather-daemonset-fpdm4"] Apr 22 18:26:08.934433 ip-10-0-128-248 kubenswrapper[2578]: W0422 18:26:08.934404 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb58d6efa_5e8f_4c92_9fac_716a583c90c7.slice/crio-64a8d019019d16f94e6016dcf47a2efb2b8465fbbfe2ca6557c0bd5f8ea82f09 WatchSource:0}: Error finding container 64a8d019019d16f94e6016dcf47a2efb2b8465fbbfe2ca6557c0bd5f8ea82f09: Status 404 returned error can't find the container with id 64a8d019019d16f94e6016dcf47a2efb2b8465fbbfe2ca6557c0bd5f8ea82f09 Apr 22 18:26:08.935884 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:08.935868 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:26:09.734589 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:09.734558 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-fpdm4" event={"ID":"b58d6efa-5e8f-4c92-9fac-716a583c90c7","Type":"ContainerStarted","Data":"d6f02a6c41b99dd9e660b2e03474447ef9d2d3e4a4cff59b863c8bd578ac572d"} Apr 22 18:26:09.734589 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:09.734592 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-fpdm4" event={"ID":"b58d6efa-5e8f-4c92-9fac-716a583c90c7","Type":"ContainerStarted","Data":"64a8d019019d16f94e6016dcf47a2efb2b8465fbbfe2ca6557c0bd5f8ea82f09"} Apr 22 18:26:09.735008 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:09.734690 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-fpdm4" Apr 22 18:26:09.752616 ip-10-0-128-248 
kubenswrapper[2578]: I0422 18:26:09.752557 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-fpdm4" podStartSLOduration=1.7525405059999999 podStartE2EDuration="1.752540506s" podCreationTimestamp="2026-04-22 18:26:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:26:09.751543306 +0000 UTC m=+327.786417412" watchObservedRunningTime="2026-04-22 18:26:09.752540506 +0000 UTC m=+327.787414630" Apr 22 18:26:09.843359 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:09.843332 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-947mz_52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1/dns/0.log" Apr 22 18:26:09.866958 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:09.866934 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-947mz_52d7d6c3-fe6a-45b8-8b04-1cbb6c3f98b1/kube-rbac-proxy/0.log" Apr 22 18:26:09.962107 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:09.962078 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-q979r_8468d537-923d-4e8a-a4bc-66ddc1d8ca1d/dns-node-resolver/0.log" Apr 22 18:26:10.414656 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:10.414627 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ll75d_bf66aaa8-f33b-4fa8-8d81-ede023dd01d9/node-ca/0.log" Apr 22 18:26:11.461220 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:11.461190 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-fhk6g_d7bb5cb6-67f2-4c61-ad95-b31a5972059f/serve-healthcheck-canary/0.log" Apr 22 18:26:11.925819 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:11.925790 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-pxhs6_19276b6f-c2d1-44bd-8496-af548e90bab2/kube-rbac-proxy/0.log" Apr 22 18:26:11.950841 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:11.950816 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-pxhs6_19276b6f-c2d1-44bd-8496-af548e90bab2/exporter/0.log" Apr 22 18:26:11.974931 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:11.974906 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-pxhs6_19276b6f-c2d1-44bd-8496-af548e90bab2/extractor/0.log" Apr 22 18:26:15.746963 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:15.746935 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-5v9l6/perf-node-gather-daemonset-fpdm4" Apr 22 18:26:17.176216 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:17.176189 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fr8p8_94e54bd0-5757-4951-b6b5-0ae58070a297/kube-multus-additional-cni-plugins/0.log" Apr 22 18:26:17.201471 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:17.201446 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fr8p8_94e54bd0-5757-4951-b6b5-0ae58070a297/egress-router-binary-copy/0.log" Apr 22 18:26:17.228111 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:17.228085 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fr8p8_94e54bd0-5757-4951-b6b5-0ae58070a297/cni-plugins/0.log" Apr 22 18:26:17.252899 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:17.252880 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fr8p8_94e54bd0-5757-4951-b6b5-0ae58070a297/bond-cni-plugin/0.log" Apr 22 18:26:17.276451 ip-10-0-128-248 kubenswrapper[2578]: I0422 
18:26:17.276427 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fr8p8_94e54bd0-5757-4951-b6b5-0ae58070a297/routeoverride-cni/0.log" Apr 22 18:26:17.300554 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:17.300535 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fr8p8_94e54bd0-5757-4951-b6b5-0ae58070a297/whereabouts-cni-bincopy/0.log" Apr 22 18:26:17.323627 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:17.323607 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fr8p8_94e54bd0-5757-4951-b6b5-0ae58070a297/whereabouts-cni/0.log" Apr 22 18:26:17.688819 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:17.688773 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dthlh_bc302022-8ada-4b98-9ede-97c644115fcc/kube-multus/0.log" Apr 22 18:26:17.876177 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:17.876152 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-zx8mc_7d04757d-f880-4389-bdfa-265c70b5d789/network-metrics-daemon/0.log" Apr 22 18:26:17.900007 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:17.899966 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-zx8mc_7d04757d-f880-4389-bdfa-265c70b5d789/kube-rbac-proxy/0.log" Apr 22 18:26:19.055763 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:19.055739 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fnkv4_65f22fed-3bb7-44e7-912c-3b5733a79f43/ovn-controller/0.log" Apr 22 18:26:19.093172 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:19.093148 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fnkv4_65f22fed-3bb7-44e7-912c-3b5733a79f43/ovn-acl-logging/0.log" Apr 22 18:26:19.129905 
ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:19.129864 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fnkv4_65f22fed-3bb7-44e7-912c-3b5733a79f43/kube-rbac-proxy-node/0.log" Apr 22 18:26:19.158695 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:19.158668 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fnkv4_65f22fed-3bb7-44e7-912c-3b5733a79f43/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 18:26:19.185126 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:19.185099 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fnkv4_65f22fed-3bb7-44e7-912c-3b5733a79f43/northd/0.log" Apr 22 18:26:19.213562 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:19.213519 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fnkv4_65f22fed-3bb7-44e7-912c-3b5733a79f43/nbdb/0.log" Apr 22 18:26:19.240105 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:19.240080 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fnkv4_65f22fed-3bb7-44e7-912c-3b5733a79f43/sbdb/0.log" Apr 22 18:26:19.335708 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:19.335616 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fnkv4_65f22fed-3bb7-44e7-912c-3b5733a79f43/ovnkube-controller/0.log" Apr 22 18:26:20.795925 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:20.795898 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-jftnj_5c9f6b6a-f8b1-47bc-b5a6-effd27fe2e81/network-check-target-container/0.log" Apr 22 18:26:21.704739 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:21.704709 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-network-operator_iptables-alerter-v7r7x_9815cd85-4c2f-43b2-97a8-f65c3f26db10/iptables-alerter/0.log" Apr 22 18:26:22.437638 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:22.437615 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-9995t_41b3e8f6-4f69-4847-9f45-4c588f6bfd45/tuned/0.log" Apr 22 18:26:25.885000 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:25.884968 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-pg5qk_5fecef3b-c5b2-49d1-8d71-e96e5433fb1a/csi-driver/0.log" Apr 22 18:26:25.907673 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:25.907637 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-pg5qk_5fecef3b-c5b2-49d1-8d71-e96e5433fb1a/csi-node-driver-registrar/0.log" Apr 22 18:26:25.931486 ip-10-0-128-248 kubenswrapper[2578]: I0422 18:26:25.931452 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-pg5qk_5fecef3b-c5b2-49d1-8d71-e96e5433fb1a/csi-liveness-probe/0.log"