Apr 22 17:34:22.751091 ip-10-0-130-38 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 17:34:23.173523 ip-10-0-130-38 kubenswrapper[2565]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:34:23.173523 ip-10-0-130-38 kubenswrapper[2565]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 17:34:23.173523 ip-10-0-130-38 kubenswrapper[2565]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:34:23.173523 ip-10-0-130-38 kubenswrapper[2565]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 17:34:23.173523 ip-10-0-130-38 kubenswrapper[2565]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
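The deprecation warnings above all point at the same remedy: set these parameters in the KubeletConfiguration file named by --config instead of as flags. A minimal sketch of that migration, assuming the kubelet.config.k8s.io/v1beta1 schema; the runtime endpoint matches the FLAG dump in this log, while the volumePluginDir and systemReserved values are illustrative placeholders, not values taken from this node:

```yaml
# Hypothetical KubeletConfiguration fragment replacing the deprecated flags.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock   # was --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec  # was --volume-plugin-dir (assumed path)
systemReserved:                                              # was --system-reserved (example values)
  cpu: 500m
  memory: 1Gi
```

On an OpenShift node like this one, the kubelet config is typically managed by the Machine Config Operator rather than edited by hand, so this fragment is a sketch of the mapping, not a recommended manual change.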
Apr 22 17:34:23.177434 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.177348 2565 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 17:34:23.182893 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182878 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:34:23.182893 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182893 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:34:23.182969 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182897 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:34:23.182969 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182901 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:34:23.182969 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182904 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:34:23.182969 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182907 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:34:23.182969 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182912 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:34:23.182969 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182916 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:34:23.182969 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182919 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:34:23.182969 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182923 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:34:23.182969 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182925 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:34:23.182969 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182928 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:34:23.182969 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182931 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:34:23.182969 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182933 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:34:23.182969 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182936 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:34:23.182969 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182939 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:34:23.182969 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182942 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:34:23.182969 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182944 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:34:23.182969 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182947 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:34:23.182969 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182949 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:34:23.182969 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182952 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:34:23.183432 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182955 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:34:23.183432 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182958 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:34:23.183432 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182960 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:34:23.183432 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182964 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:34:23.183432 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182966 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:34:23.183432 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182969 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:34:23.183432 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182973 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:34:23.183432 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182976 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:34:23.183432 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182978 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:34:23.183432 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182981 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:34:23.183432 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182984 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:34:23.183432 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182987 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:34:23.183432 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182989 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:34:23.183432 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182992 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:34:23.183432 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182994 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:34:23.183432 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182997 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:34:23.183432 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.182999 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:34:23.183432 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183002 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:34:23.183432 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183004 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:34:23.183432 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183007 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:34:23.183944 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183009 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:34:23.183944 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183011 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:34:23.183944 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183014 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:34:23.183944 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183016 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:34:23.183944 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183019 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:34:23.183944 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183021 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:34:23.183944 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183024 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:34:23.183944 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183027 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:34:23.183944 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183030 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:34:23.183944 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183032 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:34:23.183944 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183034 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:34:23.183944 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183037 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:34:23.183944 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183040 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:34:23.183944 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183044 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:34:23.183944 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183046 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:34:23.183944 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183049 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:34:23.183944 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183052 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:34:23.183944 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183054 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:34:23.183944 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183057 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:34:23.183944 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183060 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:34:23.184445 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183062 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:34:23.184445 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183065 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:34:23.184445 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183067 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:34:23.184445 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183070 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:34:23.184445 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183074 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:34:23.184445 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183077 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:34:23.184445 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183079 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:34:23.184445 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183082 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:34:23.184445 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183086 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:34:23.184445 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183090 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:34:23.184445 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183093 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:34:23.184445 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183095 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:34:23.184445 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183098 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:34:23.184445 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183101 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:34:23.184445 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183103 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:34:23.184445 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183106 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:34:23.184445 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183108 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:34:23.184445 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183111 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:34:23.184445 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183113 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:34:23.184924 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183116 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:34:23.184924 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183118 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:34:23.184924 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183121 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:34:23.184924 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183123 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:34:23.184924 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183126 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:34:23.184924 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183129 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:34:23.184924 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183693 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:34:23.184924 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183701 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:34:23.184924 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183704 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:34:23.184924 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183706 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:34:23.184924 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183709 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:34:23.184924 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183712 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:34:23.184924 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183715 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:34:23.184924 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183717 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:34:23.184924 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183720 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:34:23.184924 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183722 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:34:23.184924 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183725 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:34:23.184924 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183727 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:34:23.184924 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183730 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:34:23.184924 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183733 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:34:23.185406 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183735 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:34:23.185406 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183738 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:34:23.185406 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183740 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:34:23.185406 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183743 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:34:23.185406 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183745 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:34:23.185406 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183748 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:34:23.185406 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183751 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:34:23.185406 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183753 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:34:23.185406 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183756 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:34:23.185406 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183758 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:34:23.185406 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183760 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:34:23.185406 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183763 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:34:23.185406 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183765 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:34:23.185406 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183768 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:34:23.185406 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183770 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:34:23.185406 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183773 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:34:23.185406 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183776 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:34:23.185406 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183778 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:34:23.185406 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183781 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:34:23.185406 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183784 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:34:23.186068 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183787 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:34:23.186068 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183790 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:34:23.186068 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183792 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:34:23.186068 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183795 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:34:23.186068 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183810 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:34:23.186068 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183813 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:34:23.186068 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183816 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:34:23.186068 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183819 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:34:23.186068 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183821 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:34:23.186068 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183824 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:34:23.186068 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183827 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:34:23.186068 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183829 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:34:23.186068 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183832 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:34:23.186068 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183834 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:34:23.186068 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183837 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:34:23.186068 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183839 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:34:23.186068 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183842 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:34:23.186068 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183847 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:34:23.186068 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183851 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:34:23.186068 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183854 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:34:23.186870 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183857 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:34:23.186870 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183860 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:34:23.186870 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183863 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:34:23.186870 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183866 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:34:23.186870 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183869 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:34:23.186870 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183872 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:34:23.186870 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183874 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:34:23.186870 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183877 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:34:23.186870 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183880 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:34:23.186870 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183883 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:34:23.186870 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183887 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:34:23.186870 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183889 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:34:23.186870 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183893 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:34:23.186870 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183895 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:34:23.186870 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183898 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:34:23.186870 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183900 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:34:23.186870 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183904 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:34:23.186870 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183907 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:34:23.186870 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183909 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:34:23.186870 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183912 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:34:23.187385 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183914 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:34:23.187385 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183917 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:34:23.187385 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183919 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:34:23.187385 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183922 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:34:23.187385 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183925 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:34:23.187385 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183927 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:34:23.187385 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183930 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:34:23.187385 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183933 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:34:23.187385 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183935 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:34:23.187385 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183938 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:34:23.187385 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183940 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:34:23.187385 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.183943 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:34:23.187385 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185128 2565 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 17:34:23.187385 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185136 2565 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 17:34:23.187385 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185143 2565 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 17:34:23.187385 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185148 2565 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 17:34:23.187385 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185153 2565 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 17:34:23.187385 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185156 2565 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 17:34:23.187385 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185160 2565 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 17:34:23.187385 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185165 2565 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 17:34:23.187385 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185168 2565 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 17:34:23.187913 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185172 2565 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 17:34:23.187913 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185176 2565 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 17:34:23.187913 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185179 2565 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 17:34:23.187913 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185182 2565 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 17:34:23.187913 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185186 2565 flags.go:64] FLAG: --cgroup-root=""
Apr 22 17:34:23.187913 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185189 2565 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 17:34:23.187913 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185192 2565 flags.go:64] FLAG: --client-ca-file=""
Apr 22 17:34:23.187913 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185195 2565 flags.go:64] FLAG: --cloud-config=""
Apr 22 17:34:23.187913 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185198 2565 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 17:34:23.187913 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185201 2565 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 17:34:23.187913 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185205 2565 flags.go:64] FLAG: --cluster-domain=""
Apr 22 17:34:23.187913 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185208 2565 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 17:34:23.187913 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185211 2565 flags.go:64] FLAG: --config-dir=""
Apr 22 17:34:23.187913 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185214 2565 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 17:34:23.187913 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185217 2565 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 17:34:23.187913 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185221 2565 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 17:34:23.187913 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185224 2565 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 17:34:23.187913 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185227 2565 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 17:34:23.187913 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185231 2565 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 17:34:23.187913 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185234 2565 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 17:34:23.187913 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185237 2565 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 17:34:23.187913 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185240 2565 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 17:34:23.187913 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185243 2565 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 17:34:23.187913 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185246 2565 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 17:34:23.187913 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185251 2565 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 17:34:23.188522 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185254 2565 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 17:34:23.188522 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185257 2565 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 17:34:23.188522 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185259 2565 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 17:34:23.188522 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185262 2565 flags.go:64] FLAG: --enable-server="true"
Apr 22 17:34:23.188522 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185265 2565 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 17:34:23.188522 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185269 2565 flags.go:64] FLAG: --event-burst="100"
Apr 22 17:34:23.188522 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185272 2565 flags.go:64] FLAG: --event-qps="50"
Apr 22 17:34:23.188522 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185276 2565 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 17:34:23.188522 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185279 2565 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 17:34:23.188522 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185282 2565 flags.go:64] FLAG: --eviction-hard=""
Apr 22 17:34:23.188522 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185285 2565 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 17:34:23.188522 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185288 2565 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 17:34:23.188522 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185296 2565 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 17:34:23.188522 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185299 2565 flags.go:64] FLAG: --eviction-soft=""
Apr 22 17:34:23.188522 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185302 2565 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 17:34:23.188522 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185305 2565 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 17:34:23.188522 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185308 2565 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 17:34:23.188522 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185311 2565 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 17:34:23.188522 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185314 2565 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 17:34:23.188522 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185317 2565 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 17:34:23.188522 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185319 2565 flags.go:64]
FLAG: --feature-gates="" Apr 22 17:34:23.188522 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185323 2565 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 17:34:23.188522 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185326 2565 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 17:34:23.188522 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185329 2565 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 17:34:23.188522 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185332 2565 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 17:34:23.189144 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185335 2565 flags.go:64] FLAG: --healthz-port="10248" Apr 22 17:34:23.189144 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185338 2565 flags.go:64] FLAG: --help="false" Apr 22 17:34:23.189144 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185340 2565 flags.go:64] FLAG: --hostname-override="ip-10-0-130-38.ec2.internal" Apr 22 17:34:23.189144 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185343 2565 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 17:34:23.189144 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185346 2565 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 17:34:23.189144 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185349 2565 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 17:34:23.189144 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185353 2565 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 17:34:23.189144 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185356 2565 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 17:34:23.189144 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185358 2565 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 17:34:23.189144 ip-10-0-130-38 kubenswrapper[2565]: I0422 
17:34:23.185361 2565 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 17:34:23.189144 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185364 2565 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 17:34:23.189144 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185366 2565 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 17:34:23.189144 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185369 2565 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 17:34:23.189144 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185372 2565 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 17:34:23.189144 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185376 2565 flags.go:64] FLAG: --kube-reserved="" Apr 22 17:34:23.189144 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185379 2565 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 17:34:23.189144 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185382 2565 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 17:34:23.189144 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185385 2565 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 17:34:23.189144 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185388 2565 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 17:34:23.189144 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185391 2565 flags.go:64] FLAG: --lock-file="" Apr 22 17:34:23.189144 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185394 2565 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 17:34:23.189144 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185397 2565 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 17:34:23.189144 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185400 2565 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 17:34:23.189144 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185406 2565 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 17:34:23.189718 
ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185409 2565 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 17:34:23.189718 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185412 2565 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 17:34:23.189718 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185415 2565 flags.go:64] FLAG: --logging-format="text" Apr 22 17:34:23.189718 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185418 2565 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 17:34:23.189718 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185421 2565 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 17:34:23.189718 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185424 2565 flags.go:64] FLAG: --manifest-url="" Apr 22 17:34:23.189718 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185426 2565 flags.go:64] FLAG: --manifest-url-header="" Apr 22 17:34:23.189718 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185431 2565 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 17:34:23.189718 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185434 2565 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 17:34:23.189718 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185438 2565 flags.go:64] FLAG: --max-pods="110" Apr 22 17:34:23.189718 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185441 2565 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 17:34:23.189718 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185444 2565 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 17:34:23.189718 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185447 2565 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 17:34:23.189718 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185451 2565 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 17:34:23.189718 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185454 2565 flags.go:64] 
FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 17:34:23.189718 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185457 2565 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 17:34:23.189718 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185459 2565 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 17:34:23.189718 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185467 2565 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 17:34:23.189718 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185470 2565 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 17:34:23.189718 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185474 2565 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 17:34:23.189718 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185477 2565 flags.go:64] FLAG: --pod-cidr="" Apr 22 17:34:23.189718 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185480 2565 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 17:34:23.189718 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185486 2565 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 17:34:23.190291 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185489 2565 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 17:34:23.190291 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185492 2565 flags.go:64] FLAG: --pods-per-core="0" Apr 22 17:34:23.190291 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185495 2565 flags.go:64] FLAG: --port="10250" Apr 22 17:34:23.190291 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185498 2565 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 17:34:23.190291 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185501 2565 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-005e76b322ce6f1de" Apr 22 17:34:23.190291 ip-10-0-130-38 kubenswrapper[2565]: I0422 
17:34:23.185504 2565 flags.go:64] FLAG: --qos-reserved="" Apr 22 17:34:23.190291 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185508 2565 flags.go:64] FLAG: --read-only-port="10255" Apr 22 17:34:23.190291 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185511 2565 flags.go:64] FLAG: --register-node="true" Apr 22 17:34:23.190291 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185514 2565 flags.go:64] FLAG: --register-schedulable="true" Apr 22 17:34:23.190291 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185517 2565 flags.go:64] FLAG: --register-with-taints="" Apr 22 17:34:23.190291 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185521 2565 flags.go:64] FLAG: --registry-burst="10" Apr 22 17:34:23.190291 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185524 2565 flags.go:64] FLAG: --registry-qps="5" Apr 22 17:34:23.190291 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185527 2565 flags.go:64] FLAG: --reserved-cpus="" Apr 22 17:34:23.190291 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185530 2565 flags.go:64] FLAG: --reserved-memory="" Apr 22 17:34:23.190291 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185533 2565 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 17:34:23.190291 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185536 2565 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 17:34:23.190291 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185539 2565 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 17:34:23.190291 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185542 2565 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 17:34:23.190291 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185545 2565 flags.go:64] FLAG: --runonce="false" Apr 22 17:34:23.190291 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185547 2565 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 17:34:23.190291 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185550 2565 
flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 17:34:23.190291 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185553 2565 flags.go:64] FLAG: --seccomp-default="false" Apr 22 17:34:23.190291 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185556 2565 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 17:34:23.190291 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185559 2565 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 17:34:23.190291 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185562 2565 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 17:34:23.190291 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185565 2565 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 17:34:23.190927 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185569 2565 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 17:34:23.190927 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185572 2565 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 17:34:23.190927 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185575 2565 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 17:34:23.190927 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185578 2565 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 17:34:23.190927 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185580 2565 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 17:34:23.190927 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185583 2565 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 17:34:23.190927 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185586 2565 flags.go:64] FLAG: --system-cgroups="" Apr 22 17:34:23.190927 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185589 2565 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 17:34:23.190927 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185595 2565 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 
17:34:23.190927 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185597 2565 flags.go:64] FLAG: --tls-cert-file="" Apr 22 17:34:23.190927 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185600 2565 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 17:34:23.190927 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185604 2565 flags.go:64] FLAG: --tls-min-version="" Apr 22 17:34:23.190927 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185607 2565 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 17:34:23.190927 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185611 2565 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 17:34:23.190927 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185613 2565 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 17:34:23.190927 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185617 2565 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 17:34:23.190927 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185619 2565 flags.go:64] FLAG: --v="2" Apr 22 17:34:23.190927 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185624 2565 flags.go:64] FLAG: --version="false" Apr 22 17:34:23.190927 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185632 2565 flags.go:64] FLAG: --vmodule="" Apr 22 17:34:23.190927 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185637 2565 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 17:34:23.190927 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.185640 2565 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 17:34:23.190927 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186091 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 17:34:23.190927 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186112 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 17:34:23.190927 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186120 2565 
feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 17:34:23.190927 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186128 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 17:34:23.191545 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186135 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 17:34:23.191545 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186142 2565 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 17:34:23.191545 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186150 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 17:34:23.191545 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186157 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 17:34:23.191545 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186164 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 17:34:23.191545 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186171 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 17:34:23.191545 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186178 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 17:34:23.191545 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186185 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 17:34:23.191545 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186193 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 17:34:23.191545 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186206 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 17:34:23.191545 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186220 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 17:34:23.191545 ip-10-0-130-38 kubenswrapper[2565]: W0422 
17:34:23.186228 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 17:34:23.191545 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186236 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 17:34:23.191545 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186243 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 17:34:23.191545 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186252 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 17:34:23.191545 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186259 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 17:34:23.191545 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186266 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 17:34:23.191545 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186273 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 17:34:23.191545 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186280 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 17:34:23.191545 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186287 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 17:34:23.192137 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186292 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 17:34:23.192137 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186302 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 17:34:23.192137 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186306 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 17:34:23.192137 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186310 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 17:34:23.192137 ip-10-0-130-38 
kubenswrapper[2565]: W0422 17:34:23.186315 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 17:34:23.192137 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186319 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 17:34:23.192137 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186323 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 17:34:23.192137 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186327 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 17:34:23.192137 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186331 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 17:34:23.192137 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186336 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 17:34:23.192137 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186340 2565 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 17:34:23.192137 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186344 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 17:34:23.192137 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186348 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 17:34:23.192137 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186352 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 17:34:23.192137 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186361 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 17:34:23.192137 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186365 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 17:34:23.192137 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186370 2565 feature_gate.go:328] unrecognized feature gate: 
AWSClusterHostedDNS Apr 22 17:34:23.192137 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186374 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 17:34:23.192137 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186378 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 17:34:23.192137 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186383 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 17:34:23.192658 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186387 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 17:34:23.192658 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186391 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 17:34:23.192658 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186398 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 17:34:23.192658 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186402 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 17:34:23.192658 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186406 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 17:34:23.192658 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186410 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 17:34:23.192658 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186415 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 17:34:23.192658 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186423 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 17:34:23.192658 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186428 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 17:34:23.192658 ip-10-0-130-38 kubenswrapper[2565]: W0422 
17:34:23.186432 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 17:34:23.192658 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186439 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 17:34:23.192658 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186444 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 17:34:23.192658 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186449 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 17:34:23.192658 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186454 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 17:34:23.192658 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186459 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 17:34:23.192658 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186464 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 17:34:23.192658 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186468 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 17:34:23.192658 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186472 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 17:34:23.192658 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186477 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 17:34:23.193140 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186487 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 17:34:23.193140 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186491 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 17:34:23.193140 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186497 2565 feature_gate.go:328] unrecognized feature 
gate: ClusterAPIInstall Apr 22 17:34:23.193140 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186501 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 17:34:23.193140 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186505 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 17:34:23.193140 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186509 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 17:34:23.193140 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186513 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 17:34:23.193140 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186518 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 17:34:23.193140 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186522 2565 feature_gate.go:328] unrecognized feature gate: Example Apr 22 17:34:23.193140 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186526 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 17:34:23.193140 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186530 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 17:34:23.193140 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186534 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 17:34:23.193140 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186540 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 17:34:23.193140 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186548 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 17:34:23.193140 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186552 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 17:34:23.193140 ip-10-0-130-38 
kubenswrapper[2565]: W0422 17:34:23.186558 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:34:23.193140 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186564 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:34:23.193140 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186568 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:34:23.193140 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186572 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:34:23.193609 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186578 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:34:23.193609 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186584 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:34:23.193609 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186587 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:34:23.193609 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.186592 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:34:23.193609 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.186599 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 17:34:23.193891 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.193770 2565 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 17:34:23.193929 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.193892 2565 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 17:34:23.193961 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.193941 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:34:23.193961 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.193947 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:34:23.193961 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.193950 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:34:23.193961 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.193954 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:34:23.193961 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.193957 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:34:23.193961 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.193960 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:34:23.193961 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.193963 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:34:23.194139 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.193966 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:34:23.194139 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.193969 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:34:23.194139 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.193972 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:34:23.194139 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.193975 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:34:23.194139 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.193978 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:34:23.194139 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.193980 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:34:23.194139 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.193983 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:34:23.194139 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.193986 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:34:23.194139 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.193989 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:34:23.194139 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.193991 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:34:23.194139 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.193994 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:34:23.194139 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.193996 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:34:23.194139 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.193999 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:34:23.194139 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194001 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:34:23.194139 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194004 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:34:23.194139 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194007 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:34:23.194139 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194009 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:34:23.194139 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194012 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:34:23.194139 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194014 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:34:23.194139 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194017 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:34:23.194615 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194019 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:34:23.194615 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194022 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:34:23.194615 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194025 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:34:23.194615 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194028 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:34:23.194615 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194031 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:34:23.194615 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194034 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:34:23.194615 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194036 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:34:23.194615 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194039 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:34:23.194615 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194041 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:34:23.194615 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194044 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:34:23.194615 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194046 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:34:23.194615 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194049 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:34:23.194615 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194051 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:34:23.194615 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194055 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:34:23.194615 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194058 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:34:23.194615 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194061 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:34:23.194615 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194063 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:34:23.194615 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194066 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:34:23.194615 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194068 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:34:23.194615 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194072 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:34:23.195110 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194076 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:34:23.195110 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194079 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:34:23.195110 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194081 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:34:23.195110 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194084 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:34:23.195110 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194086 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:34:23.195110 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194089 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:34:23.195110 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194091 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:34:23.195110 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194094 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:34:23.195110 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194097 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:34:23.195110 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194099 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:34:23.195110 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194102 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:34:23.195110 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194104 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:34:23.195110 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194107 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:34:23.195110 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194109 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:34:23.195110 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194112 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:34:23.195110 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194114 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:34:23.195110 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194117 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:34:23.195110 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194120 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:34:23.195110 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194122 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:34:23.195110 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194125 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:34:23.195614 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194127 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:34:23.195614 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194130 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:34:23.195614 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194132 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:34:23.195614 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194135 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:34:23.195614 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194137 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:34:23.195614 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194141 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:34:23.195614 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194144 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:34:23.195614 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194147 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:34:23.195614 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194150 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:34:23.195614 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194153 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:34:23.195614 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194156 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:34:23.195614 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194160 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:34:23.195614 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194164 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:34:23.195614 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194167 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:34:23.195614 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194170 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:34:23.195614 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194173 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:34:23.195614 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194177 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:34:23.195614 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194180 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:34:23.195614 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194182 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:34:23.196132 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.194188 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 17:34:23.196132 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194285 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:34:23.196132 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194290 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:34:23.196132 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194292 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:34:23.196132 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194295 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:34:23.196132 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194298 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:34:23.196132 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194300 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:34:23.196132 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194303 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:34:23.196132 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194305 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:34:23.196132 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194308 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:34:23.196132 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194311 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:34:23.196132 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194314 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:34:23.196132 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194317 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:34:23.196132 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194319 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:34:23.196132 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194321 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:34:23.196515 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194325 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:34:23.196515 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194327 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:34:23.196515 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194330 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:34:23.196515 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194333 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:34:23.196515 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194337 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:34:23.196515 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194339 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:34:23.196515 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194342 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:34:23.196515 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194344 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:34:23.196515 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194347 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:34:23.196515 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194350 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:34:23.196515 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194352 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:34:23.196515 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194355 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:34:23.196515 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194358 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:34:23.196515 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194360 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:34:23.196515 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194363 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:34:23.196515 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194365 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:34:23.196515 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194368 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:34:23.196515 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194370 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:34:23.196515 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194373 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:34:23.196515 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194376 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:34:23.197021 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194378 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:34:23.197021 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194381 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:34:23.197021 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194383 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:34:23.197021 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194386 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:34:23.197021 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194388 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:34:23.197021 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194391 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:34:23.197021 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194393 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:34:23.197021 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194396 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:34:23.197021 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194398 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:34:23.197021 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194401 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:34:23.197021 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194403 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:34:23.197021 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194406 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:34:23.197021 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194408 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:34:23.197021 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194411 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:34:23.197021 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194413 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:34:23.197021 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194416 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:34:23.197021 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194419 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:34:23.197021 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194422 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:34:23.197021 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194424 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:34:23.197021 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194427 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:34:23.197512 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194429 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:34:23.197512 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194432 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:34:23.197512 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194435 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:34:23.197512 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194439 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:34:23.197512 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194441 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:34:23.197512 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194444 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:34:23.197512 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194447 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:34:23.197512 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194449 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:34:23.197512 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194452 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:34:23.197512 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194454 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:34:23.197512 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194457 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:34:23.197512 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194459 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:34:23.197512 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194463 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:34:23.197512 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194465 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:34:23.197512 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194468 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:34:23.197512 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194470 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:34:23.197512 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194473 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:34:23.197512 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194475 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:34:23.197512 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194479 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:34:23.198053 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194482 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:34:23.198053 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194485 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:34:23.198053 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194488 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:34:23.198053 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194491 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:34:23.198053 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194494 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:34:23.198053 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194497 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:34:23.198053 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194500 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:34:23.198053 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194503 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:34:23.198053 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194506 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:34:23.198053 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194508 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:34:23.198053 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194511 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:34:23.198053 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194514 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:34:23.198053 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:23.194516 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:34:23.198053 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.194521 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 17:34:23.198053 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.195225 2565 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 17:34:23.198422 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.198265 2565 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 17:34:23.199285 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.199274 2565 server.go:1019] "Starting client certificate rotation"
Apr 22 17:34:23.199383 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.199368 2565 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 17:34:23.199414 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.199400 2565 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 17:34:23.224397 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.224379 2565 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 17:34:23.227759 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.227742 2565 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 17:34:23.245182 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.245156 2565 log.go:25] "Validated CRI v1 runtime API"
Apr 22 17:34:23.250956 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.250939 2565 log.go:25] "Validated CRI v1 image API"
Apr 22 17:34:23.252198 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.252181 2565 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 17:34:23.256690 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.256673 2565 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 17:34:23.257282 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.257264 2565 fs.go:135] Filesystem UUIDs: map[46a8219e-8a62-4f2e-bff6-cb7b0cd4a396:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 a84c3bab-679c-4bd4-bd3b-53101f0ef9de:/dev/nvme0n1p4]
Apr 22 17:34:23.257335 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.257282 2565 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 17:34:23.263547 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.263430 2565 manager.go:217] Machine: {Timestamp:2026-04-22 17:34:23.261580782 +0000 UTC m=+0.398384150 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3089403 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2bb36c93d01a7e86e72ead92280e1c SystemUUID:ec2bb36c-93d0-1a7e-86e7-2ead92280e1c BootID:e69d9794-bcec-4eb5-b661-63a214d526c8 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:1c:ba:f8:6e:99 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:1c:ba:f8:6e:99 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:a6:93:30:94:f2:8b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 17:34:23.263547 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.263540 2565 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 17:34:23.263675 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.263663 2565 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 17:34:23.266410 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.266385 2565 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 17:34:23.266551 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.266414 2565 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-38.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 17:34:23.266598 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.266559 2565 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 17:34:23.266598 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.266568 2565 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 17:34:23.266598 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.266581 2565 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 17:34:23.267308 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.267297 2565 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 17:34:23.268638 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.268628 2565 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 17:34:23.268747 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.268738 2565 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 22 17:34:23.271233 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.271224 2565 kubelet.go:491] "Attempting to sync node with API server"
Apr 22 17:34:23.271272 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.271237 2565 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 22 17:34:23.271272 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.271250 2565 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 22 17:34:23.271272 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.271258 2565 kubelet.go:397] "Adding apiserver pod source"
Apr 22 17:34:23.271272 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.271266 2565 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 22 17:34:23.272551 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.272539 2565 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 17:34:23.272598 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.272556 2565 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 17:34:23.275558 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.275531 2565 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 22 17:34:23.277478 ip-10-0-130-38
kubenswrapper[2565]: I0422 17:34:23.277446 2565 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-r6q6r" Apr 22 17:34:23.277685 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.277669 2565 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 17:34:23.279532 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.279509 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 17:34:23.279602 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.279561 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 17:34:23.279602 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.279569 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 17:34:23.279602 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.279574 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 17:34:23.279602 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.279580 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 17:34:23.279602 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.279586 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 17:34:23.279602 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.279592 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 17:34:23.279602 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.279598 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 17:34:23.279602 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.279604 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 17:34:23.279826 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.279610 2565 plugins.go:616] "Loaded volume 
plugin" pluginName="kubernetes.io/configmap" Apr 22 17:34:23.279826 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.279628 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 17:34:23.279826 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.279637 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 17:34:23.280659 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.280649 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 17:34:23.280659 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.280659 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 17:34:23.284252 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.284235 2565 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-38.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 17:34:23.284405 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.284387 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-38.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 17:34:23.284517 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.284502 2565 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 17:34:23.284583 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.284537 2565 server.go:1295] "Started kubelet" Apr 22 17:34:23.284898 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.284476 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 17:34:23.285042 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.285018 2565 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 17:34:23.285094 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.285008 2565 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 17:34:23.285094 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.285079 2565 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 17:34:23.285430 ip-10-0-130-38 systemd[1]: Started Kubernetes Kubelet. Apr 22 17:34:23.286313 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.286234 2565 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 17:34:23.289297 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.289281 2565 server.go:317] "Adding debug handlers to kubelet server" Apr 22 17:34:23.293532 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.293513 2565 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 17:34:23.293923 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.293906 2565 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 17:34:23.295123 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.295102 2565 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 17:34:23.295123 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.295125 2565 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 17:34:23.295333 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.295320 2565 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 17:34:23.295411 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.295396 2565 reconstruct.go:97] "Volume reconstruction finished" Apr 22 17:34:23.295411 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.295410 
2565 reconciler.go:26] "Reconciler: start to sync state" Apr 22 17:34:23.295627 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.295608 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-38.ec2.internal\" not found" Apr 22 17:34:23.295814 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.295784 2565 factory.go:55] Registering systemd factory Apr 22 17:34:23.295899 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.295832 2565 factory.go:223] Registration of the systemd container factory successfully Apr 22 17:34:23.295963 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.294962 2565 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-38.ec2.internal.18a8be4690d4cafd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-38.ec2.internal,UID:ip-10-0-130-38.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:23.284513533 +0000 UTC m=+0.421316901,LastTimestamp:2026-04-22 17:34:23.284513533 +0000 UTC m=+0.421316901,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:23.296405 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.296370 2565 factory.go:153] Registering CRI-O factory Apr 22 17:34:23.296405 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.296388 2565 factory.go:223] Registration of the crio container factory successfully Apr 22 17:34:23.296503 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.296443 2565 factory.go:221] Registration of the containerd container factory failed: unable 
to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 17:34:23.296503 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.296467 2565 factory.go:103] Registering Raw factory Apr 22 17:34:23.296503 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.296482 2565 manager.go:1196] Started watching for new ooms in manager Apr 22 17:34:23.296715 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.296690 2565 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 17:34:23.297012 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.296995 2565 manager.go:319] Starting recovery of all containers Apr 22 17:34:23.298335 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.298308 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 17:34:23.298443 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.298384 2565 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-130-38.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 22 17:34:23.307430 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.307414 2565 manager.go:324] Recovery completed Apr 22 17:34:23.311264 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.311252 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:23.314630 ip-10-0-130-38 kubenswrapper[2565]: I0422 
17:34:23.314614 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:34:23.314709 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.314648 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:23.314709 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.314663 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:23.315150 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.315135 2565 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 17:34:23.315150 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.315148 2565 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 17:34:23.315273 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.315163 2565 state_mem.go:36] "Initialized new in-memory state store" Apr 22 17:34:23.316850 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.316768 2565 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-38.ec2.internal.18a8be4692a05830 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-38.ec2.internal,UID:ip-10-0-130-38.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-130-38.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:23.314630704 +0000 UTC m=+0.451434071,LastTimestamp:2026-04-22 17:34:23.314630704 +0000 UTC m=+0.451434071,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:23.317944 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.317931 2565 policy_none.go:49] "None policy: Start" Apr 22 17:34:23.318007 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.317948 2565 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 17:34:23.318007 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.317958 2565 state_mem.go:35] "Initializing new in-memory state store" Apr 22 17:34:23.326581 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.326519 2565 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-38.ec2.internal.18a8be4692a0b63f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-38.ec2.internal,UID:ip-10-0-130-38.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-130-38.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:23.314654783 +0000 UTC m=+0.451458151,LastTimestamp:2026-04-22 17:34:23.314654783 +0000 UTC m=+0.451458151,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:23.334050 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.333991 2565 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-38.ec2.internal.18a8be4692a0eaa7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-38.ec2.internal,UID:ip-10-0-130-38.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-130-38.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:23.314668199 +0000 UTC m=+0.451471566,LastTimestamp:2026-04-22 17:34:23.314668199 +0000 UTC m=+0.451471566,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:23.355390 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.355366 2565 manager.go:341] "Starting Device Plugin manager" Apr 22 17:34:23.367726 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.355436 2565 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 17:34:23.367726 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.355466 2565 server.go:85] "Starting device plugin registration server" Apr 22 17:34:23.367726 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.355682 2565 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 17:34:23.367726 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.355691 2565 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 17:34:23.367726 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.355765 2565 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 17:34:23.367726 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.355890 2565 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 17:34:23.367726 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.355900 2565 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 17:34:23.367726 
ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.356702 2565 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 17:34:23.367726 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.356742 2565 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-38.ec2.internal\" not found" Apr 22 17:34:23.369527 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.369465 2565 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-38.ec2.internal.18a8be46952fa2a5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-38.ec2.internal,UID:ip-10-0-130-38.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:23.357575845 +0000 UTC m=+0.494379198,LastTimestamp:2026-04-22 17:34:23.357575845 +0000 UTC m=+0.494379198,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:23.450890 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.450810 2565 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 17:34:23.452063 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.452046 2565 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 22 17:34:23.452140 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.452078 2565 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 17:34:23.452140 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.452100 2565 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 17:34:23.452140 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.452112 2565 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 17:34:23.452287 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.452211 2565 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 17:34:23.456684 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.456669 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:23.458754 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.458736 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:34:23.458844 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.458768 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:23.458844 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.458779 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:23.458844 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.458840 2565 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-38.ec2.internal" Apr 22 17:34:23.463649 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.463614 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group 
\"node.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Apr 22 17:34:23.463737 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.463626 2565 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-130-38.ec2.internal.18a8be4692a05830\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-38.ec2.internal.18a8be4692a05830 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-38.ec2.internal,UID:ip-10-0-130-38.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-130-38.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:23.314630704 +0000 UTC m=+0.451434071,LastTimestamp:2026-04-22 17:34:23.458755186 +0000 UTC m=+0.595558551,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:23.466986 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.466925 2565 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-130-38.ec2.internal.18a8be4692a0b63f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-38.ec2.internal.18a8be4692a0b63f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-38.ec2.internal,UID:ip-10-0-130-38.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-130-38.ec2.internal status is now: 
NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:23.314654783 +0000 UTC m=+0.451458151,LastTimestamp:2026-04-22 17:34:23.45877322 +0000 UTC m=+0.595576585,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:23.467083 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.467011 2565 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-130-38.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-130-38.ec2.internal" Apr 22 17:34:23.471979 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.471917 2565 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-130-38.ec2.internal.18a8be4692a0eaa7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-38.ec2.internal.18a8be4692a0eaa7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-38.ec2.internal,UID:ip-10-0-130-38.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-130-38.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:23.314668199 +0000 UTC m=+0.451471566,LastTimestamp:2026-04-22 17:34:23.458782647 +0000 UTC m=+0.595586012,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:23.501959 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.501941 2565 controller.go:145] "Failed to ensure 
lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-130-38.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="400ms" Apr 22 17:34:23.553090 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.553064 2565 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-38.ec2.internal"] Apr 22 17:34:23.553170 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.553134 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:23.554038 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.554025 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:34:23.554107 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.554051 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:23.554107 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.554061 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:23.555356 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.555344 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:23.555496 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.555482 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal" Apr 22 17:34:23.555534 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.555512 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:23.556086 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.556070 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:34:23.556164 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.556098 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:23.556164 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.556110 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:23.556164 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.556074 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:34:23.556300 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.556178 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:23.556300 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.556196 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:23.557272 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.557255 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-38.ec2.internal" Apr 22 17:34:23.557358 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.557286 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:23.557953 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.557940 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:34:23.558038 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.557965 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:23.558038 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.557980 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:23.562813 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.562732 2565 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-130-38.ec2.internal.18a8be4692a05830\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-38.ec2.internal.18a8be4692a05830 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-38.ec2.internal,UID:ip-10-0-130-38.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-130-38.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:23.314630704 +0000 UTC m=+0.451434071,LastTimestamp:2026-04-22 17:34:23.554038666 +0000 UTC m=+0.690842032,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:23.572613 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.572549 2565 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-130-38.ec2.internal.18a8be4692a0b63f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-38.ec2.internal.18a8be4692a0b63f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-38.ec2.internal,UID:ip-10-0-130-38.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-130-38.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:23.314654783 +0000 UTC m=+0.451458151,LastTimestamp:2026-04-22 17:34:23.554055743 +0000 UTC m=+0.690859108,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:23.581972 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.581912 2565 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-130-38.ec2.internal.18a8be4692a0eaa7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-38.ec2.internal.18a8be4692a0eaa7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-38.ec2.internal,UID:ip-10-0-130-38.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-130-38.ec2.internal status is now: 
NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:23.314668199 +0000 UTC m=+0.451471566,LastTimestamp:2026-04-22 17:34:23.554064849 +0000 UTC m=+0.690868215,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:23.590578 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.590513 2565 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-130-38.ec2.internal.18a8be4692a05830\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-38.ec2.internal.18a8be4692a05830 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-38.ec2.internal,UID:ip-10-0-130-38.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-130-38.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:23.314630704 +0000 UTC m=+0.451434071,LastTimestamp:2026-04-22 17:34:23.556087087 +0000 UTC m=+0.692890455,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:23.591622 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.591602 2565 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-38.ec2.internal\" not found" node="ip-10-0-130-38.ec2.internal" Apr 22 17:34:23.595966 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.595949 2565 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node 
\"ip-10-0-130-38.ec2.internal\" not found" node="ip-10-0-130-38.ec2.internal" Apr 22 17:34:23.596943 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.596930 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3aad222c1b8d9495d15f07d93f8a83ad-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal\" (UID: \"3aad222c1b8d9495d15f07d93f8a83ad\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal" Apr 22 17:34:23.597012 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.596951 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c98156a065fc898aadcfd0c780a265db-config\") pod \"kube-apiserver-proxy-ip-10-0-130-38.ec2.internal\" (UID: \"c98156a065fc898aadcfd0c780a265db\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-38.ec2.internal" Apr 22 17:34:23.597012 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.596968 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3aad222c1b8d9495d15f07d93f8a83ad-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal\" (UID: \"3aad222c1b8d9495d15f07d93f8a83ad\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal" Apr 22 17:34:23.603116 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.603053 2565 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-130-38.ec2.internal.18a8be4692a0b63f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-38.ec2.internal.18a8be4692a0b63f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-38.ec2.internal,UID:ip-10-0-130-38.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-130-38.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:23.314654783 +0000 UTC m=+0.451458151,LastTimestamp:2026-04-22 17:34:23.556104644 +0000 UTC m=+0.692908010,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:23.614503 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.614439 2565 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-130-38.ec2.internal.18a8be4692a0eaa7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-38.ec2.internal.18a8be4692a0eaa7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-38.ec2.internal,UID:ip-10-0-130-38.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-130-38.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:23.314668199 +0000 UTC m=+0.451471566,LastTimestamp:2026-04-22 17:34:23.556114171 +0000 UTC m=+0.692917537,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:23.623413 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.623356 2565 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-130-38.ec2.internal.18a8be4692a05830\" is forbidden: User \"system:anonymous\" cannot 
patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-38.ec2.internal.18a8be4692a05830 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-38.ec2.internal,UID:ip-10-0-130-38.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-130-38.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:23.314630704 +0000 UTC m=+0.451434071,LastTimestamp:2026-04-22 17:34:23.556160469 +0000 UTC m=+0.692963839,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:23.631758 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.631701 2565 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-130-38.ec2.internal.18a8be4692a0b63f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-38.ec2.internal.18a8be4692a0b63f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-38.ec2.internal,UID:ip-10-0-130-38.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-130-38.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:23.314654783 +0000 UTC m=+0.451458151,LastTimestamp:2026-04-22 17:34:23.556187186 +0000 UTC m=+0.692990553,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:23.639788 ip-10-0-130-38 
kubenswrapper[2565]: E0422 17:34:23.639719 2565 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-130-38.ec2.internal.18a8be4692a0eaa7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-38.ec2.internal.18a8be4692a0eaa7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-38.ec2.internal,UID:ip-10-0-130-38.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-130-38.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:23.314668199 +0000 UTC m=+0.451471566,LastTimestamp:2026-04-22 17:34:23.55620187 +0000 UTC m=+0.693005237,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:23.649251 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.649194 2565 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-130-38.ec2.internal.18a8be4692a05830\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-38.ec2.internal.18a8be4692a05830 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-38.ec2.internal,UID:ip-10-0-130-38.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-130-38.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:23.314630704 +0000 UTC m=+0.451434071,LastTimestamp:2026-04-22 17:34:23.557952453 +0000 UTC 
m=+0.694755826,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:23.659524 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.659465 2565 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-130-38.ec2.internal.18a8be4692a0b63f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-38.ec2.internal.18a8be4692a0b63f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-38.ec2.internal,UID:ip-10-0-130-38.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-130-38.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:23.314654783 +0000 UTC m=+0.451458151,LastTimestamp:2026-04-22 17:34:23.557971657 +0000 UTC m=+0.694775024,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:23.667959 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.667942 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:23.668816 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.668784 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:34:23.668911 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.668825 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:23.668911 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.668839 2565 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:23.668911 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.668876 2565 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-38.ec2.internal" Apr 22 17:34:23.669312 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.669258 2565 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-130-38.ec2.internal.18a8be4692a0eaa7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-38.ec2.internal.18a8be4692a0eaa7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-38.ec2.internal,UID:ip-10-0-130-38.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-130-38.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:23.314668199 +0000 UTC m=+0.451471566,LastTimestamp:2026-04-22 17:34:23.557986425 +0000 UTC m=+0.694789796,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:23.677138 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.677077 2565 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-130-38.ec2.internal.18a8be4692a05830\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-38.ec2.internal.18a8be4692a05830 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-38.ec2.internal,UID:ip-10-0-130-38.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-130-38.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:23.314630704 +0000 UTC m=+0.451434071,LastTimestamp:2026-04-22 17:34:23.668812253 +0000 UTC m=+0.805615619,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:23.684750 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.684732 2565 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-130-38.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-130-38.ec2.internal" Apr 22 17:34:23.684820 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.684752 2565 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-130-38.ec2.internal.18a8be4692a0b63f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-38.ec2.internal.18a8be4692a0b63f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-38.ec2.internal,UID:ip-10-0-130-38.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-130-38.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:23.314654783 +0000 UTC m=+0.451458151,LastTimestamp:2026-04-22 17:34:23.668830353 +0000 UTC 
m=+0.805633719,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:23.694057 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.694000 2565 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-130-38.ec2.internal.18a8be4692a0eaa7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-38.ec2.internal.18a8be4692a0eaa7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-38.ec2.internal,UID:ip-10-0-130-38.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-130-38.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:23.314668199 +0000 UTC m=+0.451471566,LastTimestamp:2026-04-22 17:34:23.668843065 +0000 UTC m=+0.805646431,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:23.697283 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.697256 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3aad222c1b8d9495d15f07d93f8a83ad-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal\" (UID: \"3aad222c1b8d9495d15f07d93f8a83ad\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal" Apr 22 17:34:23.697351 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.697294 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/3aad222c1b8d9495d15f07d93f8a83ad-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal\" (UID: \"3aad222c1b8d9495d15f07d93f8a83ad\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal" Apr 22 17:34:23.697351 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.697317 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c98156a065fc898aadcfd0c780a265db-config\") pod \"kube-apiserver-proxy-ip-10-0-130-38.ec2.internal\" (UID: \"c98156a065fc898aadcfd0c780a265db\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-38.ec2.internal" Apr 22 17:34:23.697351 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.697347 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c98156a065fc898aadcfd0c780a265db-config\") pod \"kube-apiserver-proxy-ip-10-0-130-38.ec2.internal\" (UID: \"c98156a065fc898aadcfd0c780a265db\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-38.ec2.internal" Apr 22 17:34:23.697449 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.697357 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3aad222c1b8d9495d15f07d93f8a83ad-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal\" (UID: \"3aad222c1b8d9495d15f07d93f8a83ad\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal" Apr 22 17:34:23.697449 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.697347 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3aad222c1b8d9495d15f07d93f8a83ad-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal\" (UID: \"3aad222c1b8d9495d15f07d93f8a83ad\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal" Apr 22 17:34:23.895206 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.895117 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal" Apr 22 17:34:23.896967 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:23.896933 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-38.ec2.internal" Apr 22 17:34:23.912838 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:23.912811 2565 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-130-38.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="800ms" Apr 22 17:34:24.085554 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:24.085520 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:24.086495 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:24.086473 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:34:24.086625 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:24.086508 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:24.086625 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:24.086518 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:24.086625 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:24.086544 2565 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-38.ec2.internal" Apr 22 17:34:24.095097 ip-10-0-130-38 kubenswrapper[2565]: 
E0422 17:34:24.095015 2565 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-130-38.ec2.internal.18a8be4692a05830\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-38.ec2.internal.18a8be4692a05830 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-38.ec2.internal,UID:ip-10-0-130-38.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-130-38.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:23.314630704 +0000 UTC m=+0.451434071,LastTimestamp:2026-04-22 17:34:24.086492804 +0000 UTC m=+1.223296170,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:24.097613 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:24.097542 2565 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-130-38.ec2.internal.18a8be4692a0b63f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-38.ec2.internal.18a8be4692a0b63f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-38.ec2.internal,UID:ip-10-0-130-38.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-130-38.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:23.314654783 +0000 UTC m=+0.451458151,LastTimestamp:2026-04-22 17:34:24.086513124 +0000 UTC 
m=+1.223316489,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:24.097728 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:24.097672 2565 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-130-38.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-130-38.ec2.internal" Apr 22 17:34:24.194302 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:24.194231 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-38.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 17:34:24.257269 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:24.257225 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 17:34:24.293004 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:24.292976 2565 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-38.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 17:34:24.335074 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:24.334842 2565 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc98156a065fc898aadcfd0c780a265db.slice/crio-5b8fb249815db12b1b3744fc8c0d4cfe17e5db833c656c812c694d5d568f4de5 WatchSource:0}: Error finding container 5b8fb249815db12b1b3744fc8c0d4cfe17e5db833c656c812c694d5d568f4de5: Status 404 returned error can't find the container with id 5b8fb249815db12b1b3744fc8c0d4cfe17e5db833c656c812c694d5d568f4de5 Apr 22 17:34:24.335311 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:24.335292 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3aad222c1b8d9495d15f07d93f8a83ad.slice/crio-0dc10ea11206dcb1a06731874c7c491c3139f197a19563bfac36cfbfcd713ac9 WatchSource:0}: Error finding container 0dc10ea11206dcb1a06731874c7c491c3139f197a19563bfac36cfbfcd713ac9: Status 404 returned error can't find the container with id 0dc10ea11206dcb1a06731874c7c491c3139f197a19563bfac36cfbfcd713ac9 Apr 22 17:34:24.342754 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:24.342656 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 17:34:24.348411 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:24.348389 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Apr 22 17:34:24.348411 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:24.348332 2565 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal.18a8be46cfea800d openshift-machine-config-operator 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal,UID:3aad222c1b8d9495d15f07d93f8a83ad,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8e1f4bf2daa69c9e69766cebcfced43fa0ee926d4479becdc8a5b05b93ca6e81\",Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:24.342900749 +0000 UTC m=+1.479704102,LastTimestamp:2026-04-22 17:34:24.342900749 +0000 UTC m=+1.479704102,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:24.350360 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:24.350297 2565 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{kube-apiserver-proxy-ip-10-0-130-38.ec2.internal.18a8be46cff08f38 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-proxy-ip-10-0-130-38.ec2.internal,UID:c98156a065fc898aadcfd0c780a265db,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{haproxy},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:294af5c64228434d1ed6ee8ea3ac802e3c999aa847223e3b2efa18425a9fe421\",Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:24.343297848 +0000 UTC m=+1.480101206,LastTimestamp:2026-04-22 17:34:24.343297848 +0000 UTC m=+1.480101206,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:24.454780 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:24.454694 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-38.ec2.internal" event={"ID":"c98156a065fc898aadcfd0c780a265db","Type":"ContainerStarted","Data":"5b8fb249815db12b1b3744fc8c0d4cfe17e5db833c656c812c694d5d568f4de5"} Apr 22 17:34:24.455638 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:24.455619 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal" event={"ID":"3aad222c1b8d9495d15f07d93f8a83ad","Type":"ContainerStarted","Data":"0dc10ea11206dcb1a06731874c7c491c3139f197a19563bfac36cfbfcd713ac9"} Apr 22 17:34:24.582172 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:24.582138 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 17:34:24.722305 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:24.722229 2565 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-130-38.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="1.6s" Apr 22 17:34:24.898475 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:24.898446 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:24.899767 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:24.899743 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientMemory" Apr 22 
17:34:24.899891 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:24.899786 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:24.899891 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:24.899817 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:24.899891 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:24.899853 2565 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-38.ec2.internal" Apr 22 17:34:24.920601 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:24.920555 2565 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-130-38.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-130-38.ec2.internal" Apr 22 17:34:25.295894 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:25.295858 2565 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-38.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 17:34:25.989735 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:25.989710 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Apr 22 17:34:25.989867 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:25.989760 2565 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"kube-system\"" event="&Event{ObjectMeta:{kube-apiserver-proxy-ip-10-0-130-38.ec2.internal.18a8be4731828fdb kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-proxy-ip-10-0-130-38.ec2.internal,UID:c98156a065fc898aadcfd0c780a265db,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{haproxy},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:294af5c64228434d1ed6ee8ea3ac802e3c999aa847223e3b2efa18425a9fe421\" in 1.636s (1.636s including waiting). Image size: 488332864 bytes.,Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:25.980256219 +0000 UTC m=+3.117059572,LastTimestamp:2026-04-22 17:34:25.980256219 +0000 UTC m=+3.117059572,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:25.997757 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:25.997687 2565 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal.18a8be47319368c2 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal,UID:3aad222c1b8d9495d15f07d93f8a83ad,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8e1f4bf2daa69c9e69766cebcfced43fa0ee926d4479becdc8a5b05b93ca6e81\" in 1.638s (1.638s including waiting). 
Image size: 468435751 bytes.,Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:25.981360322 +0000 UTC m=+3.118163681,LastTimestamp:2026-04-22 17:34:25.981360322 +0000 UTC m=+3.118163681,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:26.048768 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:26.048698 2565 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{kube-apiserver-proxy-ip-10-0-130-38.ec2.internal.18a8be4735253e9f kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-proxy-ip-10-0-130-38.ec2.internal,UID:c98156a065fc898aadcfd0c780a265db,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{haproxy},},Reason:Created,Message:Created container: haproxy,Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:26.041249439 +0000 UTC m=+3.178052805,LastTimestamp:2026-04-22 17:34:26.041249439 +0000 UTC m=+3.178052805,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:26.058051 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:26.057969 2565 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{kube-apiserver-proxy-ip-10-0-130-38.ec2.internal.18a8be4735819109 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-proxy-ip-10-0-130-38.ec2.internal,UID:c98156a065fc898aadcfd0c780a265db,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{haproxy},},Reason:Started,Message:Started container haproxy,Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:26.047299849 +0000 UTC m=+3.184103222,LastTimestamp:2026-04-22 17:34:26.047299849 +0000 UTC m=+3.184103222,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:26.293270 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:26.293201 2565 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-38.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 17:34:26.332117 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:26.332087 2565 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-130-38.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="3.2s" Apr 22 17:34:26.425221 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:26.425192 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 17:34:26.458725 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:26.458689 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-38.ec2.internal" 
event={"ID":"c98156a065fc898aadcfd0c780a265db","Type":"ContainerStarted","Data":"4a61077fa875fe85e7c51b12f48ef9cc7301fde279b7655bec32d68da7696171"} Apr 22 17:34:26.458852 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:26.458751 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:26.459716 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:26.459700 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:34:26.459819 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:26.459729 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:26.459819 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:26.459739 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:26.459911 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:26.459899 2565 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-38.ec2.internal\" not found" node="ip-10-0-130-38.ec2.internal" Apr 22 17:34:26.484606 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:26.484536 2565 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal.18a8be474f089489 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal,UID:3aad222c1b8d9495d15f07d93f8a83ad,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:26.475578505 +0000 UTC m=+3.612381874,LastTimestamp:2026-04-22 17:34:26.475578505 +0000 UTC m=+3.612381874,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:26.492840 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:26.492750 2565 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal.18a8be474f7997fd openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal,UID:3aad222c1b8d9495d15f07d93f8a83ad,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:26.482984957 +0000 UTC m=+3.619788326,LastTimestamp:2026-04-22 17:34:26.482984957 +0000 UTC m=+3.619788326,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:26.521169 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:26.521145 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller 
attach/detach" Apr 22 17:34:26.522117 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:26.522101 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:34:26.522218 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:26.522135 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:26.522218 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:26.522149 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:26.522218 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:26.522185 2565 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-38.ec2.internal" Apr 22 17:34:26.537588 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:26.537566 2565 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-130-38.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-130-38.ec2.internal" Apr 22 17:34:26.884299 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:26.884267 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 17:34:27.213005 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:27.212928 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-38.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 
17:34:27.292993 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:27.292965 2565 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-38.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 17:34:27.461159 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:27.461105 2565 generic.go:358] "Generic (PLEG): container finished" podID="3aad222c1b8d9495d15f07d93f8a83ad" containerID="a446ca312e12099d5401e68447d84e98e2ec5c038b72c0ddc637265f10bcc909" exitCode=0 Apr 22 17:34:27.461586 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:27.461199 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:27.461586 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:27.461196 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal" event={"ID":"3aad222c1b8d9495d15f07d93f8a83ad","Type":"ContainerDied","Data":"a446ca312e12099d5401e68447d84e98e2ec5c038b72c0ddc637265f10bcc909"} Apr 22 17:34:27.461586 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:27.461212 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:27.462089 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:27.462071 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:34:27.462089 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:27.462082 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:34:27.462222 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:27.462104 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 
17:34:27.462222 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:27.462117 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:27.462222 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:27.462104 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:27.462222 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:27.462193 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:27.462347 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:27.462261 2565 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-38.ec2.internal\" not found" node="ip-10-0-130-38.ec2.internal" Apr 22 17:34:27.462434 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:27.462419 2565 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-38.ec2.internal\" not found" node="ip-10-0-130-38.ec2.internal" Apr 22 17:34:27.473099 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:27.473023 2565 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal.18a8be4789f743db openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal,UID:3aad222c1b8d9495d15f07d93f8a83ad,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8e1f4bf2daa69c9e69766cebcfced43fa0ee926d4479becdc8a5b05b93ca6e81\" already present on machine,Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:27.464299483 +0000 UTC m=+4.601102857,LastTimestamp:2026-04-22 17:34:27.464299483 +0000 UTC m=+4.601102857,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:27.572297 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:27.572198 2565 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal.18a8be478fc9b8b9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal,UID:3aad222c1b8d9495d15f07d93f8a83ad,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:27.561978041 +0000 UTC m=+4.698781417,LastTimestamp:2026-04-22 17:34:27.561978041 +0000 UTC m=+4.698781417,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:27.580537 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:27.580443 2565 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal.18a8be4790371d5d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal,UID:3aad222c1b8d9495d15f07d93f8a83ad,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:27.569147229 +0000 UTC m=+4.705950594,LastTimestamp:2026-04-22 17:34:27.569147229 +0000 UTC m=+4.705950594,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:28.294307 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:28.294280 2565 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-38.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 17:34:28.464102 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:28.464076 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/0.log" Apr 22 17:34:28.464471 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:28.464389 2565 generic.go:358] "Generic (PLEG): container finished" podID="3aad222c1b8d9495d15f07d93f8a83ad" containerID="5e8bd6a228e1db5dd5fc425f0e5353554370de5959a89d18ac83f75f1988a0fa" exitCode=1 Apr 22 17:34:28.464471 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:28.464418 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal" event={"ID":"3aad222c1b8d9495d15f07d93f8a83ad","Type":"ContainerDied","Data":"5e8bd6a228e1db5dd5fc425f0e5353554370de5959a89d18ac83f75f1988a0fa"} Apr 22 17:34:28.464471 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:28.464466 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:28.465347 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:28.465332 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:34:28.465452 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:28.465356 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:28.465452 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:28.465366 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:28.465626 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:28.465614 2565 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-38.ec2.internal\" not found" node="ip-10-0-130-38.ec2.internal" Apr 22 17:34:28.465669 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:28.465660 2565 scope.go:117] "RemoveContainer" containerID="5e8bd6a228e1db5dd5fc425f0e5353554370de5959a89d18ac83f75f1988a0fa" Apr 22 17:34:28.478141 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:28.478057 2565 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal.18a8be4789f743db\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal.18a8be4789f743db 
openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal,UID:3aad222c1b8d9495d15f07d93f8a83ad,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8e1f4bf2daa69c9e69766cebcfced43fa0ee926d4479becdc8a5b05b93ca6e81\" already present on machine,Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:27.464299483 +0000 UTC m=+4.601102857,LastTimestamp:2026-04-22 17:34:28.46744369 +0000 UTC m=+5.604247065,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:28.570215 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:28.570113 2565 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal.18a8be478fc9b8b9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal.18a8be478fc9b8b9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal,UID:3aad222c1b8d9495d15f07d93f8a83ad,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:27.561978041 +0000 UTC m=+4.698781417,LastTimestamp:2026-04-22 17:34:28.562132542 +0000 UTC 
m=+5.698935919,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:28.578333 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:28.578262 2565 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal.18a8be4790371d5d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal.18a8be4790371d5d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal,UID:3aad222c1b8d9495d15f07d93f8a83ad,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:27.569147229 +0000 UTC m=+4.705950594,LastTimestamp:2026-04-22 17:34:28.569050945 +0000 UTC m=+5.705854319,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:29.294019 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:29.293994 2565 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-38.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 17:34:29.468282 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:29.468256 2565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/1.log" Apr 22 17:34:29.468648 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:29.468632 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/0.log" Apr 22 17:34:29.468952 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:29.468929 2565 generic.go:358] "Generic (PLEG): container finished" podID="3aad222c1b8d9495d15f07d93f8a83ad" containerID="d8af99d921481905703090dba4a68d30f78f76a8811f5b6a0895d78f75db343a" exitCode=1 Apr 22 17:34:29.469003 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:29.468967 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal" event={"ID":"3aad222c1b8d9495d15f07d93f8a83ad","Type":"ContainerDied","Data":"d8af99d921481905703090dba4a68d30f78f76a8811f5b6a0895d78f75db343a"} Apr 22 17:34:29.469003 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:29.468999 2565 scope.go:117] "RemoveContainer" containerID="5e8bd6a228e1db5dd5fc425f0e5353554370de5959a89d18ac83f75f1988a0fa" Apr 22 17:34:29.469057 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:29.469015 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:29.470185 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:29.470168 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:34:29.470259 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:29.470199 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:29.470259 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:29.470209 2565 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:29.470444 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:29.470432 2565 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-38.ec2.internal\" not found" node="ip-10-0-130-38.ec2.internal" Apr 22 17:34:29.470488 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:29.470480 2565 scope.go:117] "RemoveContainer" containerID="d8af99d921481905703090dba4a68d30f78f76a8811f5b6a0895d78f75db343a" Apr 22 17:34:29.471012 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:29.470995 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_openshift-machine-config-operator(3aad222c1b8d9495d15f07d93f8a83ad)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal" podUID="3aad222c1b8d9495d15f07d93f8a83ad" Apr 22 17:34:29.478977 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:29.478887 2565 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal.18a8be48018c9c21 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal,UID:3aad222c1b8d9495d15f07d93f8a83ad,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod 
kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_openshift-machine-config-operator(3aad222c1b8d9495d15f07d93f8a83ad),Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:29.470575649 +0000 UTC m=+6.607379021,LastTimestamp:2026-04-22 17:34:29.470575649 +0000 UTC m=+6.607379021,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:29.541350 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:29.541315 2565 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-130-38.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="6.4s" Apr 22 17:34:29.738475 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:29.738393 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:29.739557 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:29.739540 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:34:29.739637 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:29.739586 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:29.739637 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:29.739596 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:29.739637 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:29.739623 2565 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-38.ec2.internal" Apr 22 17:34:29.758503 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:29.758481 2565 kubelet_node_status.go:116] "Unable to 
register node with API server, error getting existing node" err="nodes \"ip-10-0-130-38.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-130-38.ec2.internal" Apr 22 17:34:30.298965 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:30.298940 2565 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-38.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 17:34:30.471951 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:30.471926 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/1.log" Apr 22 17:34:30.472335 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:30.472320 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:30.473314 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:30.473299 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:34:30.473369 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:30.473328 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:30.473369 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:30.473337 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:30.473535 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:30.473523 2565 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-38.ec2.internal\" not found" node="ip-10-0-130-38.ec2.internal" Apr 22 
17:34:30.473578 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:30.473569 2565 scope.go:117] "RemoveContainer" containerID="d8af99d921481905703090dba4a68d30f78f76a8811f5b6a0895d78f75db343a" Apr 22 17:34:30.473700 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:30.473687 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_openshift-machine-config-operator(3aad222c1b8d9495d15f07d93f8a83ad)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal" podUID="3aad222c1b8d9495d15f07d93f8a83ad" Apr 22 17:34:30.486172 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:30.486102 2565 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal.18a8be48018c9c21\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal.18a8be48018c9c21 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal,UID:3aad222c1b8d9495d15f07d93f8a83ad,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_openshift-machine-config-operator(3aad222c1b8d9495d15f07d93f8a83ad),Source:EventSource{Component:kubelet,Host:ip-10-0-130-38.ec2.internal,},FirstTimestamp:2026-04-22 17:34:29.470575649 +0000 UTC m=+6.607379021,LastTimestamp:2026-04-22 17:34:30.473659223 +0000 UTC m=+7.610462589,Count:2,Type:Warning,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-38.ec2.internal,}" Apr 22 17:34:30.943212 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:30.943171 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Apr 22 17:34:31.304751 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:31.304677 2565 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-38.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 17:34:31.607782 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:31.607694 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-38.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 17:34:31.607782 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:31.607694 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 17:34:31.785565 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:31.785535 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 17:34:32.298358 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:32.298316 2565 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-38.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 17:34:33.295036 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:33.295005 2565 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-38.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 17:34:33.357559 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:33.357521 2565 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-38.ec2.internal\" not found" Apr 22 17:34:34.292913 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:34.292879 2565 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-38.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 17:34:35.293323 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:35.293289 2565 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-38.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 17:34:35.953815 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:35.953770 2565 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-130-38.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group 
\"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Apr 22 17:34:36.159237 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:36.159206 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:36.160509 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:36.160493 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:34:36.160614 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:36.160523 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:36.160614 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:36.160534 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:36.160614 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:36.160562 2565 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-38.ec2.internal" Apr 22 17:34:36.179812 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:36.179785 2565 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-130-38.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-130-38.ec2.internal" Apr 22 17:34:36.296002 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:36.295931 2565 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-38.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 17:34:37.293227 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:37.293191 2565 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io 
"ip-10-0-130-38.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 17:34:38.292658 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:38.292625 2565 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-38.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 17:34:39.295005 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:39.294978 2565 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-38.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 17:34:39.641092 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:39.640993 2565 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-r6q6r" Apr 22 17:34:40.199216 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:40.199180 2565 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 17:34:40.299359 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:40.299332 2565 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-38.ec2.internal" not found Apr 22 17:34:40.314300 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:40.314277 2565 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-38.ec2.internal" not found Apr 22 17:34:40.378068 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:40.378041 2565 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-38.ec2.internal" not found Apr 22 17:34:40.641726 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:40.641627 2565 certificate_manager.go:715] "Certificate rotation deadline determined" 
logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 17:29:39 +0000 UTC" deadline="2027-10-05 03:52:30.744846905 +0000 UTC" Apr 22 17:34:40.641726 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:40.641668 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12730h17m50.103181947s" Apr 22 17:34:40.642588 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:40.642576 2565 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-38.ec2.internal" not found Apr 22 17:34:40.642635 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:40.642594 2565 csi_plugin.go:397] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "ip-10-0-130-38.ec2.internal" not found Apr 22 17:34:40.681290 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:40.681268 2565 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-38.ec2.internal" not found Apr 22 17:34:40.698879 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:40.698863 2565 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-38.ec2.internal" not found Apr 22 17:34:40.755538 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:40.755511 2565 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-38.ec2.internal" not found Apr 22 17:34:41.019717 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:41.019637 2565 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-38.ec2.internal" not found Apr 22 17:34:41.019717 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:41.019664 2565 csi_plugin.go:397] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "ip-10-0-130-38.ec2.internal" not found Apr 22 17:34:41.288015 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:41.287936 2565 nodeinfomanager.go:417] Failed to publish CSINode: nodes 
"ip-10-0-130-38.ec2.internal" not found Apr 22 17:34:41.303335 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:41.303311 2565 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-38.ec2.internal" not found Apr 22 17:34:41.362668 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:41.362647 2565 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-38.ec2.internal" not found Apr 22 17:34:41.629979 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:41.629902 2565 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-38.ec2.internal" not found Apr 22 17:34:41.629979 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:41.629928 2565 csi_plugin.go:397] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "ip-10-0-130-38.ec2.internal" not found Apr 22 17:34:41.954724 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:41.954656 2565 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:34:42.240059 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:42.239982 2565 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:34:42.452705 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:42.452673 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:42.454757 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:42.454740 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:34:42.454878 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:42.454768 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:42.454878 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:42.454781 2565 kubelet_node_status.go:736] "Recording event 
message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:42.455013 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:42.455000 2565 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-38.ec2.internal\" not found" node="ip-10-0-130-38.ec2.internal" Apr 22 17:34:42.455100 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:42.455091 2565 scope.go:117] "RemoveContainer" containerID="d8af99d921481905703090dba4a68d30f78f76a8811f5b6a0895d78f75db343a" Apr 22 17:34:42.959304 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:42.959224 2565 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-130-38.ec2.internal\" not found" node="ip-10-0-130-38.ec2.internal" Apr 22 17:34:43.180179 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:43.180138 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:43.181881 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:43.181862 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:34:43.181981 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:43.181904 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:43.181981 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:43.181916 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:43.181981 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:43.181951 2565 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-38.ec2.internal" Apr 22 17:34:43.191522 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:43.191498 2565 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-38.ec2.internal" Apr 22 
17:34:43.191574 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:43.191521 2565 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-38.ec2.internal\": node \"ip-10-0-130-38.ec2.internal\" not found" Apr 22 17:34:43.219694 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:43.219635 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-38.ec2.internal\" not found" Apr 22 17:34:43.234939 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:43.234922 2565 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:34:43.303479 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:43.303451 2565 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 17:34:43.319793 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:43.319764 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-38.ec2.internal\" not found" Apr 22 17:34:43.330935 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:43.330910 2565 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 17:34:43.357999 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:43.357969 2565 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-38.ec2.internal\" not found" Apr 22 17:34:43.383290 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:43.383263 2565 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-w7bpw" Apr 22 17:34:43.394901 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:43.394880 2565 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-w7bpw" Apr 22 17:34:43.420316 ip-10-0-130-38 
kubenswrapper[2565]: E0422 17:34:43.420289 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-38.ec2.internal\" not found" Apr 22 17:34:43.491881 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:43.491820 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log" Apr 22 17:34:43.492211 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:43.492177 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/1.log" Apr 22 17:34:43.492501 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:43.492481 2565 generic.go:358] "Generic (PLEG): container finished" podID="3aad222c1b8d9495d15f07d93f8a83ad" containerID="d02c296357a1b2e9c16211e24b0117aca2e18aa60e3565e478c01bbf53a0bb77" exitCode=1 Apr 22 17:34:43.492561 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:43.492511 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal" event={"ID":"3aad222c1b8d9495d15f07d93f8a83ad","Type":"ContainerDied","Data":"d02c296357a1b2e9c16211e24b0117aca2e18aa60e3565e478c01bbf53a0bb77"} Apr 22 17:34:43.492561 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:43.492536 2565 scope.go:117] "RemoveContainer" containerID="d8af99d921481905703090dba4a68d30f78f76a8811f5b6a0895d78f75db343a" Apr 22 17:34:43.492630 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:43.492620 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:34:43.493658 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:43.493439 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientMemory" Apr 22 
17:34:43.493658 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:43.493473 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:34:43.493658 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:43.493491 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:34:43.493832 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:43.493723 2565 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-38.ec2.internal\" not found" node="ip-10-0-130-38.ec2.internal" Apr 22 17:34:43.493832 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:43.493763 2565 scope.go:117] "RemoveContainer" containerID="d02c296357a1b2e9c16211e24b0117aca2e18aa60e3565e478c01bbf53a0bb77" Apr 22 17:34:43.493935 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:43.493913 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_openshift-machine-config-operator(3aad222c1b8d9495d15f07d93f8a83ad)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal" podUID="3aad222c1b8d9495d15f07d93f8a83ad" Apr 22 17:34:43.520719 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:43.520698 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-38.ec2.internal\" not found" Apr 22 17:34:43.621004 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:43.620958 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-38.ec2.internal\" not found" Apr 22 17:34:43.721481 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:43.721456 2565 kubelet_node_status.go:515] "Error getting the current 
node from lister" err="node \"ip-10-0-130-38.ec2.internal\" not found" Apr 22 17:34:43.822231 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:43.822154 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-38.ec2.internal\" not found" Apr 22 17:34:43.922641 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:43.922616 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-38.ec2.internal\" not found" Apr 22 17:34:44.023146 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:44.023121 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-38.ec2.internal\" not found" Apr 22 17:34:44.096219 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.096043 2565 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:34:44.195992 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.195956 2565 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal" Apr 22 17:34:44.220401 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.220379 2565 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 17:34:44.220498 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.220487 2565 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-38.ec2.internal" Apr 22 17:34:44.232756 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.232733 2565 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 17:34:44.283030 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.282997 2565 apiserver.go:52] "Watching apiserver" Apr 22 
17:34:44.290613 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.290593 2565 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 17:34:44.290940 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.290921 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-vs8nl","openshift-network-operator/iptables-alerter-r7w4r","kube-system/konnectivity-agent-mtlqw","kube-system/kube-apiserver-proxy-ip-10-0-130-38.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbvw6","openshift-cluster-node-tuning-operator/tuned-tb9hn","openshift-dns/node-resolver-kzckr","openshift-multus/multus-9s5s7","openshift-multus/network-metrics-daemon-wqnlm","openshift-ovn-kubernetes/ovnkube-node-t45c7","openshift-image-registry/node-ca-qkkrr","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal","openshift-multus/multus-additional-cni-plugins-k2j7l"] Apr 22 17:34:44.292485 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.292469 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" Apr 22 17:34:44.293422 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.293403 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-r7w4r" Apr 22 17:34:44.294450 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.294432 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-mtlqw" Apr 22 17:34:44.295390 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.295376 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbvw6" Apr 22 17:34:44.296295 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.296279 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 17:34:44.296372 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.296305 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-pfrnz\"" Apr 22 17:34:44.296424 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.296376 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tb9hn" Apr 22 17:34:44.296626 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.296609 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 17:34:44.296774 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.296751 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 17:34:44.296881 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.296863 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 17:34:44.296987 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.296957 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 17:34:44.297080 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.297018 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 17:34:44.297080 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.297025 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 17:34:44.297080 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.296967 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-2jlgb\""
Apr 22 17:34:44.297080 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.297025 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-wmgt6\""
Apr 22 17:34:44.297309 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.297294 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 17:34:44.297356 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.297346 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 17:34:44.297399 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.297382 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 17:34:44.297449 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.297405 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 17:34:44.297449 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.297426 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 17:34:44.297534 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.297408 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 17:34:44.297534 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.297497 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 17:34:44.297625 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.297610 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-wlfzg\""
Apr 22 17:34:44.297718 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.297703 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-kzckr"
Apr 22 17:34:44.298117 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.298102 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 17:34:44.298753 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.298735 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 17:34:44.298877 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.298861 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.301079 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.301059 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-9x4kg\""
Apr 22 17:34:44.301163 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.301069 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqnlm"
Apr 22 17:34:44.301163 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.301148 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vs8nl"
Apr 22 17:34:44.301255 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:44.301217 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wqnlm" podUID="9b338a48-9f81-4b29-a9ed-ba400bd2e93d"
Apr 22 17:34:44.301255 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:44.301217 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vs8nl" podUID="b99a8ad0-f04e-465e-8515-c98de7f3e43d"
Apr 22 17:34:44.301749 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.301733 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 17:34:44.301969 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.301955 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-9sk4w\""
Apr 22 17:34:44.302041 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.301975 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 17:34:44.302090 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.302056 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 17:34:44.302241 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.302227 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 17:34:44.302389 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.302372 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qkkrr"
Apr 22 17:34:44.302652 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.302634 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 17:34:44.302652 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.302647 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 17:34:44.302762 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.302741 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-jcmgh\""
Apr 22 17:34:44.303573 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.303556 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-k2j7l"
Apr 22 17:34:44.304451 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.304434 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 17:34:44.304645 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.304631 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 17:34:44.305330 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.305313 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 17:34:44.305418 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.305380 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-kc7pm\""
Apr 22 17:34:44.305480 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.305464 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-b9z47\""
Apr 22 17:34:44.305480 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.305475 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 17:34:44.305748 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.305730 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 17:34:44.307967 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.307949 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-host-run-netns\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.308045 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.307979 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-node-log\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.308111 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.308075 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-host-cni-bin\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.308167 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.308116 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.308167 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.308153 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0a131c7a-9e60-4af2-9909-c14202f6723e-env-overrides\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.308259 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.308181 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/4c5994d9-36a8-4d98-a1d9-67f743ccd167-agent-certs\") pod \"konnectivity-agent-mtlqw\" (UID: \"4c5994d9-36a8-4d98-a1d9-67f743ccd167\") " pod="kube-system/konnectivity-agent-mtlqw"
Apr 22 17:34:44.308259 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.308229 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8ce4426c-b706-426d-b90e-58d697c2b5b4-etc-sysctl-d\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn"
Apr 22 17:34:44.308351 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.308262 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-systemd-units\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.308351 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.308312 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/963159e2-ff9f-47ac-9907-a22d03154537-host-slash\") pod \"iptables-alerter-r7w4r\" (UID: \"963159e2-ff9f-47ac-9907-a22d03154537\") " pod="openshift-network-operator/iptables-alerter-r7w4r"
Apr 22 17:34:44.308439 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.308387 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e451b16e-1695-40d9-83f0-1c9b48091c09-socket-dir\") pod \"aws-ebs-csi-driver-node-rbvw6\" (UID: \"e451b16e-1695-40d9-83f0-1c9b48091c09\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbvw6"
Apr 22 17:34:44.308439 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.308426 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e451b16e-1695-40d9-83f0-1c9b48091c09-registration-dir\") pod \"aws-ebs-csi-driver-node-rbvw6\" (UID: \"e451b16e-1695-40d9-83f0-1c9b48091c09\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbvw6"
Apr 22 17:34:44.308512 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.308457 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkvlz\" (UniqueName: \"kubernetes.io/projected/e451b16e-1695-40d9-83f0-1c9b48091c09-kube-api-access-nkvlz\") pod \"aws-ebs-csi-driver-node-rbvw6\" (UID: \"e451b16e-1695-40d9-83f0-1c9b48091c09\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbvw6"
Apr 22 17:34:44.308512 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.308486 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8ce4426c-b706-426d-b90e-58d697c2b5b4-run\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn"
Apr 22 17:34:44.308606 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.308548 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8ce4426c-b706-426d-b90e-58d697c2b5b4-sys\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn"
Apr 22 17:34:44.308657 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.308612 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8ce4426c-b706-426d-b90e-58d697c2b5b4-lib-modules\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn"
Apr 22 17:34:44.308657 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.308643 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/963159e2-ff9f-47ac-9907-a22d03154537-iptables-alerter-script\") pod \"iptables-alerter-r7w4r\" (UID: \"963159e2-ff9f-47ac-9907-a22d03154537\") " pod="openshift-network-operator/iptables-alerter-r7w4r"
Apr 22 17:34:44.308747 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.308687 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/4c5994d9-36a8-4d98-a1d9-67f743ccd167-konnectivity-ca\") pod \"konnectivity-agent-mtlqw\" (UID: \"4c5994d9-36a8-4d98-a1d9-67f743ccd167\") " pod="kube-system/konnectivity-agent-mtlqw"
Apr 22 17:34:44.308747 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.308717 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e451b16e-1695-40d9-83f0-1c9b48091c09-etc-selinux\") pod \"aws-ebs-csi-driver-node-rbvw6\" (UID: \"e451b16e-1695-40d9-83f0-1c9b48091c09\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbvw6"
Apr 22 17:34:44.308747 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.308736 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8ce4426c-b706-426d-b90e-58d697c2b5b4-etc-sysconfig\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn"
Apr 22 17:34:44.308894 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.308760 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8ce4426c-b706-426d-b90e-58d697c2b5b4-host\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn"
Apr 22 17:34:44.308894 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.308787 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8ce4426c-b706-426d-b90e-58d697c2b5b4-etc-tuned\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn"
Apr 22 17:34:44.308894 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.308830 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8ce4426c-b706-426d-b90e-58d697c2b5b4-tmp\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn"
Apr 22 17:34:44.308894 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.308855 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fdfv\" (UniqueName: \"kubernetes.io/projected/8ce4426c-b706-426d-b90e-58d697c2b5b4-kube-api-access-5fdfv\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn"
Apr 22 17:34:44.308894 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.308876 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-run-openvswitch\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.309121 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.308973 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0a131c7a-9e60-4af2-9909-c14202f6723e-ovnkube-script-lib\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.309121 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.309020 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rqtt\" (UniqueName: \"kubernetes.io/projected/0a131c7a-9e60-4af2-9909-c14202f6723e-kube-api-access-4rqtt\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.309121 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.309044 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e451b16e-1695-40d9-83f0-1c9b48091c09-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rbvw6\" (UID: \"e451b16e-1695-40d9-83f0-1c9b48091c09\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbvw6"
Apr 22 17:34:44.309121 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.309071 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e451b16e-1695-40d9-83f0-1c9b48091c09-sys-fs\") pod \"aws-ebs-csi-driver-node-rbvw6\" (UID: \"e451b16e-1695-40d9-83f0-1c9b48091c09\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbvw6"
Apr 22 17:34:44.309121 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.309109 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8ce4426c-b706-426d-b90e-58d697c2b5b4-etc-systemd\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn"
Apr 22 17:34:44.309356 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.309138 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-run-systemd\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.309356 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.309162 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-etc-openvswitch\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.309356 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.309185 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-run-ovn\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.309356 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.309245 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-log-socket\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.309356 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.309269 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0a131c7a-9e60-4af2-9909-c14202f6723e-ovnkube-config\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.309356 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.309311 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zb68\" (UniqueName: \"kubernetes.io/projected/963159e2-ff9f-47ac-9907-a22d03154537-kube-api-access-7zb68\") pod \"iptables-alerter-r7w4r\" (UID: \"963159e2-ff9f-47ac-9907-a22d03154537\") " pod="openshift-network-operator/iptables-alerter-r7w4r"
Apr 22 17:34:44.309586 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.309366 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-host-cni-netd\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.309586 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.309387 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e451b16e-1695-40d9-83f0-1c9b48091c09-device-dir\") pod \"aws-ebs-csi-driver-node-rbvw6\" (UID: \"e451b16e-1695-40d9-83f0-1c9b48091c09\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbvw6"
Apr 22 17:34:44.309586 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.309401 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8ce4426c-b706-426d-b90e-58d697c2b5b4-etc-sysctl-conf\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn"
Apr 22 17:34:44.309586 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.309423 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-host-slash\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.309586 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.309445 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-var-lib-openvswitch\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.309586 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.309461 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8ce4426c-b706-426d-b90e-58d697c2b5b4-etc-modprobe-d\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn"
Apr 22 17:34:44.309586 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.309479 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8ce4426c-b706-426d-b90e-58d697c2b5b4-etc-kubernetes\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn"
Apr 22 17:34:44.309586 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.309497 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-host-kubelet\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.309586 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.309519 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-host-run-ovn-kubernetes\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.309586 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.309542 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0a131c7a-9e60-4af2-9909-c14202f6723e-ovn-node-metrics-cert\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.309586 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.309557 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8ce4426c-b706-426d-b90e-58d697c2b5b4-var-lib-kubelet\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn"
Apr 22 17:34:44.395770 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.395692 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 17:29:43 +0000 UTC" deadline="2027-09-27 08:11:39.235867288 +0000 UTC"
Apr 22 17:34:44.395770 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.395722 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12542h36m54.840147596s"
Apr 22 17:34:44.396348 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.396332 2565 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 17:34:44.409912 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.409892 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4rqtt\" (UniqueName: \"kubernetes.io/projected/0a131c7a-9e60-4af2-9909-c14202f6723e-kube-api-access-4rqtt\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.409987 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.409917 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e451b16e-1695-40d9-83f0-1c9b48091c09-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rbvw6\" (UID: \"e451b16e-1695-40d9-83f0-1c9b48091c09\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbvw6"
Apr 22 17:34:44.409987 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.409939 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4h86\" (UniqueName: \"kubernetes.io/projected/bf130287-76e7-42eb-abee-d3c9dae69a49-kube-api-access-f4h86\") pod \"node-ca-qkkrr\" (UID: \"bf130287-76e7-42eb-abee-d3c9dae69a49\") " pod="openshift-image-registry/node-ca-qkkrr"
Apr 22 17:34:44.409987 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.409956 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-host-run-k8s-cni-cncf-io\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.409987 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.409973 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-host-var-lib-cni-multus\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.410135 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.409997 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-run-systemd\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.410135 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410019 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e451b16e-1695-40d9-83f0-1c9b48091c09-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rbvw6\" (UID: \"e451b16e-1695-40d9-83f0-1c9b48091c09\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbvw6"
Apr 22 17:34:44.410135 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410051 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-run-systemd\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.410135 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410058 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-etc-openvswitch\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.410135 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410020 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-etc-openvswitch\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.410135 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410086 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-log-socket\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.410135 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410103 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0a131c7a-9e60-4af2-9909-c14202f6723e-ovnkube-config\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.410135 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410120 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-log-socket\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.410135 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410124 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-multus-conf-dir\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.410537 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410144 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e451b16e-1695-40d9-83f0-1c9b48091c09-device-dir\") pod \"aws-ebs-csi-driver-node-rbvw6\" (UID: \"e451b16e-1695-40d9-83f0-1c9b48091c09\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbvw6"
Apr 22 17:34:44.410537 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410161 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8ce4426c-b706-426d-b90e-58d697c2b5b4-etc-sysctl-conf\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn"
Apr 22 17:34:44.410537 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410177 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngz6d\" (UniqueName: \"kubernetes.io/projected/9b338a48-9f81-4b29-a9ed-ba400bd2e93d-kube-api-access-ngz6d\") pod \"network-metrics-daemon-wqnlm\" (UID: \"9b338a48-9f81-4b29-a9ed-ba400bd2e93d\") " pod="openshift-multus/network-metrics-daemon-wqnlm"
Apr 22 17:34:44.410537 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410196 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8ce4426c-b706-426d-b90e-58d697c2b5b4-etc-modprobe-d\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn"
Apr 22 17:34:44.410537 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410216 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-etc-kubernetes\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.410537 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410219 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e451b16e-1695-40d9-83f0-1c9b48091c09-device-dir\") pod \"aws-ebs-csi-driver-node-rbvw6\" (UID: \"e451b16e-1695-40d9-83f0-1c9b48091c09\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbvw6"
Apr 22 17:34:44.410537 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410232 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8ce4426c-b706-426d-b90e-58d697c2b5b4-var-lib-kubelet\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn"
Apr 22 17:34:44.410537 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410286 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fc09482b-e40a-4db6-b3ae-ee8c7655cc83-tmp-dir\") pod \"node-resolver-kzckr\" (UID: \"fc09482b-e40a-4db6-b3ae-ee8c7655cc83\") " pod="openshift-dns/node-resolver-kzckr"
Apr 22 17:34:44.410537 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410292 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8ce4426c-b706-426d-b90e-58d697c2b5b4-etc-sysctl-conf\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn"
Apr 22 17:34:44.410537 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410319 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/78349f6c-83ce-40df-b593-af2886a0a4aa-multus-daemon-config\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.410537 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410350 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8ce4426c-b706-426d-b90e-58d697c2b5b4-var-lib-kubelet\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn"
Apr 22 17:34:44.410537 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410353 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-node-log\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.410537 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410403 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-node-log\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.410537 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410407 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-host-cni-bin\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.410537 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410404 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8ce4426c-b706-426d-b90e-58d697c2b5b4-etc-modprobe-d\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn"
Apr 22 17:34:44.410537 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410436 2565 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0a131c7a-9e60-4af2-9909-c14202f6723e-env-overrides\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" Apr 22 17:34:44.410537 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410460 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8ce4426c-b706-426d-b90e-58d697c2b5b4-etc-sysctl-d\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn" Apr 22 17:34:44.411272 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410486 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8ce4426c-b706-426d-b90e-58d697c2b5b4-run\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn" Apr 22 17:34:44.411272 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410496 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-host-cni-bin\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" Apr 22 17:34:44.411272 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410512 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bf130287-76e7-42eb-abee-d3c9dae69a49-serviceca\") pod \"node-ca-qkkrr\" (UID: \"bf130287-76e7-42eb-abee-d3c9dae69a49\") " pod="openshift-image-registry/node-ca-qkkrr" Apr 22 17:34:44.411272 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410533 2565 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8ce4426c-b706-426d-b90e-58d697c2b5b4-run\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn" Apr 22 17:34:44.411272 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410538 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7df0412a-fc05-4359-8ed3-f1f7cc48a8eb-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-k2j7l\" (UID: \"7df0412a-fc05-4359-8ed3-f1f7cc48a8eb\") " pod="openshift-multus/multus-additional-cni-plugins-k2j7l" Apr 22 17:34:44.411272 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410577 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8ce4426c-b706-426d-b90e-58d697c2b5b4-etc-sysctl-d\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn" Apr 22 17:34:44.411272 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410571 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-systemd-units\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" Apr 22 17:34:44.411272 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410603 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0a131c7a-9e60-4af2-9909-c14202f6723e-ovnkube-config\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" Apr 22 17:34:44.411272 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410624 2565 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-systemd-units\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" Apr 22 17:34:44.411272 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410627 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/963159e2-ff9f-47ac-9907-a22d03154537-host-slash\") pod \"iptables-alerter-r7w4r\" (UID: \"963159e2-ff9f-47ac-9907-a22d03154537\") " pod="openshift-network-operator/iptables-alerter-r7w4r" Apr 22 17:34:44.411272 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410658 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e451b16e-1695-40d9-83f0-1c9b48091c09-socket-dir\") pod \"aws-ebs-csi-driver-node-rbvw6\" (UID: \"e451b16e-1695-40d9-83f0-1c9b48091c09\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbvw6" Apr 22 17:34:44.411272 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410684 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nkvlz\" (UniqueName: \"kubernetes.io/projected/e451b16e-1695-40d9-83f0-1c9b48091c09-kube-api-access-nkvlz\") pod \"aws-ebs-csi-driver-node-rbvw6\" (UID: \"e451b16e-1695-40d9-83f0-1c9b48091c09\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbvw6" Apr 22 17:34:44.411272 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410703 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/963159e2-ff9f-47ac-9907-a22d03154537-host-slash\") pod \"iptables-alerter-r7w4r\" (UID: \"963159e2-ff9f-47ac-9907-a22d03154537\") " pod="openshift-network-operator/iptables-alerter-r7w4r" Apr 22 17:34:44.411272 
ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410709 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8ce4426c-b706-426d-b90e-58d697c2b5b4-etc-tuned\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn" Apr 22 17:34:44.411272 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410738 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbxfp\" (UniqueName: \"kubernetes.io/projected/fc09482b-e40a-4db6-b3ae-ee8c7655cc83-kube-api-access-vbxfp\") pod \"node-resolver-kzckr\" (UID: \"fc09482b-e40a-4db6-b3ae-ee8c7655cc83\") " pod="openshift-dns/node-resolver-kzckr" Apr 22 17:34:44.411272 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410764 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b338a48-9f81-4b29-a9ed-ba400bd2e93d-metrics-certs\") pod \"network-metrics-daemon-wqnlm\" (UID: \"9b338a48-9f81-4b29-a9ed-ba400bd2e93d\") " pod="openshift-multus/network-metrics-daemon-wqnlm" Apr 22 17:34:44.411272 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410790 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/963159e2-ff9f-47ac-9907-a22d03154537-iptables-alerter-script\") pod \"iptables-alerter-r7w4r\" (UID: \"963159e2-ff9f-47ac-9907-a22d03154537\") " pod="openshift-network-operator/iptables-alerter-r7w4r" Apr 22 17:34:44.412012 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410834 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/4c5994d9-36a8-4d98-a1d9-67f743ccd167-konnectivity-ca\") pod \"konnectivity-agent-mtlqw\" (UID: 
\"4c5994d9-36a8-4d98-a1d9-67f743ccd167\") " pod="kube-system/konnectivity-agent-mtlqw" Apr 22 17:34:44.412012 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410894 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e451b16e-1695-40d9-83f0-1c9b48091c09-socket-dir\") pod \"aws-ebs-csi-driver-node-rbvw6\" (UID: \"e451b16e-1695-40d9-83f0-1c9b48091c09\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbvw6" Apr 22 17:34:44.412012 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410920 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8ce4426c-b706-426d-b90e-58d697c2b5b4-etc-sysconfig\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn" Apr 22 17:34:44.412012 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410926 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0a131c7a-9e60-4af2-9909-c14202f6723e-env-overrides\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" Apr 22 17:34:44.412012 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410940 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-cnibin\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7" Apr 22 17:34:44.412012 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410983 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8ce4426c-b706-426d-b90e-58d697c2b5b4-etc-sysconfig\") pod \"tuned-tb9hn\" (UID: 
\"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn" Apr 22 17:34:44.412012 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.410984 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-multus-socket-dir-parent\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7" Apr 22 17:34:44.412012 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411028 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0a131c7a-9e60-4af2-9909-c14202f6723e-ovnkube-script-lib\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" Apr 22 17:34:44.412012 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411052 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e451b16e-1695-40d9-83f0-1c9b48091c09-sys-fs\") pod \"aws-ebs-csi-driver-node-rbvw6\" (UID: \"e451b16e-1695-40d9-83f0-1c9b48091c09\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbvw6" Apr 22 17:34:44.412012 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411079 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8ce4426c-b706-426d-b90e-58d697c2b5b4-etc-systemd\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn" Apr 22 17:34:44.412012 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411108 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/7df0412a-fc05-4359-8ed3-f1f7cc48a8eb-cnibin\") pod \"multus-additional-cni-plugins-k2j7l\" (UID: \"7df0412a-fc05-4359-8ed3-f1f7cc48a8eb\") " pod="openshift-multus/multus-additional-cni-plugins-k2j7l" Apr 22 17:34:44.412012 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411118 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e451b16e-1695-40d9-83f0-1c9b48091c09-sys-fs\") pod \"aws-ebs-csi-driver-node-rbvw6\" (UID: \"e451b16e-1695-40d9-83f0-1c9b48091c09\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbvw6" Apr 22 17:34:44.412012 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411135 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-host-run-multus-certs\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7" Apr 22 17:34:44.412012 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411159 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8ce4426c-b706-426d-b90e-58d697c2b5b4-etc-systemd\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn" Apr 22 17:34:44.412012 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411161 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-run-ovn\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" Apr 22 17:34:44.412012 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411198 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7zb68\" (UniqueName: \"kubernetes.io/projected/963159e2-ff9f-47ac-9907-a22d03154537-kube-api-access-7zb68\") pod \"iptables-alerter-r7w4r\" (UID: \"963159e2-ff9f-47ac-9907-a22d03154537\") " pod="openshift-network-operator/iptables-alerter-r7w4r" Apr 22 17:34:44.412012 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411207 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-run-ovn\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" Apr 22 17:34:44.412664 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411223 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7df0412a-fc05-4359-8ed3-f1f7cc48a8eb-system-cni-dir\") pod \"multus-additional-cni-plugins-k2j7l\" (UID: \"7df0412a-fc05-4359-8ed3-f1f7cc48a8eb\") " pod="openshift-multus/multus-additional-cni-plugins-k2j7l" Apr 22 17:34:44.412664 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411248 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7df0412a-fc05-4359-8ed3-f1f7cc48a8eb-os-release\") pod \"multus-additional-cni-plugins-k2j7l\" (UID: \"7df0412a-fc05-4359-8ed3-f1f7cc48a8eb\") " pod="openshift-multus/multus-additional-cni-plugins-k2j7l" Apr 22 17:34:44.412664 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411267 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fc09482b-e40a-4db6-b3ae-ee8c7655cc83-hosts-file\") pod \"node-resolver-kzckr\" (UID: \"fc09482b-e40a-4db6-b3ae-ee8c7655cc83\") " pod="openshift-dns/node-resolver-kzckr" Apr 22 17:34:44.412664 ip-10-0-130-38 kubenswrapper[2565]: I0422 
17:34:44.411285 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-system-cni-dir\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7" Apr 22 17:34:44.412664 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411309 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8k6n\" (UniqueName: \"kubernetes.io/projected/78349f6c-83ce-40df-b593-af2886a0a4aa-kube-api-access-z8k6n\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7" Apr 22 17:34:44.412664 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411338 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-host-cni-netd\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" Apr 22 17:34:44.412664 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411336 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/963159e2-ff9f-47ac-9907-a22d03154537-iptables-alerter-script\") pod \"iptables-alerter-r7w4r\" (UID: \"963159e2-ff9f-47ac-9907-a22d03154537\") " pod="openshift-network-operator/iptables-alerter-r7w4r" Apr 22 17:34:44.412664 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411357 2565 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 17:34:44.412664 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411370 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggcfv\" (UniqueName: \"kubernetes.io/projected/b99a8ad0-f04e-465e-8515-c98de7f3e43d-kube-api-access-ggcfv\") pod \"network-check-target-vs8nl\" (UID: \"b99a8ad0-f04e-465e-8515-c98de7f3e43d\") " pod="openshift-network-diagnostics/network-check-target-vs8nl" Apr 22 17:34:44.412664 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411409 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-host-cni-netd\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" Apr 22 17:34:44.412664 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411409 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/4c5994d9-36a8-4d98-a1d9-67f743ccd167-konnectivity-ca\") pod \"konnectivity-agent-mtlqw\" (UID: \"4c5994d9-36a8-4d98-a1d9-67f743ccd167\") " pod="kube-system/konnectivity-agent-mtlqw" Apr 22 17:34:44.412664 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411417 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/78349f6c-83ce-40df-b593-af2886a0a4aa-cni-binary-copy\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7" Apr 22 17:34:44.412664 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411470 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-host-slash\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" Apr 22 17:34:44.412664 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411496 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-var-lib-openvswitch\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" Apr 22 17:34:44.412664 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411521 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8ce4426c-b706-426d-b90e-58d697c2b5b4-etc-kubernetes\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn" Apr 22 17:34:44.412664 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411549 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf130287-76e7-42eb-abee-d3c9dae69a49-host\") pod \"node-ca-qkkrr\" (UID: \"bf130287-76e7-42eb-abee-d3c9dae69a49\") " pod="openshift-image-registry/node-ca-qkkrr" Apr 22 17:34:44.412664 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411553 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-var-lib-openvswitch\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" Apr 22 17:34:44.413231 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411559 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" 
(UniqueName: \"kubernetes.io/configmap/0a131c7a-9e60-4af2-9909-c14202f6723e-ovnkube-script-lib\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" Apr 22 17:34:44.413231 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411554 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-host-slash\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" Apr 22 17:34:44.413231 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411576 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8ce4426c-b706-426d-b90e-58d697c2b5b4-etc-kubernetes\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn" Apr 22 17:34:44.413231 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411575 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-host-kubelet\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" Apr 22 17:34:44.413231 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411601 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-host-run-ovn-kubernetes\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" Apr 22 17:34:44.413231 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411610 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-host-kubelet\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" Apr 22 17:34:44.413231 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411627 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0a131c7a-9e60-4af2-9909-c14202f6723e-ovn-node-metrics-cert\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" Apr 22 17:34:44.413231 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411647 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-host-run-ovn-kubernetes\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" Apr 22 17:34:44.413231 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411654 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7df0412a-fc05-4359-8ed3-f1f7cc48a8eb-cni-binary-copy\") pod \"multus-additional-cni-plugins-k2j7l\" (UID: \"7df0412a-fc05-4359-8ed3-f1f7cc48a8eb\") " pod="openshift-multus/multus-additional-cni-plugins-k2j7l" Apr 22 17:34:44.413231 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411679 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-multus-cni-dir\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7" Apr 22 17:34:44.413231 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411695 2565 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-host-var-lib-kubelet\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.413231 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411716 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-hostroot\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.413231 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411739 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-host-run-netns\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.413231 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411754 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.413231 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411771 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/4c5994d9-36a8-4d98-a1d9-67f743ccd167-agent-certs\") pod \"konnectivity-agent-mtlqw\" (UID: \"4c5994d9-36a8-4d98-a1d9-67f743ccd167\") " pod="kube-system/konnectivity-agent-mtlqw"
Apr 22 17:34:44.413231 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411793 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8ce4426c-b706-426d-b90e-58d697c2b5b4-sys\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn"
Apr 22 17:34:44.413231 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411821 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-host-run-netns\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.413684 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411832 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8ce4426c-b706-426d-b90e-58d697c2b5b4-lib-modules\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn"
Apr 22 17:34:44.413684 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411838 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.413684 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411860 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7df0412a-fc05-4359-8ed3-f1f7cc48a8eb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-k2j7l\" (UID: \"7df0412a-fc05-4359-8ed3-f1f7cc48a8eb\") " pod="openshift-multus/multus-additional-cni-plugins-k2j7l"
Apr 22 17:34:44.413684 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411883 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8ce4426c-b706-426d-b90e-58d697c2b5b4-sys\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn"
Apr 22 17:34:44.413684 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411907 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e451b16e-1695-40d9-83f0-1c9b48091c09-registration-dir\") pod \"aws-ebs-csi-driver-node-rbvw6\" (UID: \"e451b16e-1695-40d9-83f0-1c9b48091c09\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbvw6"
Apr 22 17:34:44.413684 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.411944 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8ce4426c-b706-426d-b90e-58d697c2b5b4-tmp\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn"
Apr 22 17:34:44.413684 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.412024 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2lsb\" (UniqueName: \"kubernetes.io/projected/7df0412a-fc05-4359-8ed3-f1f7cc48a8eb-kube-api-access-v2lsb\") pod \"multus-additional-cni-plugins-k2j7l\" (UID: \"7df0412a-fc05-4359-8ed3-f1f7cc48a8eb\") " pod="openshift-multus/multus-additional-cni-plugins-k2j7l"
Apr 22 17:34:44.413684 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.412034 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e451b16e-1695-40d9-83f0-1c9b48091c09-registration-dir\") pod \"aws-ebs-csi-driver-node-rbvw6\" (UID: \"e451b16e-1695-40d9-83f0-1c9b48091c09\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbvw6"
Apr 22 17:34:44.413684 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.412051 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-host-run-netns\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.413684 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.412088 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-host-var-lib-cni-bin\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.413684 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.412121 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8ce4426c-b706-426d-b90e-58d697c2b5b4-lib-modules\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn"
Apr 22 17:34:44.413684 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.412107 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e451b16e-1695-40d9-83f0-1c9b48091c09-etc-selinux\") pod \"aws-ebs-csi-driver-node-rbvw6\" (UID: \"e451b16e-1695-40d9-83f0-1c9b48091c09\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbvw6"
Apr 22 17:34:44.413684 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.412211 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8ce4426c-b706-426d-b90e-58d697c2b5b4-host\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn"
Apr 22 17:34:44.413684 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.412222 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e451b16e-1695-40d9-83f0-1c9b48091c09-etc-selinux\") pod \"aws-ebs-csi-driver-node-rbvw6\" (UID: \"e451b16e-1695-40d9-83f0-1c9b48091c09\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbvw6"
Apr 22 17:34:44.413684 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.412251 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5fdfv\" (UniqueName: \"kubernetes.io/projected/8ce4426c-b706-426d-b90e-58d697c2b5b4-kube-api-access-5fdfv\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn"
Apr 22 17:34:44.413684 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.412270 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8ce4426c-b706-426d-b90e-58d697c2b5b4-host\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn"
Apr 22 17:34:44.413684 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.412281 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7df0412a-fc05-4359-8ed3-f1f7cc48a8eb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-k2j7l\" (UID: \"7df0412a-fc05-4359-8ed3-f1f7cc48a8eb\") " pod="openshift-multus/multus-additional-cni-plugins-k2j7l"
Apr 22 17:34:44.414157 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.412322 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-os-release\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.414157 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.412348 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-run-openvswitch\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.414157 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.412421 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0a131c7a-9e60-4af2-9909-c14202f6723e-run-openvswitch\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.416081 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.416061 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8ce4426c-b706-426d-b90e-58d697c2b5b4-etc-tuned\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn"
Apr 22 17:34:44.416170 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.416084 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8ce4426c-b706-426d-b90e-58d697c2b5b4-tmp\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn"
Apr 22 17:34:44.416288 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.416272 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/4c5994d9-36a8-4d98-a1d9-67f743ccd167-agent-certs\") pod \"konnectivity-agent-mtlqw\" (UID: \"4c5994d9-36a8-4d98-a1d9-67f743ccd167\") " pod="kube-system/konnectivity-agent-mtlqw"
Apr 22 17:34:44.416327 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.416289 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0a131c7a-9e60-4af2-9909-c14202f6723e-ovn-node-metrics-cert\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.418219 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.418200 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rqtt\" (UniqueName: \"kubernetes.io/projected/0a131c7a-9e60-4af2-9909-c14202f6723e-kube-api-access-4rqtt\") pod \"ovnkube-node-t45c7\" (UID: \"0a131c7a-9e60-4af2-9909-c14202f6723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:34:44.418506 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.418476 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkvlz\" (UniqueName: \"kubernetes.io/projected/e451b16e-1695-40d9-83f0-1c9b48091c09-kube-api-access-nkvlz\") pod \"aws-ebs-csi-driver-node-rbvw6\" (UID: \"e451b16e-1695-40d9-83f0-1c9b48091c09\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbvw6"
Apr 22 17:34:44.420545 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.420524 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fdfv\" (UniqueName: \"kubernetes.io/projected/8ce4426c-b706-426d-b90e-58d697c2b5b4-kube-api-access-5fdfv\") pod \"tuned-tb9hn\" (UID: \"8ce4426c-b706-426d-b90e-58d697c2b5b4\") " pod="openshift-cluster-node-tuning-operator/tuned-tb9hn"
Apr 22 17:34:44.420755 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.420739 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zb68\" (UniqueName: \"kubernetes.io/projected/963159e2-ff9f-47ac-9907-a22d03154537-kube-api-access-7zb68\") pod \"iptables-alerter-r7w4r\" (UID: \"963159e2-ff9f-47ac-9907-a22d03154537\") " pod="openshift-network-operator/iptables-alerter-r7w4r"
Apr 22 17:34:44.472957 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.472912 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-38.ec2.internal" podStartSLOduration=0.472900375 podStartE2EDuration="472.900375ms" podCreationTimestamp="2026-04-22 17:34:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:34:44.472862111 +0000 UTC m=+21.609665482" watchObservedRunningTime="2026-04-22 17:34:44.472900375 +0000 UTC m=+21.609703727"
Apr 22 17:34:44.494743 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.494722 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log"
Apr 22 17:34:44.495358 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.495343 2565 scope.go:117] "RemoveContainer" containerID="d02c296357a1b2e9c16211e24b0117aca2e18aa60e3565e478c01bbf53a0bb77"
Apr 22 17:34:44.495503 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:44.495488 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_openshift-machine-config-operator(3aad222c1b8d9495d15f07d93f8a83ad)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal" podUID="3aad222c1b8d9495d15f07d93f8a83ad"
Apr 22 17:34:44.513149 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.513130 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7df0412a-fc05-4359-8ed3-f1f7cc48a8eb-cni-binary-copy\") pod \"multus-additional-cni-plugins-k2j7l\" (UID: \"7df0412a-fc05-4359-8ed3-f1f7cc48a8eb\") " pod="openshift-multus/multus-additional-cni-plugins-k2j7l"
Apr 22 17:34:44.513196 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.513155 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-multus-cni-dir\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.513196 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.513170 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-host-var-lib-kubelet\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.513196 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.513185 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-hostroot\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.513281 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.513242 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-host-var-lib-kubelet\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.513322 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.513297 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7df0412a-fc05-4359-8ed3-f1f7cc48a8eb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-k2j7l\" (UID: \"7df0412a-fc05-4359-8ed3-f1f7cc48a8eb\") " pod="openshift-multus/multus-additional-cni-plugins-k2j7l"
Apr 22 17:34:44.513368 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.513335 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2lsb\" (UniqueName: \"kubernetes.io/projected/7df0412a-fc05-4359-8ed3-f1f7cc48a8eb-kube-api-access-v2lsb\") pod \"multus-additional-cni-plugins-k2j7l\" (UID: \"7df0412a-fc05-4359-8ed3-f1f7cc48a8eb\") " pod="openshift-multus/multus-additional-cni-plugins-k2j7l"
Apr 22 17:34:44.513368 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.513357 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-multus-cni-dir\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.513458 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.513308 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-hostroot\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.513458 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.513381 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-host-run-netns\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.513458 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.513409 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-host-run-netns\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.513458 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.513411 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-host-var-lib-cni-bin\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.513458 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.513431 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7df0412a-fc05-4359-8ed3-f1f7cc48a8eb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-k2j7l\" (UID: \"7df0412a-fc05-4359-8ed3-f1f7cc48a8eb\") " pod="openshift-multus/multus-additional-cni-plugins-k2j7l"
Apr 22 17:34:44.513458 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.513446 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-os-release\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.513724 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.513472 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f4h86\" (UniqueName: \"kubernetes.io/projected/bf130287-76e7-42eb-abee-d3c9dae69a49-kube-api-access-f4h86\") pod \"node-ca-qkkrr\" (UID: \"bf130287-76e7-42eb-abee-d3c9dae69a49\") " pod="openshift-image-registry/node-ca-qkkrr"
Apr 22 17:34:44.513724 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.513490 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-host-var-lib-cni-bin\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.513724 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.513496 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-host-run-k8s-cni-cncf-io\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.513724 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.513533 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-host-var-lib-cni-multus\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.513724 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.513560 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-multus-conf-dir\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.513724 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.513572 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-os-release\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.513724 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.513586 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ngz6d\" (UniqueName: \"kubernetes.io/projected/9b338a48-9f81-4b29-a9ed-ba400bd2e93d-kube-api-access-ngz6d\") pod \"network-metrics-daemon-wqnlm\" (UID: \"9b338a48-9f81-4b29-a9ed-ba400bd2e93d\") " pod="openshift-multus/network-metrics-daemon-wqnlm"
Apr 22 17:34:44.513724 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.513591 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-host-var-lib-cni-multus\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.513724 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.513602 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7df0412a-fc05-4359-8ed3-f1f7cc48a8eb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-k2j7l\" (UID: \"7df0412a-fc05-4359-8ed3-f1f7cc48a8eb\") " pod="openshift-multus/multus-additional-cni-plugins-k2j7l"
Apr 22 17:34:44.513724 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.513612 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-etc-kubernetes\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.513724 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.513630 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-host-run-k8s-cni-cncf-io\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.513724 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.513640 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fc09482b-e40a-4db6-b3ae-ee8c7655cc83-tmp-dir\") pod \"node-resolver-kzckr\" (UID: \"fc09482b-e40a-4db6-b3ae-ee8c7655cc83\") " pod="openshift-dns/node-resolver-kzckr"
Apr 22 17:34:44.513724 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.513643 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-multus-conf-dir\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.513724 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.513665 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/78349f6c-83ce-40df-b593-af2886a0a4aa-multus-daemon-config\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.513724 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.513669 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-etc-kubernetes\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.513724 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.513695 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bf130287-76e7-42eb-abee-d3c9dae69a49-serviceca\") pod \"node-ca-qkkrr\" (UID: \"bf130287-76e7-42eb-abee-d3c9dae69a49\") " pod="openshift-image-registry/node-ca-qkkrr"
Apr 22 17:34:44.513724 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.513722 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7df0412a-fc05-4359-8ed3-f1f7cc48a8eb-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-k2j7l\" (UID: \"7df0412a-fc05-4359-8ed3-f1f7cc48a8eb\") " pod="openshift-multus/multus-additional-cni-plugins-k2j7l"
Apr 22 17:34:44.514426 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.513735 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7df0412a-fc05-4359-8ed3-f1f7cc48a8eb-cni-binary-copy\") pod \"multus-additional-cni-plugins-k2j7l\" (UID: \"7df0412a-fc05-4359-8ed3-f1f7cc48a8eb\") " pod="openshift-multus/multus-additional-cni-plugins-k2j7l"
Apr 22 17:34:44.514426 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.513753 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vbxfp\" (UniqueName: \"kubernetes.io/projected/fc09482b-e40a-4db6-b3ae-ee8c7655cc83-kube-api-access-vbxfp\") pod \"node-resolver-kzckr\" (UID: \"fc09482b-e40a-4db6-b3ae-ee8c7655cc83\") " pod="openshift-dns/node-resolver-kzckr"
Apr 22 17:34:44.514426 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.513855 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7df0412a-fc05-4359-8ed3-f1f7cc48a8eb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-k2j7l\" (UID: \"7df0412a-fc05-4359-8ed3-f1f7cc48a8eb\") " pod="openshift-multus/multus-additional-cni-plugins-k2j7l"
Apr 22 17:34:44.514426 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.513945 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fc09482b-e40a-4db6-b3ae-ee8c7655cc83-tmp-dir\") pod \"node-resolver-kzckr\" (UID: \"fc09482b-e40a-4db6-b3ae-ee8c7655cc83\") " pod="openshift-dns/node-resolver-kzckr"
Apr 22 17:34:44.514426 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.513978 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b338a48-9f81-4b29-a9ed-ba400bd2e93d-metrics-certs\") pod \"network-metrics-daemon-wqnlm\" (UID: \"9b338a48-9f81-4b29-a9ed-ba400bd2e93d\") " pod="openshift-multus/network-metrics-daemon-wqnlm"
Apr 22 17:34:44.514426 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.514013 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-cnibin\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.514426 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.514039 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-multus-socket-dir-parent\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.514426 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.514067 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7df0412a-fc05-4359-8ed3-f1f7cc48a8eb-cnibin\") pod \"multus-additional-cni-plugins-k2j7l\" (UID: \"7df0412a-fc05-4359-8ed3-f1f7cc48a8eb\") " pod="openshift-multus/multus-additional-cni-plugins-k2j7l"
Apr 22 17:34:44.514426 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.514091 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-host-run-multus-certs\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.514426 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:44.514104 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:34:44.514426 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.514120 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7df0412a-fc05-4359-8ed3-f1f7cc48a8eb-system-cni-dir\") pod \"multus-additional-cni-plugins-k2j7l\" (UID: \"7df0412a-fc05-4359-8ed3-f1f7cc48a8eb\") " pod="openshift-multus/multus-additional-cni-plugins-k2j7l"
Apr 22 17:34:44.514426 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.514119 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7df0412a-fc05-4359-8ed3-f1f7cc48a8eb-cnibin\") pod \"multus-additional-cni-plugins-k2j7l\" (UID: \"7df0412a-fc05-4359-8ed3-f1f7cc48a8eb\") " pod="openshift-multus/multus-additional-cni-plugins-k2j7l"
Apr 22 17:34:44.514426 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.514104 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-cnibin\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.514426 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.514146 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-host-run-multus-certs\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.514426 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.514152 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7df0412a-fc05-4359-8ed3-f1f7cc48a8eb-os-release\") pod \"multus-additional-cni-plugins-k2j7l\" (UID: \"7df0412a-fc05-4359-8ed3-f1f7cc48a8eb\") " pod="openshift-multus/multus-additional-cni-plugins-k2j7l"
Apr 22 17:34:44.514426 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.514183 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7df0412a-fc05-4359-8ed3-f1f7cc48a8eb-system-cni-dir\") pod \"multus-additional-cni-plugins-k2j7l\" (UID: \"7df0412a-fc05-4359-8ed3-f1f7cc48a8eb\") " pod="openshift-multus/multus-additional-cni-plugins-k2j7l"
Apr 22 17:34:44.514426 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:44.514207 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b338a48-9f81-4b29-a9ed-ba400bd2e93d-metrics-certs podName:9b338a48-9f81-4b29-a9ed-ba400bd2e93d nodeName:}" failed. No retries permitted until 2026-04-22 17:34:45.014152422 +0000 UTC m=+22.150955796 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b338a48-9f81-4b29-a9ed-ba400bd2e93d-metrics-certs") pod "network-metrics-daemon-wqnlm" (UID: "9b338a48-9f81-4b29-a9ed-ba400bd2e93d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:34:44.514940 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.514240 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7df0412a-fc05-4359-8ed3-f1f7cc48a8eb-os-release\") pod \"multus-additional-cni-plugins-k2j7l\" (UID: \"7df0412a-fc05-4359-8ed3-f1f7cc48a8eb\") " pod="openshift-multus/multus-additional-cni-plugins-k2j7l"
Apr 22 17:34:44.514940 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.514246 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/78349f6c-83ce-40df-b593-af2886a0a4aa-multus-daemon-config\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.514940 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.514258 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fc09482b-e40a-4db6-b3ae-ee8c7655cc83-hosts-file\") pod \"node-resolver-kzckr\" (UID: \"fc09482b-e40a-4db6-b3ae-ee8c7655cc83\") " pod="openshift-dns/node-resolver-kzckr"
Apr 22 17:34:44.514940 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.514283 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-system-cni-dir\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.514940 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.514294 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-multus-socket-dir-parent\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.514940 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.514302 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8k6n\" (UniqueName: \"kubernetes.io/projected/78349f6c-83ce-40df-b593-af2886a0a4aa-kube-api-access-z8k6n\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.514940 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.514330 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fc09482b-e40a-4db6-b3ae-ee8c7655cc83-hosts-file\") pod \"node-resolver-kzckr\" (UID: \"fc09482b-e40a-4db6-b3ae-ee8c7655cc83\") " pod="openshift-dns/node-resolver-kzckr"
Apr 22 17:34:44.514940 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.514356 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggcfv\" (UniqueName: \"kubernetes.io/projected/b99a8ad0-f04e-465e-8515-c98de7f3e43d-kube-api-access-ggcfv\") pod \"network-check-target-vs8nl\" (UID: \"b99a8ad0-f04e-465e-8515-c98de7f3e43d\") " pod="openshift-network-diagnostics/network-check-target-vs8nl"
Apr 22 17:34:44.514940 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.514361 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/78349f6c-83ce-40df-b593-af2886a0a4aa-system-cni-dir\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.514940 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.514383 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/78349f6c-83ce-40df-b593-af2886a0a4aa-cni-binary-copy\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.514940 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.514415 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf130287-76e7-42eb-abee-d3c9dae69a49-host\") pod \"node-ca-qkkrr\" (UID: \"bf130287-76e7-42eb-abee-d3c9dae69a49\") " pod="openshift-image-registry/node-ca-qkkrr"
Apr 22 17:34:44.514940 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.514472 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf130287-76e7-42eb-abee-d3c9dae69a49-host\") pod \"node-ca-qkkrr\" (UID: \"bf130287-76e7-42eb-abee-d3c9dae69a49\") " pod="openshift-image-registry/node-ca-qkkrr"
Apr 22 17:34:44.514940 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.514613 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7df0412a-fc05-4359-8ed3-f1f7cc48a8eb-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-k2j7l\" (UID: \"7df0412a-fc05-4359-8ed3-f1f7cc48a8eb\") " pod="openshift-multus/multus-additional-cni-plugins-k2j7l"
Apr 22 17:34:44.514940 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.514776 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/78349f6c-83ce-40df-b593-af2886a0a4aa-cni-binary-copy\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7"
Apr 22 17:34:44.514940 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.514876 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bf130287-76e7-42eb-abee-d3c9dae69a49-serviceca\") pod \"node-ca-qkkrr\" (UID: \"bf130287-76e7-42eb-abee-d3c9dae69a49\") " pod="openshift-image-registry/node-ca-qkkrr"
Apr 22 17:34:44.534357 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:44.534327 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:34:44.534357 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:44.534350 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:34:44.534514 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:44.534364 2565 projected.go:194] Error preparing data for projected volume kube-api-access-ggcfv for pod openshift-network-diagnostics/network-check-target-vs8nl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:44.534514 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:44.534426 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b99a8ad0-f04e-465e-8515-c98de7f3e43d-kube-api-access-ggcfv podName:b99a8ad0-f04e-465e-8515-c98de7f3e43d nodeName:}" failed. No retries permitted until 2026-04-22 17:34:45.034408711 +0000 UTC m=+22.171212067 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-ggcfv" (UniqueName: "kubernetes.io/projected/b99a8ad0-f04e-465e-8515-c98de7f3e43d-kube-api-access-ggcfv") pod "network-check-target-vs8nl" (UID: "b99a8ad0-f04e-465e-8515-c98de7f3e43d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:44.534514 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.534423 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2lsb\" (UniqueName: \"kubernetes.io/projected/7df0412a-fc05-4359-8ed3-f1f7cc48a8eb-kube-api-access-v2lsb\") pod \"multus-additional-cni-plugins-k2j7l\" (UID: \"7df0412a-fc05-4359-8ed3-f1f7cc48a8eb\") " pod="openshift-multus/multus-additional-cni-plugins-k2j7l" Apr 22 17:34:44.535485 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.535462 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbxfp\" (UniqueName: \"kubernetes.io/projected/fc09482b-e40a-4db6-b3ae-ee8c7655cc83-kube-api-access-vbxfp\") pod \"node-resolver-kzckr\" (UID: \"fc09482b-e40a-4db6-b3ae-ee8c7655cc83\") " pod="openshift-dns/node-resolver-kzckr" Apr 22 17:34:44.535587 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.535463 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngz6d\" (UniqueName: 
\"kubernetes.io/projected/9b338a48-9f81-4b29-a9ed-ba400bd2e93d-kube-api-access-ngz6d\") pod \"network-metrics-daemon-wqnlm\" (UID: \"9b338a48-9f81-4b29-a9ed-ba400bd2e93d\") " pod="openshift-multus/network-metrics-daemon-wqnlm" Apr 22 17:34:44.536634 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.536611 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4h86\" (UniqueName: \"kubernetes.io/projected/bf130287-76e7-42eb-abee-d3c9dae69a49-kube-api-access-f4h86\") pod \"node-ca-qkkrr\" (UID: \"bf130287-76e7-42eb-abee-d3c9dae69a49\") " pod="openshift-image-registry/node-ca-qkkrr" Apr 22 17:34:44.536848 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.536831 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8k6n\" (UniqueName: \"kubernetes.io/projected/78349f6c-83ce-40df-b593-af2886a0a4aa-kube-api-access-z8k6n\") pod \"multus-9s5s7\" (UID: \"78349f6c-83ce-40df-b593-af2886a0a4aa\") " pod="openshift-multus/multus-9s5s7" Apr 22 17:34:44.603177 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.603154 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" Apr 22 17:34:44.608820 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.608775 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-r7w4r" Apr 22 17:34:44.609161 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:44.609140 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a131c7a_9e60_4af2_9909_c14202f6723e.slice/crio-5ccabeaddeb242e88fc0fbdb080ca169d25a3b2dd46b34d88ce80d540507cadf WatchSource:0}: Error finding container 5ccabeaddeb242e88fc0fbdb080ca169d25a3b2dd46b34d88ce80d540507cadf: Status 404 returned error can't find the container with id 5ccabeaddeb242e88fc0fbdb080ca169d25a3b2dd46b34d88ce80d540507cadf Apr 22 17:34:44.614287 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:44.614261 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod963159e2_ff9f_47ac_9907_a22d03154537.slice/crio-41a4a346399fa22241fec8eb9552577852b66b2d188359800fab284035bb634c WatchSource:0}: Error finding container 41a4a346399fa22241fec8eb9552577852b66b2d188359800fab284035bb634c: Status 404 returned error can't find the container with id 41a4a346399fa22241fec8eb9552577852b66b2d188359800fab284035bb634c Apr 22 17:34:44.614825 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.614787 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-mtlqw" Apr 22 17:34:44.618714 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.618697 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbvw6" Apr 22 17:34:44.623018 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:44.622993 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c5994d9_36a8_4d98_a1d9_67f743ccd167.slice/crio-2cee6e0999df5143af95f0fc9477c51f77f1b23b0f5b18c1884395e2952489cb WatchSource:0}: Error finding container 2cee6e0999df5143af95f0fc9477c51f77f1b23b0f5b18c1884395e2952489cb: Status 404 returned error can't find the container with id 2cee6e0999df5143af95f0fc9477c51f77f1b23b0f5b18c1884395e2952489cb Apr 22 17:34:44.624818 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.624787 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tb9hn" Apr 22 17:34:44.629714 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:44.629692 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode451b16e_1695_40d9_83f0_1c9b48091c09.slice/crio-e9152e54a395fc8dfa1a3633464d8b1bea5ef231fcde4d380c836c89695772ff WatchSource:0}: Error finding container e9152e54a395fc8dfa1a3633464d8b1bea5ef231fcde4d380c836c89695772ff: Status 404 returned error can't find the container with id e9152e54a395fc8dfa1a3633464d8b1bea5ef231fcde4d380c836c89695772ff Apr 22 17:34:44.629929 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.629912 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-kzckr" Apr 22 17:34:44.633899 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:44.633874 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ce4426c_b706_426d_b90e_58d697c2b5b4.slice/crio-6d6e87422e06b795c5beefac385d884e9004eaf2113c6898fe6d04dcaf4da145 WatchSource:0}: Error finding container 6d6e87422e06b795c5beefac385d884e9004eaf2113c6898fe6d04dcaf4da145: Status 404 returned error can't find the container with id 6d6e87422e06b795c5beefac385d884e9004eaf2113c6898fe6d04dcaf4da145 Apr 22 17:34:44.634834 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.634817 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9s5s7" Apr 22 17:34:44.639903 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.639840 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qkkrr" Apr 22 17:34:44.642558 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:44.642537 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78349f6c_83ce_40df_b593_af2886a0a4aa.slice/crio-c1a3dd0d50342571ad3e60656eadfd54c2de3398bcb905dca5b8e0a1458711dd WatchSource:0}: Error finding container c1a3dd0d50342571ad3e60656eadfd54c2de3398bcb905dca5b8e0a1458711dd: Status 404 returned error can't find the container with id c1a3dd0d50342571ad3e60656eadfd54c2de3398bcb905dca5b8e0a1458711dd Apr 22 17:34:44.644154 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:44.644136 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-k2j7l" Apr 22 17:34:44.647757 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:44.647709 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf130287_76e7_42eb_abee_d3c9dae69a49.slice/crio-569a7c1478276916a91b0078efaea4a87ffb57975a74ee34a72b2eea824d3cac WatchSource:0}: Error finding container 569a7c1478276916a91b0078efaea4a87ffb57975a74ee34a72b2eea824d3cac: Status 404 returned error can't find the container with id 569a7c1478276916a91b0078efaea4a87ffb57975a74ee34a72b2eea824d3cac Apr 22 17:34:44.651302 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:34:44.651276 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7df0412a_fc05_4359_8ed3_f1f7cc48a8eb.slice/crio-bafb4af62e3394cf421fad38ad91e8204ce64ea3ffdc9ed5859e000d26d13c08 WatchSource:0}: Error finding container bafb4af62e3394cf421fad38ad91e8204ce64ea3ffdc9ed5859e000d26d13c08: Status 404 returned error can't find the container with id bafb4af62e3394cf421fad38ad91e8204ce64ea3ffdc9ed5859e000d26d13c08 Apr 22 17:34:45.018226 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:45.018146 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b338a48-9f81-4b29-a9ed-ba400bd2e93d-metrics-certs\") pod \"network-metrics-daemon-wqnlm\" (UID: \"9b338a48-9f81-4b29-a9ed-ba400bd2e93d\") " pod="openshift-multus/network-metrics-daemon-wqnlm" Apr 22 17:34:45.018867 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:45.018378 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:34:45.018867 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:45.018456 2565 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/9b338a48-9f81-4b29-a9ed-ba400bd2e93d-metrics-certs podName:9b338a48-9f81-4b29-a9ed-ba400bd2e93d nodeName:}" failed. No retries permitted until 2026-04-22 17:34:46.018436461 +0000 UTC m=+23.155239828 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b338a48-9f81-4b29-a9ed-ba400bd2e93d-metrics-certs") pod "network-metrics-daemon-wqnlm" (UID: "9b338a48-9f81-4b29-a9ed-ba400bd2e93d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:34:45.118738 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:45.118699 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggcfv\" (UniqueName: \"kubernetes.io/projected/b99a8ad0-f04e-465e-8515-c98de7f3e43d-kube-api-access-ggcfv\") pod \"network-check-target-vs8nl\" (UID: \"b99a8ad0-f04e-465e-8515-c98de7f3e43d\") " pod="openshift-network-diagnostics/network-check-target-vs8nl" Apr 22 17:34:45.118941 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:45.118889 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:34:45.118941 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:45.118916 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:34:45.118941 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:45.118929 2565 projected.go:194] Error preparing data for projected volume kube-api-access-ggcfv for pod openshift-network-diagnostics/network-check-target-vs8nl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:45.119105 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:45.118991 2565 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b99a8ad0-f04e-465e-8515-c98de7f3e43d-kube-api-access-ggcfv podName:b99a8ad0-f04e-465e-8515-c98de7f3e43d nodeName:}" failed. No retries permitted until 2026-04-22 17:34:46.118970961 +0000 UTC m=+23.255774317 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-ggcfv" (UniqueName: "kubernetes.io/projected/b99a8ad0-f04e-465e-8515-c98de7f3e43d-kube-api-access-ggcfv") pod "network-check-target-vs8nl" (UID: "b99a8ad0-f04e-465e-8515-c98de7f3e43d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:45.396597 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:45.396482 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 17:29:43 +0000 UTC" deadline="2027-11-12 23:54:10.28234971 +0000 UTC" Apr 22 17:34:45.396597 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:45.396525 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13662h19m24.885829169s" Apr 22 17:34:45.500153 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:45.500086 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9s5s7" event={"ID":"78349f6c-83ce-40df-b593-af2886a0a4aa","Type":"ContainerStarted","Data":"c1a3dd0d50342571ad3e60656eadfd54c2de3398bcb905dca5b8e0a1458711dd"} Apr 22 17:34:45.503224 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:45.503168 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kzckr" event={"ID":"fc09482b-e40a-4db6-b3ae-ee8c7655cc83","Type":"ContainerStarted","Data":"4cff660076bcb92e3f5dd2a95cb46bead9471b6994e9282fcf2b6d58105567a6"} Apr 22 17:34:45.507521 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:45.507454 2565 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tb9hn" event={"ID":"8ce4426c-b706-426d-b90e-58d697c2b5b4","Type":"ContainerStarted","Data":"6d6e87422e06b795c5beefac385d884e9004eaf2113c6898fe6d04dcaf4da145"} Apr 22 17:34:45.510875 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:45.510672 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbvw6" event={"ID":"e451b16e-1695-40d9-83f0-1c9b48091c09","Type":"ContainerStarted","Data":"e9152e54a395fc8dfa1a3633464d8b1bea5ef231fcde4d380c836c89695772ff"} Apr 22 17:34:45.519494 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:45.519470 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-mtlqw" event={"ID":"4c5994d9-36a8-4d98-a1d9-67f743ccd167","Type":"ContainerStarted","Data":"2cee6e0999df5143af95f0fc9477c51f77f1b23b0f5b18c1884395e2952489cb"} Apr 22 17:34:45.531004 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:45.530868 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" event={"ID":"0a131c7a-9e60-4af2-9909-c14202f6723e","Type":"ContainerStarted","Data":"5ccabeaddeb242e88fc0fbdb080ca169d25a3b2dd46b34d88ce80d540507cadf"} Apr 22 17:34:45.540319 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:45.540201 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k2j7l" event={"ID":"7df0412a-fc05-4359-8ed3-f1f7cc48a8eb","Type":"ContainerStarted","Data":"bafb4af62e3394cf421fad38ad91e8204ce64ea3ffdc9ed5859e000d26d13c08"} Apr 22 17:34:45.549718 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:45.549577 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qkkrr" event={"ID":"bf130287-76e7-42eb-abee-d3c9dae69a49","Type":"ContainerStarted","Data":"569a7c1478276916a91b0078efaea4a87ffb57975a74ee34a72b2eea824d3cac"} Apr 22 17:34:45.560188 ip-10-0-130-38 kubenswrapper[2565]: I0422 
17:34:45.560161 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-r7w4r" event={"ID":"963159e2-ff9f-47ac-9907-a22d03154537","Type":"ContainerStarted","Data":"41a4a346399fa22241fec8eb9552577852b66b2d188359800fab284035bb634c"} Apr 22 17:34:46.025078 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:46.025039 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b338a48-9f81-4b29-a9ed-ba400bd2e93d-metrics-certs\") pod \"network-metrics-daemon-wqnlm\" (UID: \"9b338a48-9f81-4b29-a9ed-ba400bd2e93d\") " pod="openshift-multus/network-metrics-daemon-wqnlm" Apr 22 17:34:46.025265 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:46.025227 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:34:46.025323 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:46.025286 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b338a48-9f81-4b29-a9ed-ba400bd2e93d-metrics-certs podName:9b338a48-9f81-4b29-a9ed-ba400bd2e93d nodeName:}" failed. No retries permitted until 2026-04-22 17:34:48.025268617 +0000 UTC m=+25.162071984 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b338a48-9f81-4b29-a9ed-ba400bd2e93d-metrics-certs") pod "network-metrics-daemon-wqnlm" (UID: "9b338a48-9f81-4b29-a9ed-ba400bd2e93d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:34:46.125517 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:46.125478 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggcfv\" (UniqueName: \"kubernetes.io/projected/b99a8ad0-f04e-465e-8515-c98de7f3e43d-kube-api-access-ggcfv\") pod \"network-check-target-vs8nl\" (UID: \"b99a8ad0-f04e-465e-8515-c98de7f3e43d\") " pod="openshift-network-diagnostics/network-check-target-vs8nl" Apr 22 17:34:46.125685 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:46.125658 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:34:46.125736 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:46.125685 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:34:46.125736 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:46.125699 2565 projected.go:194] Error preparing data for projected volume kube-api-access-ggcfv for pod openshift-network-diagnostics/network-check-target-vs8nl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:46.125869 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:46.125761 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b99a8ad0-f04e-465e-8515-c98de7f3e43d-kube-api-access-ggcfv podName:b99a8ad0-f04e-465e-8515-c98de7f3e43d nodeName:}" failed. 
No retries permitted until 2026-04-22 17:34:48.125739953 +0000 UTC m=+25.262543306 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-ggcfv" (UniqueName: "kubernetes.io/projected/b99a8ad0-f04e-465e-8515-c98de7f3e43d-kube-api-access-ggcfv") pod "network-check-target-vs8nl" (UID: "b99a8ad0-f04e-465e-8515-c98de7f3e43d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:46.452872 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:46.452778 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vs8nl" Apr 22 17:34:46.453024 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:46.452940 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vs8nl" podUID="b99a8ad0-f04e-465e-8515-c98de7f3e43d" Apr 22 17:34:46.453501 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:46.453351 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqnlm" Apr 22 17:34:46.453501 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:46.453455 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wqnlm" podUID="9b338a48-9f81-4b29-a9ed-ba400bd2e93d" Apr 22 17:34:48.040550 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:48.040511 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b338a48-9f81-4b29-a9ed-ba400bd2e93d-metrics-certs\") pod \"network-metrics-daemon-wqnlm\" (UID: \"9b338a48-9f81-4b29-a9ed-ba400bd2e93d\") " pod="openshift-multus/network-metrics-daemon-wqnlm" Apr 22 17:34:48.041189 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:48.040663 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:34:48.041189 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:48.040736 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b338a48-9f81-4b29-a9ed-ba400bd2e93d-metrics-certs podName:9b338a48-9f81-4b29-a9ed-ba400bd2e93d nodeName:}" failed. No retries permitted until 2026-04-22 17:34:52.040716364 +0000 UTC m=+29.177519718 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b338a48-9f81-4b29-a9ed-ba400bd2e93d-metrics-certs") pod "network-metrics-daemon-wqnlm" (UID: "9b338a48-9f81-4b29-a9ed-ba400bd2e93d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:34:48.141314 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:48.141273 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggcfv\" (UniqueName: \"kubernetes.io/projected/b99a8ad0-f04e-465e-8515-c98de7f3e43d-kube-api-access-ggcfv\") pod \"network-check-target-vs8nl\" (UID: \"b99a8ad0-f04e-465e-8515-c98de7f3e43d\") " pod="openshift-network-diagnostics/network-check-target-vs8nl" Apr 22 17:34:48.141478 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:48.141444 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:34:48.141478 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:48.141464 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:34:48.141478 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:48.141475 2565 projected.go:194] Error preparing data for projected volume kube-api-access-ggcfv for pod openshift-network-diagnostics/network-check-target-vs8nl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:48.141646 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:48.141543 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b99a8ad0-f04e-465e-8515-c98de7f3e43d-kube-api-access-ggcfv podName:b99a8ad0-f04e-465e-8515-c98de7f3e43d nodeName:}" failed. 
No retries permitted until 2026-04-22 17:34:52.141515866 +0000 UTC m=+29.278319234 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-ggcfv" (UniqueName: "kubernetes.io/projected/b99a8ad0-f04e-465e-8515-c98de7f3e43d-kube-api-access-ggcfv") pod "network-check-target-vs8nl" (UID: "b99a8ad0-f04e-465e-8515-c98de7f3e43d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:34:48.452636 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:48.452549 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vs8nl" Apr 22 17:34:48.452819 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:48.452683 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vs8nl" podUID="b99a8ad0-f04e-465e-8515-c98de7f3e43d" Apr 22 17:34:48.453207 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:48.453186 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqnlm" Apr 22 17:34:48.453327 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:48.453306 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wqnlm" podUID="9b338a48-9f81-4b29-a9ed-ba400bd2e93d"
Apr 22 17:34:50.453138 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:50.453096 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqnlm"
Apr 22 17:34:50.453559 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:50.453251 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wqnlm" podUID="9b338a48-9f81-4b29-a9ed-ba400bd2e93d"
Apr 22 17:34:50.453631 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:50.453614 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vs8nl"
Apr 22 17:34:50.453759 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:50.453723 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vs8nl" podUID="b99a8ad0-f04e-465e-8515-c98de7f3e43d"
Apr 22 17:34:52.078483 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:52.077846 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b338a48-9f81-4b29-a9ed-ba400bd2e93d-metrics-certs\") pod \"network-metrics-daemon-wqnlm\" (UID: \"9b338a48-9f81-4b29-a9ed-ba400bd2e93d\") " pod="openshift-multus/network-metrics-daemon-wqnlm"
Apr 22 17:34:52.078483 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:52.078031 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:34:52.078483 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:52.078093 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b338a48-9f81-4b29-a9ed-ba400bd2e93d-metrics-certs podName:9b338a48-9f81-4b29-a9ed-ba400bd2e93d nodeName:}" failed. No retries permitted until 2026-04-22 17:35:00.07807503 +0000 UTC m=+37.214878387 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b338a48-9f81-4b29-a9ed-ba400bd2e93d-metrics-certs") pod "network-metrics-daemon-wqnlm" (UID: "9b338a48-9f81-4b29-a9ed-ba400bd2e93d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:34:52.179011 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:52.178965 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggcfv\" (UniqueName: \"kubernetes.io/projected/b99a8ad0-f04e-465e-8515-c98de7f3e43d-kube-api-access-ggcfv\") pod \"network-check-target-vs8nl\" (UID: \"b99a8ad0-f04e-465e-8515-c98de7f3e43d\") " pod="openshift-network-diagnostics/network-check-target-vs8nl"
Apr 22 17:34:52.179203 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:52.179182 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:34:52.179272 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:52.179208 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:34:52.179272 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:52.179221 2565 projected.go:194] Error preparing data for projected volume kube-api-access-ggcfv for pod openshift-network-diagnostics/network-check-target-vs8nl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:34:52.179375 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:52.179279 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b99a8ad0-f04e-465e-8515-c98de7f3e43d-kube-api-access-ggcfv podName:b99a8ad0-f04e-465e-8515-c98de7f3e43d nodeName:}" failed. No retries permitted until 2026-04-22 17:35:00.17926107 +0000 UTC m=+37.316064424 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-ggcfv" (UniqueName: "kubernetes.io/projected/b99a8ad0-f04e-465e-8515-c98de7f3e43d-kube-api-access-ggcfv") pod "network-check-target-vs8nl" (UID: "b99a8ad0-f04e-465e-8515-c98de7f3e43d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:34:52.452493 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:52.452415 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vs8nl"
Apr 22 17:34:52.452641 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:52.452522 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vs8nl" podUID="b99a8ad0-f04e-465e-8515-c98de7f3e43d"
Apr 22 17:34:52.452641 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:52.452592 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqnlm"
Apr 22 17:34:52.452742 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:52.452688 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wqnlm" podUID="9b338a48-9f81-4b29-a9ed-ba400bd2e93d"
Apr 22 17:34:54.453242 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:54.453205 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vs8nl"
Apr 22 17:34:54.453677 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:54.453211 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqnlm"
Apr 22 17:34:54.453677 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:54.453313 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vs8nl" podUID="b99a8ad0-f04e-465e-8515-c98de7f3e43d"
Apr 22 17:34:54.453677 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:54.453419 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wqnlm" podUID="9b338a48-9f81-4b29-a9ed-ba400bd2e93d"
Apr 22 17:34:55.453375 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:55.453341 2565 scope.go:117] "RemoveContainer" containerID="d02c296357a1b2e9c16211e24b0117aca2e18aa60e3565e478c01bbf53a0bb77"
Apr 22 17:34:55.453943 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:55.453539 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_openshift-machine-config-operator(3aad222c1b8d9495d15f07d93f8a83ad)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal" podUID="3aad222c1b8d9495d15f07d93f8a83ad"
Apr 22 17:34:56.452641 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:56.452602 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vs8nl"
Apr 22 17:34:56.452842 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:56.452616 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqnlm"
Apr 22 17:34:56.452842 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:56.452724 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vs8nl" podUID="b99a8ad0-f04e-465e-8515-c98de7f3e43d"
Apr 22 17:34:56.452947 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:56.452844 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wqnlm" podUID="9b338a48-9f81-4b29-a9ed-ba400bd2e93d"
Apr 22 17:34:57.236522 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:57.236443 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-ztfbs"]
Apr 22 17:34:57.240807 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:57.240781 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-ztfbs"
Apr 22 17:34:57.243352 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:57.243331 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 17:34:57.243704 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:57.243683 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 17:34:57.244759 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:57.244605 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 17:34:57.244759 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:57.244610 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-mbnkh\""
Apr 22 17:34:57.244759 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:57.244639 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 17:34:57.244759 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:57.244610 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 17:34:57.244759 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:57.244673 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 17:34:57.315055 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:57.315020 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c7f9e817-7d88-4594-954b-87973d788ce2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ztfbs\" (UID: \"c7f9e817-7d88-4594-954b-87973d788ce2\") " pod="openshift-monitoring/node-exporter-ztfbs"
Apr 22 17:34:57.315228 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:57.315062 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c7f9e817-7d88-4594-954b-87973d788ce2-node-exporter-textfile\") pod \"node-exporter-ztfbs\" (UID: \"c7f9e817-7d88-4594-954b-87973d788ce2\") " pod="openshift-monitoring/node-exporter-ztfbs"
Apr 22 17:34:57.315228 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:57.315086 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stjtl\" (UniqueName: \"kubernetes.io/projected/c7f9e817-7d88-4594-954b-87973d788ce2-kube-api-access-stjtl\") pod \"node-exporter-ztfbs\" (UID: \"c7f9e817-7d88-4594-954b-87973d788ce2\") " pod="openshift-monitoring/node-exporter-ztfbs"
Apr 22 17:34:57.315228 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:57.315151 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c7f9e817-7d88-4594-954b-87973d788ce2-node-exporter-tls\") pod \"node-exporter-ztfbs\" (UID: \"c7f9e817-7d88-4594-954b-87973d788ce2\") " pod="openshift-monitoring/node-exporter-ztfbs"
Apr 22 17:34:57.315403 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:57.315230 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c7f9e817-7d88-4594-954b-87973d788ce2-sys\") pod \"node-exporter-ztfbs\" (UID: \"c7f9e817-7d88-4594-954b-87973d788ce2\") " pod="openshift-monitoring/node-exporter-ztfbs"
Apr 22 17:34:57.315403 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:57.315272 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c7f9e817-7d88-4594-954b-87973d788ce2-root\") pod \"node-exporter-ztfbs\" (UID: \"c7f9e817-7d88-4594-954b-87973d788ce2\") " pod="openshift-monitoring/node-exporter-ztfbs"
Apr 22 17:34:57.315403 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:57.315300 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c7f9e817-7d88-4594-954b-87973d788ce2-node-exporter-wtmp\") pod \"node-exporter-ztfbs\" (UID: \"c7f9e817-7d88-4594-954b-87973d788ce2\") " pod="openshift-monitoring/node-exporter-ztfbs"
Apr 22 17:34:57.315403 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:57.315357 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c7f9e817-7d88-4594-954b-87973d788ce2-node-exporter-accelerators-collector-config\") pod \"node-exporter-ztfbs\" (UID: \"c7f9e817-7d88-4594-954b-87973d788ce2\") " pod="openshift-monitoring/node-exporter-ztfbs"
Apr 22 17:34:57.315403 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:57.315400 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7f9e817-7d88-4594-954b-87973d788ce2-metrics-client-ca\") pod \"node-exporter-ztfbs\" (UID: \"c7f9e817-7d88-4594-954b-87973d788ce2\") " pod="openshift-monitoring/node-exporter-ztfbs"
Apr 22 17:34:57.416212 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:57.416171 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7f9e817-7d88-4594-954b-87973d788ce2-metrics-client-ca\") pod \"node-exporter-ztfbs\" (UID: \"c7f9e817-7d88-4594-954b-87973d788ce2\") " pod="openshift-monitoring/node-exporter-ztfbs"
Apr 22 17:34:57.416410 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:57.416225 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c7f9e817-7d88-4594-954b-87973d788ce2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ztfbs\" (UID: \"c7f9e817-7d88-4594-954b-87973d788ce2\") " pod="openshift-monitoring/node-exporter-ztfbs"
Apr 22 17:34:57.416410 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:57.416373 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c7f9e817-7d88-4594-954b-87973d788ce2-node-exporter-textfile\") pod \"node-exporter-ztfbs\" (UID: \"c7f9e817-7d88-4594-954b-87973d788ce2\") " pod="openshift-monitoring/node-exporter-ztfbs"
Apr 22 17:34:57.416525 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:57.416415 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stjtl\" (UniqueName: \"kubernetes.io/projected/c7f9e817-7d88-4594-954b-87973d788ce2-kube-api-access-stjtl\") pod \"node-exporter-ztfbs\" (UID: \"c7f9e817-7d88-4594-954b-87973d788ce2\") " pod="openshift-monitoring/node-exporter-ztfbs"
Apr 22 17:34:57.416525 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:57.416454 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c7f9e817-7d88-4594-954b-87973d788ce2-node-exporter-tls\") pod \"node-exporter-ztfbs\" (UID: \"c7f9e817-7d88-4594-954b-87973d788ce2\") " pod="openshift-monitoring/node-exporter-ztfbs"
Apr 22 17:34:57.416791 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:57.416765 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c7f9e817-7d88-4594-954b-87973d788ce2-sys\") pod \"node-exporter-ztfbs\" (UID: \"c7f9e817-7d88-4594-954b-87973d788ce2\") " pod="openshift-monitoring/node-exporter-ztfbs"
Apr 22 17:34:57.416905 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:57.416838 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c7f9e817-7d88-4594-954b-87973d788ce2-sys\") pod \"node-exporter-ztfbs\" (UID: \"c7f9e817-7d88-4594-954b-87973d788ce2\") " pod="openshift-monitoring/node-exporter-ztfbs"
Apr 22 17:34:57.416905 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:57.416852 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c7f9e817-7d88-4594-954b-87973d788ce2-root\") pod \"node-exporter-ztfbs\" (UID: \"c7f9e817-7d88-4594-954b-87973d788ce2\") " pod="openshift-monitoring/node-exporter-ztfbs"
Apr 22 17:34:57.416905 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:57.416859 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7f9e817-7d88-4594-954b-87973d788ce2-metrics-client-ca\") pod \"node-exporter-ztfbs\" (UID: \"c7f9e817-7d88-4594-954b-87973d788ce2\") " pod="openshift-monitoring/node-exporter-ztfbs"
Apr 22 17:34:57.416905 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:57.416871 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c7f9e817-7d88-4594-954b-87973d788ce2-node-exporter-textfile\") pod \"node-exporter-ztfbs\" (UID: \"c7f9e817-7d88-4594-954b-87973d788ce2\") " pod="openshift-monitoring/node-exporter-ztfbs"
Apr 22 17:34:57.416905 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:57.416886 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c7f9e817-7d88-4594-954b-87973d788ce2-node-exporter-wtmp\") pod \"node-exporter-ztfbs\" (UID: \"c7f9e817-7d88-4594-954b-87973d788ce2\") " pod="openshift-monitoring/node-exporter-ztfbs"
Apr 22 17:34:57.417129 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:57.416914 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c7f9e817-7d88-4594-954b-87973d788ce2-root\") pod \"node-exporter-ztfbs\" (UID: \"c7f9e817-7d88-4594-954b-87973d788ce2\") " pod="openshift-monitoring/node-exporter-ztfbs"
Apr 22 17:34:57.417129 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:57.416944 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c7f9e817-7d88-4594-954b-87973d788ce2-node-exporter-accelerators-collector-config\") pod \"node-exporter-ztfbs\" (UID: \"c7f9e817-7d88-4594-954b-87973d788ce2\") " pod="openshift-monitoring/node-exporter-ztfbs"
Apr 22 17:34:57.417129 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:57.417002 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c7f9e817-7d88-4594-954b-87973d788ce2-node-exporter-wtmp\") pod \"node-exporter-ztfbs\" (UID: \"c7f9e817-7d88-4594-954b-87973d788ce2\") " pod="openshift-monitoring/node-exporter-ztfbs"
Apr 22 17:34:57.417977 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:57.417954 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c7f9e817-7d88-4594-954b-87973d788ce2-node-exporter-accelerators-collector-config\") pod \"node-exporter-ztfbs\" (UID: \"c7f9e817-7d88-4594-954b-87973d788ce2\") " pod="openshift-monitoring/node-exporter-ztfbs"
Apr 22 17:34:57.419356 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:57.419330 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c7f9e817-7d88-4594-954b-87973d788ce2-node-exporter-tls\") pod \"node-exporter-ztfbs\" (UID: \"c7f9e817-7d88-4594-954b-87973d788ce2\") " pod="openshift-monitoring/node-exporter-ztfbs"
Apr 22 17:34:57.419452 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:57.419391 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c7f9e817-7d88-4594-954b-87973d788ce2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ztfbs\" (UID: \"c7f9e817-7d88-4594-954b-87973d788ce2\") " pod="openshift-monitoring/node-exporter-ztfbs"
Apr 22 17:34:57.427651 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:57.427628 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-stjtl\" (UniqueName: \"kubernetes.io/projected/c7f9e817-7d88-4594-954b-87973d788ce2-kube-api-access-stjtl\") pod \"node-exporter-ztfbs\" (UID: \"c7f9e817-7d88-4594-954b-87973d788ce2\") " pod="openshift-monitoring/node-exporter-ztfbs"
Apr 22 17:34:57.551024 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:57.550948 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-ztfbs"
Apr 22 17:34:58.452871 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:58.452835 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vs8nl"
Apr 22 17:34:58.452871 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:34:58.452854 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqnlm"
Apr 22 17:34:58.453475 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:58.452959 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vs8nl" podUID="b99a8ad0-f04e-465e-8515-c98de7f3e43d"
Apr 22 17:34:58.453475 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:34:58.453104 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wqnlm" podUID="9b338a48-9f81-4b29-a9ed-ba400bd2e93d"
Apr 22 17:35:00.136947 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:00.136890 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b338a48-9f81-4b29-a9ed-ba400bd2e93d-metrics-certs\") pod \"network-metrics-daemon-wqnlm\" (UID: \"9b338a48-9f81-4b29-a9ed-ba400bd2e93d\") " pod="openshift-multus/network-metrics-daemon-wqnlm"
Apr 22 17:35:00.137376 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:35:00.137043 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:35:00.137376 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:35:00.137119 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b338a48-9f81-4b29-a9ed-ba400bd2e93d-metrics-certs podName:9b338a48-9f81-4b29-a9ed-ba400bd2e93d nodeName:}" failed. No retries permitted until 2026-04-22 17:35:16.137098478 +0000 UTC m=+53.273901838 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b338a48-9f81-4b29-a9ed-ba400bd2e93d-metrics-certs") pod "network-metrics-daemon-wqnlm" (UID: "9b338a48-9f81-4b29-a9ed-ba400bd2e93d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:35:00.238182 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:00.238149 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggcfv\" (UniqueName: \"kubernetes.io/projected/b99a8ad0-f04e-465e-8515-c98de7f3e43d-kube-api-access-ggcfv\") pod \"network-check-target-vs8nl\" (UID: \"b99a8ad0-f04e-465e-8515-c98de7f3e43d\") " pod="openshift-network-diagnostics/network-check-target-vs8nl"
Apr 22 17:35:00.238373 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:35:00.238301 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:35:00.238373 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:35:00.238323 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:35:00.238373 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:35:00.238335 2565 projected.go:194] Error preparing data for projected volume kube-api-access-ggcfv for pod openshift-network-diagnostics/network-check-target-vs8nl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:35:00.238531 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:35:00.238443 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b99a8ad0-f04e-465e-8515-c98de7f3e43d-kube-api-access-ggcfv podName:b99a8ad0-f04e-465e-8515-c98de7f3e43d nodeName:}" failed. No retries permitted until 2026-04-22 17:35:16.2384213 +0000 UTC m=+53.375224671 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-ggcfv" (UniqueName: "kubernetes.io/projected/b99a8ad0-f04e-465e-8515-c98de7f3e43d-kube-api-access-ggcfv") pod "network-check-target-vs8nl" (UID: "b99a8ad0-f04e-465e-8515-c98de7f3e43d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:35:00.452862 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:00.452765 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vs8nl"
Apr 22 17:35:00.452862 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:00.452787 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqnlm"
Apr 22 17:35:00.453091 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:35:00.452906 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vs8nl" podUID="b99a8ad0-f04e-465e-8515-c98de7f3e43d"
Apr 22 17:35:00.453091 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:35:00.453058 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wqnlm" podUID="9b338a48-9f81-4b29-a9ed-ba400bd2e93d"
Apr 22 17:35:01.226167 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:35:01.226138 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7f9e817_7d88_4594_954b_87973d788ce2.slice/crio-d8a46ae1766bdd4f3aadacee84bd1251ede0cf9cb3dabe9151ff80d85a601683 WatchSource:0}: Error finding container d8a46ae1766bdd4f3aadacee84bd1251ede0cf9cb3dabe9151ff80d85a601683: Status 404 returned error can't find the container with id d8a46ae1766bdd4f3aadacee84bd1251ede0cf9cb3dabe9151ff80d85a601683
Apr 22 17:35:01.589954 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:01.589736 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tb9hn" event={"ID":"8ce4426c-b706-426d-b90e-58d697c2b5b4","Type":"ContainerStarted","Data":"6afb47e0e98af82d351158e4af1803355b305b198e965da2950969a30825f0fb"}
Apr 22 17:35:01.591325 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:01.591296 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbvw6" event={"ID":"e451b16e-1695-40d9-83f0-1c9b48091c09","Type":"ContainerStarted","Data":"c2cf3c071043f3df58e35669d649ee48f40756e4ae0944f9ae07bdc3782dffc8"}
Apr 22 17:35:01.593392 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:01.593368 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" event={"ID":"0a131c7a-9e60-4af2-9909-c14202f6723e","Type":"ContainerStarted","Data":"84824f0c2261c7f02c45b1bfddeebc354cf88f94f07b19d8cfbcd9729faf230b"}
Apr 22 17:35:01.594785 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:01.594760 2565 generic.go:358] "Generic (PLEG): container finished" podID="7df0412a-fc05-4359-8ed3-f1f7cc48a8eb" containerID="82ccf2707a2fb447d91fc3128f58f8e0af9d1910f5f3385a7a6a07eeb66a1298" exitCode=0
Apr 22 17:35:01.595046 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:01.595011 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k2j7l" event={"ID":"7df0412a-fc05-4359-8ed3-f1f7cc48a8eb","Type":"ContainerDied","Data":"82ccf2707a2fb447d91fc3128f58f8e0af9d1910f5f3385a7a6a07eeb66a1298"}
Apr 22 17:35:01.596461 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:01.596437 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qkkrr" event={"ID":"bf130287-76e7-42eb-abee-d3c9dae69a49","Type":"ContainerStarted","Data":"f7fe09b52508ac5ffe48a6f5f41431cbad644c562a04ce15ab6f9a30f926533e"}
Apr 22 17:35:01.597608 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:01.597574 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ztfbs" event={"ID":"c7f9e817-7d88-4594-954b-87973d788ce2","Type":"ContainerStarted","Data":"d8a46ae1766bdd4f3aadacee84bd1251ede0cf9cb3dabe9151ff80d85a601683"}
Apr 22 17:35:01.599099 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:01.599039 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9s5s7" event={"ID":"78349f6c-83ce-40df-b593-af2886a0a4aa","Type":"ContainerStarted","Data":"8d612ec39427cccd192fa3a495aadae4f072f09493fe1cf38ead136e3eac8018"}
Apr 22 17:35:01.611016 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:01.610714 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-tb9hn" podStartSLOduration=1.875018049 podStartE2EDuration="18.610697419s" podCreationTimestamp="2026-04-22 17:34:43 +0000 UTC" firstStartedPulling="2026-04-22 17:34:44.635524761 +0000 UTC m=+21.772328118" lastFinishedPulling="2026-04-22 17:35:01.371204136 +0000 UTC m=+38.508007488" observedRunningTime="2026-04-22 17:35:01.609891991 +0000 UTC m=+38.746695366" watchObservedRunningTime="2026-04-22 17:35:01.610697419 +0000 UTC m=+38.747500793"
Apr 22 17:35:01.631870 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:01.631667 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-9s5s7" podStartSLOduration=2.04060075 podStartE2EDuration="18.631649434s" podCreationTimestamp="2026-04-22 17:34:43 +0000 UTC" firstStartedPulling="2026-04-22 17:34:44.644063579 +0000 UTC m=+21.780866932" lastFinishedPulling="2026-04-22 17:35:01.235112249 +0000 UTC m=+38.371915616" observedRunningTime="2026-04-22 17:35:01.631203728 +0000 UTC m=+38.768007105" watchObservedRunningTime="2026-04-22 17:35:01.631649434 +0000 UTC m=+38.768452813"
Apr 22 17:35:01.647136 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:01.647084 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-qkkrr" podStartSLOduration=2.144088103 podStartE2EDuration="18.647053547s" podCreationTimestamp="2026-04-22 17:34:43 +0000 UTC" firstStartedPulling="2026-04-22 17:34:44.649597729 +0000 UTC m=+21.786401082" lastFinishedPulling="2026-04-22 17:35:01.152563168 +0000 UTC m=+38.289366526" observedRunningTime="2026-04-22 17:35:01.64571215 +0000 UTC m=+38.782515532" watchObservedRunningTime="2026-04-22 17:35:01.647053547 +0000 UTC m=+38.783856922"
Apr 22 17:35:02.452920 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:02.452895 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqnlm"
Apr 22 17:35:02.453659 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:02.452901 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vs8nl"
Apr 22 17:35:02.453659 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:35:02.453136 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vs8nl" podUID="b99a8ad0-f04e-465e-8515-c98de7f3e43d"
Apr 22 17:35:02.453659 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:35:02.453020 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wqnlm" podUID="9b338a48-9f81-4b29-a9ed-ba400bd2e93d"
Apr 22 17:35:02.505718 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:02.505683 2565 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 22 17:35:02.603527 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:02.603202 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-mtlqw" event={"ID":"4c5994d9-36a8-4d98-a1d9-67f743ccd167","Type":"ContainerStarted","Data":"538df0db2ced1d198465eedfa67b83c155cf868bb635ed54d80da492341c4baf"}
Apr 22 17:35:02.606229 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:02.606198 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" event={"ID":"0a131c7a-9e60-4af2-9909-c14202f6723e","Type":"ContainerStarted","Data":"88c2b436eecb29b1a08693c5c12a81146a4c18b08f49da28b983fd03edb3f7de"}
Apr 22 17:35:02.606363 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:02.606235 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" event={"ID":"0a131c7a-9e60-4af2-9909-c14202f6723e","Type":"ContainerStarted","Data":"44a2057c1dd41be95571c855a7af2aa64fe35c44595391a589ab6f3cbf95e55d"}
Apr 22 17:35:02.606363 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:02.606250 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" event={"ID":"0a131c7a-9e60-4af2-9909-c14202f6723e","Type":"ContainerStarted","Data":"370a4a94f0045c08d0148b251ab05264a5c08a68fda3f7d7510d736d3e4ee0db"}
Apr 22 17:35:02.606363 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:02.606261 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" event={"ID":"0a131c7a-9e60-4af2-9909-c14202f6723e","Type":"ContainerStarted","Data":"acd206ebccfc7dc9016a17838d02e58f2b340e4c7e9ceacb5a4a24ae5669efc6"}
Apr 22 17:35:02.606363 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:02.606273 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" event={"ID":"0a131c7a-9e60-4af2-9909-c14202f6723e","Type":"ContainerStarted","Data":"ee06b122aba8f159c198e5172541871b9d32aa602d1cb1c7e6d6015ecfd351c7"}
Apr 22 17:35:02.607705 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:02.607680 2565 generic.go:358] "Generic (PLEG): container finished" podID="c7f9e817-7d88-4594-954b-87973d788ce2" containerID="ceaa465860ccc57d1b2e947b8d597745f28e9aabd24dc2cae2ed9c7698b44a86" exitCode=0
Apr 22 17:35:02.607837 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:02.607711 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ztfbs" event={"ID":"c7f9e817-7d88-4594-954b-87973d788ce2","Type":"ContainerDied","Data":"ceaa465860ccc57d1b2e947b8d597745f28e9aabd24dc2cae2ed9c7698b44a86"}
Apr 22 17:35:02.609437 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:02.609415 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kzckr" event={"ID":"fc09482b-e40a-4db6-b3ae-ee8c7655cc83","Type":"ContainerStarted","Data":"0a9f84a91eb16acad8dfb3d0a1fdbe2016a979044bb2873717a021abf6e813e5"}
Apr 22 17:35:02.611156 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:02.611118 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbvw6" event={"ID":"e451b16e-1695-40d9-83f0-1c9b48091c09","Type":"ContainerStarted","Data":"d788ec910ec72dcc38577c86cb8d5602349c209e7f441788065793c5aaf82291"} Apr 22 17:35:02.632941 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:02.632885 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-kzckr" podStartSLOduration=3.047118863 podStartE2EDuration="19.632868589s" podCreationTimestamp="2026-04-22 17:34:43 +0000 UTC" firstStartedPulling="2026-04-22 17:34:44.641030052 +0000 UTC m=+21.777833416" lastFinishedPulling="2026-04-22 17:35:01.226779774 +0000 UTC m=+38.363583142" observedRunningTime="2026-04-22 17:35:02.632830423 +0000 UTC m=+39.769633799" watchObservedRunningTime="2026-04-22 17:35:02.632868589 +0000 UTC m=+39.769671964" Apr 22 17:35:02.633471 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:02.633443 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-mtlqw" podStartSLOduration=3.03772534 podStartE2EDuration="19.633434234s" podCreationTimestamp="2026-04-22 17:34:43 +0000 UTC" firstStartedPulling="2026-04-22 17:34:44.624932622 +0000 UTC m=+21.761735989" lastFinishedPulling="2026-04-22 17:35:01.22064153 +0000 UTC m=+38.357444883" observedRunningTime="2026-04-22 17:35:02.618584841 +0000 UTC m=+39.755388218" watchObservedRunningTime="2026-04-22 17:35:02.633434234 +0000 UTC m=+39.770237608" Apr 22 17:35:03.380660 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:03.380548 2565 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T17:35:02.505702892Z","UUID":"44fb429f-9f84-4fc6-86b1-d3cc0869a308","Handler":null,"Name":"","Endpoint":""} Apr 22 17:35:03.384225 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:03.384201 2565 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI 
Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 17:35:03.384365 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:03.384232 2565 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 17:35:03.614242 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:03.614202 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-r7w4r" event={"ID":"963159e2-ff9f-47ac-9907-a22d03154537","Type":"ContainerStarted","Data":"f588603b376203fadd3c611b4674fb6dfdd90d7a7b24d9c320a4d04178b82553"} Apr 22 17:35:03.616558 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:03.616532 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ztfbs" event={"ID":"c7f9e817-7d88-4594-954b-87973d788ce2","Type":"ContainerStarted","Data":"03360f2b5e38823023e7be5217c9001ad97f90b60f4a686b763b2f4376bf3b47"} Apr 22 17:35:03.616687 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:03.616564 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ztfbs" event={"ID":"c7f9e817-7d88-4594-954b-87973d788ce2","Type":"ContainerStarted","Data":"18ac5aa63cec55cd5c74d8e61eb516779831e95f17d410586e24e2c81e8536a5"} Apr 22 17:35:03.618743 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:03.618686 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbvw6" event={"ID":"e451b16e-1695-40d9-83f0-1c9b48091c09","Type":"ContainerStarted","Data":"61d1088e2c486adc5fdec0fef7016228d5d94faf1aeba6419a10f7f3024261be"} Apr 22 17:35:03.628784 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:03.628736 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-r7w4r" podStartSLOduration=4.023741116 podStartE2EDuration="20.628718894s" 
podCreationTimestamp="2026-04-22 17:34:43 +0000 UTC" firstStartedPulling="2026-04-22 17:34:44.615657439 +0000 UTC m=+21.752460792" lastFinishedPulling="2026-04-22 17:35:01.220635203 +0000 UTC m=+38.357438570" observedRunningTime="2026-04-22 17:35:03.628172443 +0000 UTC m=+40.764975809" watchObservedRunningTime="2026-04-22 17:35:03.628718894 +0000 UTC m=+40.765522271" Apr 22 17:35:03.645021 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:03.644974 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-ztfbs" podStartSLOduration=5.551315367 podStartE2EDuration="6.644961544s" podCreationTimestamp="2026-04-22 17:34:57 +0000 UTC" firstStartedPulling="2026-04-22 17:35:01.227982066 +0000 UTC m=+38.364785425" lastFinishedPulling="2026-04-22 17:35:02.321628231 +0000 UTC m=+39.458431602" observedRunningTime="2026-04-22 17:35:03.644779845 +0000 UTC m=+40.781583222" watchObservedRunningTime="2026-04-22 17:35:03.644961544 +0000 UTC m=+40.781764924" Apr 22 17:35:03.662056 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:03.662008 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rbvw6" podStartSLOduration=1.8909756450000001 podStartE2EDuration="20.6619971s" podCreationTimestamp="2026-04-22 17:34:43 +0000 UTC" firstStartedPulling="2026-04-22 17:34:44.632763551 +0000 UTC m=+21.769566904" lastFinishedPulling="2026-04-22 17:35:03.403784992 +0000 UTC m=+40.540588359" observedRunningTime="2026-04-22 17:35:03.661985943 +0000 UTC m=+40.798789317" watchObservedRunningTime="2026-04-22 17:35:03.6619971 +0000 UTC m=+40.798800478" Apr 22 17:35:04.452495 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:04.452447 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vs8nl" Apr 22 17:35:04.452659 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:04.452448 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqnlm" Apr 22 17:35:04.452659 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:35:04.452577 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vs8nl" podUID="b99a8ad0-f04e-465e-8515-c98de7f3e43d" Apr 22 17:35:04.452659 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:35:04.452651 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wqnlm" podUID="9b338a48-9f81-4b29-a9ed-ba400bd2e93d" Apr 22 17:35:04.615866 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:04.615826 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-mtlqw" Apr 22 17:35:04.616551 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:04.616532 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-mtlqw" Apr 22 17:35:04.624279 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:04.624242 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" event={"ID":"0a131c7a-9e60-4af2-9909-c14202f6723e","Type":"ContainerStarted","Data":"ab719cab05579a838d85eb47cb3f2d7569a9e55b7e847e6d3793ed8db64f67f9"} Apr 22 17:35:05.626017 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:05.625932 2565 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 17:35:06.452283 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:06.452250 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vs8nl" Apr 22 17:35:06.452283 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:06.452287 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqnlm" Apr 22 17:35:06.452489 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:35:06.452371 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vs8nl" podUID="b99a8ad0-f04e-465e-8515-c98de7f3e43d" Apr 22 17:35:06.452529 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:35:06.452491 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wqnlm" podUID="9b338a48-9f81-4b29-a9ed-ba400bd2e93d" Apr 22 17:35:06.633318 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:06.631051 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" event={"ID":"0a131c7a-9e60-4af2-9909-c14202f6723e","Type":"ContainerStarted","Data":"0cfa39f6c2bada5846f9acbbb4f14038f0a7abca4cb796dfb86eef648f2a90c1"} Apr 22 17:35:06.633318 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:06.632893 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" Apr 22 17:35:06.633318 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:06.632930 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" Apr 22 17:35:06.633318 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:06.632952 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" Apr 22 17:35:06.635428 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:06.635399 2565 generic.go:358] "Generic (PLEG): container finished" podID="7df0412a-fc05-4359-8ed3-f1f7cc48a8eb" containerID="c6de4cea8c6e676d1acf244d86da99e98f733b89db26d3eefa7f010f8ca71938" exitCode=0 Apr 22 17:35:06.635548 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:06.635455 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-k2j7l" event={"ID":"7df0412a-fc05-4359-8ed3-f1f7cc48a8eb","Type":"ContainerDied","Data":"c6de4cea8c6e676d1acf244d86da99e98f733b89db26d3eefa7f010f8ca71938"} Apr 22 17:35:06.648437 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:06.648410 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" Apr 22 17:35:06.648558 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:06.648482 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" Apr 22 17:35:06.662081 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:06.662028 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-t45c7" podStartSLOduration=6.840736651 podStartE2EDuration="23.662014681s" podCreationTimestamp="2026-04-22 17:34:43 +0000 UTC" firstStartedPulling="2026-04-22 17:34:44.611393354 +0000 UTC m=+21.748196707" lastFinishedPulling="2026-04-22 17:35:01.432671367 +0000 UTC m=+38.569474737" observedRunningTime="2026-04-22 17:35:06.661999133 +0000 UTC m=+43.798802508" watchObservedRunningTime="2026-04-22 17:35:06.662014681 +0000 UTC m=+43.798818038" Apr 22 17:35:07.830878 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:07.830787 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-mtlqw" Apr 22 17:35:07.831421 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:07.830944 2565 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 17:35:07.831754 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:07.831728 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-mtlqw" Apr 22 17:35:08.027640 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:08.027564 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-network-diagnostics/network-check-target-vs8nl"] Apr 22 17:35:08.027774 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:08.027694 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vs8nl" Apr 22 17:35:08.027840 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:35:08.027781 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vs8nl" podUID="b99a8ad0-f04e-465e-8515-c98de7f3e43d" Apr 22 17:35:08.029615 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:08.029589 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wqnlm"] Apr 22 17:35:08.029736 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:08.029712 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqnlm" Apr 22 17:35:08.029856 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:35:08.029838 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wqnlm" podUID="9b338a48-9f81-4b29-a9ed-ba400bd2e93d" Apr 22 17:35:08.452620 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:08.452575 2565 scope.go:117] "RemoveContainer" containerID="d02c296357a1b2e9c16211e24b0117aca2e18aa60e3565e478c01bbf53a0bb77" Apr 22 17:35:08.640244 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:08.640216 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log" Apr 22 17:35:08.640606 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:08.640581 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal" event={"ID":"3aad222c1b8d9495d15f07d93f8a83ad","Type":"ContainerStarted","Data":"563dfb61ef34d8d9780a9400168e542897a18d61cef534a1e3d43f1f45d7044d"} Apr 22 17:35:08.642378 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:08.642355 2565 generic.go:358] "Generic (PLEG): container finished" podID="7df0412a-fc05-4359-8ed3-f1f7cc48a8eb" containerID="5b85253461ef7547b893a3e17caf9f5b8fa4caca9981e6c760b804fff46501b4" exitCode=0 Apr 22 17:35:08.642485 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:08.642443 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k2j7l" event={"ID":"7df0412a-fc05-4359-8ed3-f1f7cc48a8eb","Type":"ContainerDied","Data":"5b85253461ef7547b893a3e17caf9f5b8fa4caca9981e6c760b804fff46501b4"} Apr 22 17:35:08.654975 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:08.654936 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal" podStartSLOduration=24.654924312 podStartE2EDuration="24.654924312s" podCreationTimestamp="2026-04-22 17:34:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:35:08.654699018 +0000 UTC m=+45.791502394" watchObservedRunningTime="2026-04-22 17:35:08.654924312 +0000 UTC m=+45.791727687" Apr 22 17:35:09.452958 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:09.452923 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqnlm" Apr 22 17:35:09.453473 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:09.452971 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vs8nl" Apr 22 17:35:09.453473 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:35:09.453040 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vs8nl" podUID="b99a8ad0-f04e-465e-8515-c98de7f3e43d" Apr 22 17:35:09.453473 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:35:09.453201 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wqnlm" podUID="9b338a48-9f81-4b29-a9ed-ba400bd2e93d" Apr 22 17:35:10.648727 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:10.648696 2565 generic.go:358] "Generic (PLEG): container finished" podID="7df0412a-fc05-4359-8ed3-f1f7cc48a8eb" containerID="4621513b9af76239bb9f4272d1ecb4995090fc8b4bfc53b9fde232abffd66700" exitCode=0 Apr 22 17:35:10.649144 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:10.648738 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k2j7l" event={"ID":"7df0412a-fc05-4359-8ed3-f1f7cc48a8eb","Type":"ContainerDied","Data":"4621513b9af76239bb9f4272d1ecb4995090fc8b4bfc53b9fde232abffd66700"} Apr 22 17:35:11.452597 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:11.452560 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vs8nl" Apr 22 17:35:11.452786 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:11.452560 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqnlm" Apr 22 17:35:11.452786 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:35:11.452686 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vs8nl" podUID="b99a8ad0-f04e-465e-8515-c98de7f3e43d" Apr 22 17:35:11.452786 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:35:11.452764 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wqnlm" podUID="9b338a48-9f81-4b29-a9ed-ba400bd2e93d" Apr 22 17:35:13.453849 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:13.453791 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqnlm" Apr 22 17:35:13.454347 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:13.453867 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vs8nl" Apr 22 17:35:13.454347 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:35:13.453922 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wqnlm" podUID="9b338a48-9f81-4b29-a9ed-ba400bd2e93d" Apr 22 17:35:13.454347 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:35:13.453980 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vs8nl" podUID="b99a8ad0-f04e-465e-8515-c98de7f3e43d" Apr 22 17:35:14.193901 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.193670 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-38.ec2.internal" event="NodeReady" Apr 22 17:35:14.194082 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.194053 2565 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 17:35:14.259477 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.258971 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-vngqt"] Apr 22 17:35:14.277343 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.277313 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-p466n"] Apr 22 17:35:14.277516 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.277459 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vngqt" Apr 22 17:35:14.280957 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.280930 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 17:35:14.281644 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.281460 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-k4t5h\"" Apr 22 17:35:14.281756 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.281693 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 17:35:14.283018 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.282995 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 17:35:14.292749 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.292726 2565 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vngqt"] Apr 22 17:35:14.292872 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.292754 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-p466n"] Apr 22 17:35:14.292934 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.292886 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p466n" Apr 22 17:35:14.296021 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.296002 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 17:35:14.296322 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.296305 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 17:35:14.296535 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.296520 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ghfnt\"" Apr 22 17:35:14.347405 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.347369 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpb2s\" (UniqueName: \"kubernetes.io/projected/f25f12d8-df40-42af-a646-0c95f9e24e70-kube-api-access-bpb2s\") pod \"ingress-canary-vngqt\" (UID: \"f25f12d8-df40-42af-a646-0c95f9e24e70\") " pod="openshift-ingress-canary/ingress-canary-vngqt" Apr 22 17:35:14.347405 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.347416 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f25f12d8-df40-42af-a646-0c95f9e24e70-cert\") pod \"ingress-canary-vngqt\" (UID: \"f25f12d8-df40-42af-a646-0c95f9e24e70\") " pod="openshift-ingress-canary/ingress-canary-vngqt" Apr 22 17:35:14.382816 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.382761 
2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-gxzmr"]
Apr 22 17:35:14.412585 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.412552 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-gxzmr"]
Apr 22 17:35:14.412750 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.412689 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gxzmr"
Apr 22 17:35:14.422128 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.422101 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 22 17:35:14.422261 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.422157 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 22 17:35:14.422261 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.422164 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 22 17:35:14.422261 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.422102 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-mtj7t\""
Apr 22 17:35:14.422261 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.422101 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 22 17:35:14.448608 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.448504 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f25f12d8-df40-42af-a646-0c95f9e24e70-cert\") pod \"ingress-canary-vngqt\" (UID: \"f25f12d8-df40-42af-a646-0c95f9e24e70\") " pod="openshift-ingress-canary/ingress-canary-vngqt"
Apr 22 17:35:14.448608 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.448557 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbpxb\" (UniqueName: \"kubernetes.io/projected/522cdf45-6f50-4e49-9db4-3eef0a5bd8cc-kube-api-access-jbpxb\") pod \"dns-default-p466n\" (UID: \"522cdf45-6f50-4e49-9db4-3eef0a5bd8cc\") " pod="openshift-dns/dns-default-p466n"
Apr 22 17:35:14.448870 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.448672 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/522cdf45-6f50-4e49-9db4-3eef0a5bd8cc-config-volume\") pod \"dns-default-p466n\" (UID: \"522cdf45-6f50-4e49-9db4-3eef0a5bd8cc\") " pod="openshift-dns/dns-default-p466n"
Apr 22 17:35:14.448870 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.448698 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/522cdf45-6f50-4e49-9db4-3eef0a5bd8cc-metrics-tls\") pod \"dns-default-p466n\" (UID: \"522cdf45-6f50-4e49-9db4-3eef0a5bd8cc\") " pod="openshift-dns/dns-default-p466n"
Apr 22 17:35:14.448870 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.448791 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/522cdf45-6f50-4e49-9db4-3eef0a5bd8cc-tmp-dir\") pod \"dns-default-p466n\" (UID: \"522cdf45-6f50-4e49-9db4-3eef0a5bd8cc\") " pod="openshift-dns/dns-default-p466n"
Apr 22 17:35:14.448870 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.448851 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bpb2s\" (UniqueName: \"kubernetes.io/projected/f25f12d8-df40-42af-a646-0c95f9e24e70-kube-api-access-bpb2s\") pod \"ingress-canary-vngqt\" (UID: \"f25f12d8-df40-42af-a646-0c95f9e24e70\") " pod="openshift-ingress-canary/ingress-canary-vngqt"
Apr 22 17:35:14.454267 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.454235 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f25f12d8-df40-42af-a646-0c95f9e24e70-cert\") pod \"ingress-canary-vngqt\" (UID: \"f25f12d8-df40-42af-a646-0c95f9e24e70\") " pod="openshift-ingress-canary/ingress-canary-vngqt"
Apr 22 17:35:14.458244 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.458012 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpb2s\" (UniqueName: \"kubernetes.io/projected/f25f12d8-df40-42af-a646-0c95f9e24e70-kube-api-access-bpb2s\") pod \"ingress-canary-vngqt\" (UID: \"f25f12d8-df40-42af-a646-0c95f9e24e70\") " pod="openshift-ingress-canary/ingress-canary-vngqt"
Apr 22 17:35:14.549705 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.549666 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/facfd808-28c1-41e8-9141-4dbe2c1b8c77-data-volume\") pod \"insights-runtime-extractor-gxzmr\" (UID: \"facfd808-28c1-41e8-9141-4dbe2c1b8c77\") " pod="openshift-insights/insights-runtime-extractor-gxzmr"
Apr 22 17:35:14.549911 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.549725 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/522cdf45-6f50-4e49-9db4-3eef0a5bd8cc-tmp-dir\") pod \"dns-default-p466n\" (UID: \"522cdf45-6f50-4e49-9db4-3eef0a5bd8cc\") " pod="openshift-dns/dns-default-p466n"
Apr 22 17:35:14.549911 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.549770 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jbpxb\" (UniqueName: \"kubernetes.io/projected/522cdf45-6f50-4e49-9db4-3eef0a5bd8cc-kube-api-access-jbpxb\") pod \"dns-default-p466n\" (UID: \"522cdf45-6f50-4e49-9db4-3eef0a5bd8cc\") " pod="openshift-dns/dns-default-p466n"
Apr 22 17:35:14.549911 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.549791 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46r4n\" (UniqueName: \"kubernetes.io/projected/facfd808-28c1-41e8-9141-4dbe2c1b8c77-kube-api-access-46r4n\") pod \"insights-runtime-extractor-gxzmr\" (UID: \"facfd808-28c1-41e8-9141-4dbe2c1b8c77\") " pod="openshift-insights/insights-runtime-extractor-gxzmr"
Apr 22 17:35:14.549911 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.549848 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/522cdf45-6f50-4e49-9db4-3eef0a5bd8cc-config-volume\") pod \"dns-default-p466n\" (UID: \"522cdf45-6f50-4e49-9db4-3eef0a5bd8cc\") " pod="openshift-dns/dns-default-p466n"
Apr 22 17:35:14.549911 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.549872 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/522cdf45-6f50-4e49-9db4-3eef0a5bd8cc-metrics-tls\") pod \"dns-default-p466n\" (UID: \"522cdf45-6f50-4e49-9db4-3eef0a5bd8cc\") " pod="openshift-dns/dns-default-p466n"
Apr 22 17:35:14.549911 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.549898 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/facfd808-28c1-41e8-9141-4dbe2c1b8c77-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gxzmr\" (UID: \"facfd808-28c1-41e8-9141-4dbe2c1b8c77\") " pod="openshift-insights/insights-runtime-extractor-gxzmr"
Apr 22 17:35:14.550225 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.549930 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/facfd808-28c1-41e8-9141-4dbe2c1b8c77-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gxzmr\" (UID: \"facfd808-28c1-41e8-9141-4dbe2c1b8c77\") " pod="openshift-insights/insights-runtime-extractor-gxzmr"
Apr 22 17:35:14.550225 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.550013 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/facfd808-28c1-41e8-9141-4dbe2c1b8c77-crio-socket\") pod \"insights-runtime-extractor-gxzmr\" (UID: \"facfd808-28c1-41e8-9141-4dbe2c1b8c77\") " pod="openshift-insights/insights-runtime-extractor-gxzmr"
Apr 22 17:35:14.550225 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.550081 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/522cdf45-6f50-4e49-9db4-3eef0a5bd8cc-tmp-dir\") pod \"dns-default-p466n\" (UID: \"522cdf45-6f50-4e49-9db4-3eef0a5bd8cc\") " pod="openshift-dns/dns-default-p466n"
Apr 22 17:35:14.550387 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.550370 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/522cdf45-6f50-4e49-9db4-3eef0a5bd8cc-config-volume\") pod \"dns-default-p466n\" (UID: \"522cdf45-6f50-4e49-9db4-3eef0a5bd8cc\") " pod="openshift-dns/dns-default-p466n"
Apr 22 17:35:14.552393 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.552371 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/522cdf45-6f50-4e49-9db4-3eef0a5bd8cc-metrics-tls\") pod \"dns-default-p466n\" (UID: \"522cdf45-6f50-4e49-9db4-3eef0a5bd8cc\") " pod="openshift-dns/dns-default-p466n"
Apr 22 17:35:14.563678 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.563654 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbpxb\" (UniqueName: \"kubernetes.io/projected/522cdf45-6f50-4e49-9db4-3eef0a5bd8cc-kube-api-access-jbpxb\") pod \"dns-default-p466n\" (UID: \"522cdf45-6f50-4e49-9db4-3eef0a5bd8cc\") " pod="openshift-dns/dns-default-p466n"
Apr 22 17:35:14.590634 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.590604 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vngqt"
Apr 22 17:35:14.602331 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.602308 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p466n"
Apr 22 17:35:14.651156 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.651123 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/facfd808-28c1-41e8-9141-4dbe2c1b8c77-data-volume\") pod \"insights-runtime-extractor-gxzmr\" (UID: \"facfd808-28c1-41e8-9141-4dbe2c1b8c77\") " pod="openshift-insights/insights-runtime-extractor-gxzmr"
Apr 22 17:35:14.651337 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.651173 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-46r4n\" (UniqueName: \"kubernetes.io/projected/facfd808-28c1-41e8-9141-4dbe2c1b8c77-kube-api-access-46r4n\") pod \"insights-runtime-extractor-gxzmr\" (UID: \"facfd808-28c1-41e8-9141-4dbe2c1b8c77\") " pod="openshift-insights/insights-runtime-extractor-gxzmr"
Apr 22 17:35:14.651337 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.651214 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/facfd808-28c1-41e8-9141-4dbe2c1b8c77-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gxzmr\" (UID: \"facfd808-28c1-41e8-9141-4dbe2c1b8c77\") " pod="openshift-insights/insights-runtime-extractor-gxzmr"
Apr 22 17:35:14.651337 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.651243 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/facfd808-28c1-41e8-9141-4dbe2c1b8c77-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gxzmr\" (UID: \"facfd808-28c1-41e8-9141-4dbe2c1b8c77\") " pod="openshift-insights/insights-runtime-extractor-gxzmr"
Apr 22 17:35:14.651337 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.651289 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/facfd808-28c1-41e8-9141-4dbe2c1b8c77-crio-socket\") pod \"insights-runtime-extractor-gxzmr\" (UID: \"facfd808-28c1-41e8-9141-4dbe2c1b8c77\") " pod="openshift-insights/insights-runtime-extractor-gxzmr"
Apr 22 17:35:14.651509 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.651490 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/facfd808-28c1-41e8-9141-4dbe2c1b8c77-data-volume\") pod \"insights-runtime-extractor-gxzmr\" (UID: \"facfd808-28c1-41e8-9141-4dbe2c1b8c77\") " pod="openshift-insights/insights-runtime-extractor-gxzmr"
Apr 22 17:35:14.651564 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.651547 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/facfd808-28c1-41e8-9141-4dbe2c1b8c77-crio-socket\") pod \"insights-runtime-extractor-gxzmr\" (UID: \"facfd808-28c1-41e8-9141-4dbe2c1b8c77\") " pod="openshift-insights/insights-runtime-extractor-gxzmr"
Apr 22 17:35:14.651825 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.651789 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/facfd808-28c1-41e8-9141-4dbe2c1b8c77-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gxzmr\" (UID: \"facfd808-28c1-41e8-9141-4dbe2c1b8c77\") " pod="openshift-insights/insights-runtime-extractor-gxzmr"
Apr 22 17:35:14.654092 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.654053 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/facfd808-28c1-41e8-9141-4dbe2c1b8c77-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gxzmr\" (UID: \"facfd808-28c1-41e8-9141-4dbe2c1b8c77\") " pod="openshift-insights/insights-runtime-extractor-gxzmr"
Apr 22 17:35:14.669576 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.669539 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-46r4n\" (UniqueName: \"kubernetes.io/projected/facfd808-28c1-41e8-9141-4dbe2c1b8c77-kube-api-access-46r4n\") pod \"insights-runtime-extractor-gxzmr\" (UID: \"facfd808-28c1-41e8-9141-4dbe2c1b8c77\") " pod="openshift-insights/insights-runtime-extractor-gxzmr"
Apr 22 17:35:14.722653 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:14.722578 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gxzmr"
Apr 22 17:35:15.452306 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:15.452273 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vs8nl"
Apr 22 17:35:15.452489 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:15.452473 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqnlm"
Apr 22 17:35:15.455706 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:15.455686 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-2ldpn\""
Apr 22 17:35:15.455706 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:15.455695 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 17:35:15.455706 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:15.455702 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jqt69\""
Apr 22 17:35:15.456201 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:15.455691 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 17:35:15.456201 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:15.455936 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 17:35:16.162120 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:16.162086 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b338a48-9f81-4b29-a9ed-ba400bd2e93d-metrics-certs\") pod \"network-metrics-daemon-wqnlm\" (UID: \"9b338a48-9f81-4b29-a9ed-ba400bd2e93d\") " pod="openshift-multus/network-metrics-daemon-wqnlm"
Apr 22 17:35:16.164243 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:16.164223 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b338a48-9f81-4b29-a9ed-ba400bd2e93d-metrics-certs\") pod \"network-metrics-daemon-wqnlm\" (UID: \"9b338a48-9f81-4b29-a9ed-ba400bd2e93d\") " pod="openshift-multus/network-metrics-daemon-wqnlm"
Apr 22 17:35:16.263983 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:16.263319 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggcfv\" (UniqueName: \"kubernetes.io/projected/b99a8ad0-f04e-465e-8515-c98de7f3e43d-kube-api-access-ggcfv\") pod \"network-check-target-vs8nl\" (UID: \"b99a8ad0-f04e-465e-8515-c98de7f3e43d\") " pod="openshift-network-diagnostics/network-check-target-vs8nl"
Apr 22 17:35:16.269403 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:16.269345 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggcfv\" (UniqueName: \"kubernetes.io/projected/b99a8ad0-f04e-465e-8515-c98de7f3e43d-kube-api-access-ggcfv\") pod \"network-check-target-vs8nl\" (UID: \"b99a8ad0-f04e-465e-8515-c98de7f3e43d\") " pod="openshift-network-diagnostics/network-check-target-vs8nl"
Apr 22 17:35:16.338261 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:16.338196 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-gxzmr"]
Apr 22 17:35:16.339125 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:16.339052 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vngqt"]
Apr 22 17:35:16.342418 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:16.342397 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-p466n"]
Apr 22 17:35:16.368303 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:16.368282 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vs8nl"
Apr 22 17:35:16.374295 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:16.374275 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wqnlm"
Apr 22 17:35:16.409104 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:35:16.409078 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfacfd808_28c1_41e8_9141_4dbe2c1b8c77.slice/crio-deb3a78722a52edc32ab5fe7f995208c89d929873ea7e213438c994f2f719e62 WatchSource:0}: Error finding container deb3a78722a52edc32ab5fe7f995208c89d929873ea7e213438c994f2f719e62: Status 404 returned error can't find the container with id deb3a78722a52edc32ab5fe7f995208c89d929873ea7e213438c994f2f719e62
Apr 22 17:35:16.409887 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:35:16.409856 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf25f12d8_df40_42af_a646_0c95f9e24e70.slice/crio-0a39dd36266ce1b32add641bcf09c2a74ad920d45e2acae62d79d8ac7b8faa06 WatchSource:0}: Error finding container 0a39dd36266ce1b32add641bcf09c2a74ad920d45e2acae62d79d8ac7b8faa06: Status 404 returned error can't find the container with id 0a39dd36266ce1b32add641bcf09c2a74ad920d45e2acae62d79d8ac7b8faa06
Apr 22 17:35:16.410437 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:35:16.410414 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod522cdf45_6f50_4e49_9db4_3eef0a5bd8cc.slice/crio-3ae6cb9e7812e6148d0f919325c0c59a74455af33ff9e0c734c70d2640eac7a1 WatchSource:0}: Error finding container 3ae6cb9e7812e6148d0f919325c0c59a74455af33ff9e0c734c70d2640eac7a1: Status 404 returned error can't find the container with id 3ae6cb9e7812e6148d0f919325c0c59a74455af33ff9e0c734c70d2640eac7a1
Apr 22 17:35:16.600435 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:16.600404 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-vs8nl"]
Apr 22 17:35:16.609031 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:35:16.609005 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb99a8ad0_f04e_465e_8515_c98de7f3e43d.slice/crio-cab9ba26899892cb522652c5cd62416d9c93a380a99de8bf0241e4e1cdcf4255 WatchSource:0}: Error finding container cab9ba26899892cb522652c5cd62416d9c93a380a99de8bf0241e4e1cdcf4255: Status 404 returned error can't find the container with id cab9ba26899892cb522652c5cd62416d9c93a380a99de8bf0241e4e1cdcf4255
Apr 22 17:35:16.615773 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:16.615727 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wqnlm"]
Apr 22 17:35:16.619460 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:35:16.619434 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b338a48_9f81_4b29_a9ed_ba400bd2e93d.slice/crio-41354e252deca14e72c1792b7c8ac1d21fc0201733a3c12b98a2553cf0f5628c WatchSource:0}: Error finding container 41354e252deca14e72c1792b7c8ac1d21fc0201733a3c12b98a2553cf0f5628c: Status 404 returned error can't find the container with id 41354e252deca14e72c1792b7c8ac1d21fc0201733a3c12b98a2553cf0f5628c
Apr 22 17:35:16.661149 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:16.661112 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-vs8nl" event={"ID":"b99a8ad0-f04e-465e-8515-c98de7f3e43d","Type":"ContainerStarted","Data":"cab9ba26899892cb522652c5cd62416d9c93a380a99de8bf0241e4e1cdcf4255"}
Apr 22 17:35:16.662290 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:16.662260 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p466n" event={"ID":"522cdf45-6f50-4e49-9db4-3eef0a5bd8cc","Type":"ContainerStarted","Data":"3ae6cb9e7812e6148d0f919325c0c59a74455af33ff9e0c734c70d2640eac7a1"}
Apr 22 17:35:16.663478 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:16.663448 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wqnlm" event={"ID":"9b338a48-9f81-4b29-a9ed-ba400bd2e93d","Type":"ContainerStarted","Data":"41354e252deca14e72c1792b7c8ac1d21fc0201733a3c12b98a2553cf0f5628c"}
Apr 22 17:35:16.664939 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:16.664917 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gxzmr" event={"ID":"facfd808-28c1-41e8-9141-4dbe2c1b8c77","Type":"ContainerStarted","Data":"4f81113b56eeaf0433baadfee27010a672386f517857cb840e07e2df08fe451c"}
Apr 22 17:35:16.665037 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:16.664943 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gxzmr" event={"ID":"facfd808-28c1-41e8-9141-4dbe2c1b8c77","Type":"ContainerStarted","Data":"deb3a78722a52edc32ab5fe7f995208c89d929873ea7e213438c994f2f719e62"}
Apr 22 17:35:16.666014 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:16.665975 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vngqt" event={"ID":"f25f12d8-df40-42af-a646-0c95f9e24e70","Type":"ContainerStarted","Data":"0a39dd36266ce1b32add641bcf09c2a74ad920d45e2acae62d79d8ac7b8faa06"}
Apr 22 17:35:16.668577 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:16.668557 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k2j7l" event={"ID":"7df0412a-fc05-4359-8ed3-f1f7cc48a8eb","Type":"ContainerStarted","Data":"7b77c5d31829020d627a04154b0483e74857282600e8eb52c0067a45365af4f5"}
Apr 22 17:35:17.676958 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:17.676919 2565 generic.go:358] "Generic (PLEG): container finished" podID="7df0412a-fc05-4359-8ed3-f1f7cc48a8eb" containerID="7b77c5d31829020d627a04154b0483e74857282600e8eb52c0067a45365af4f5" exitCode=0
Apr 22 17:35:17.677401 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:17.676993 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k2j7l" event={"ID":"7df0412a-fc05-4359-8ed3-f1f7cc48a8eb","Type":"ContainerDied","Data":"7b77c5d31829020d627a04154b0483e74857282600e8eb52c0067a45365af4f5"}
Apr 22 17:35:18.682607 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:18.682568 2565 generic.go:358] "Generic (PLEG): container finished" podID="7df0412a-fc05-4359-8ed3-f1f7cc48a8eb" containerID="0beb663092e3a4fa71181962422401e5e7e27b641bff9c134312e95954320396" exitCode=0
Apr 22 17:35:18.683065 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:18.682608 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k2j7l" event={"ID":"7df0412a-fc05-4359-8ed3-f1f7cc48a8eb","Type":"ContainerDied","Data":"0beb663092e3a4fa71181962422401e5e7e27b641bff9c134312e95954320396"}
Apr 22 17:35:21.690074 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:21.690037 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-vs8nl" event={"ID":"b99a8ad0-f04e-465e-8515-c98de7f3e43d","Type":"ContainerStarted","Data":"dd322e26f3437add5a2b0b11d7f0a311c59ed78bf738b97d8fb239c0d1cde04a"}
Apr 22 17:35:21.690557 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:21.690160 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-vs8nl"
Apr 22 17:35:21.691569 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:21.691546 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p466n" event={"ID":"522cdf45-6f50-4e49-9db4-3eef0a5bd8cc","Type":"ContainerStarted","Data":"5396c99475103f38ab00f72e19b9dab38b8747e424b5127e7f5739fa7b81d57b"}
Apr 22 17:35:21.691683 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:21.691575 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p466n" event={"ID":"522cdf45-6f50-4e49-9db4-3eef0a5bd8cc","Type":"ContainerStarted","Data":"baaae5c83afae7c0d2c49603707cad1ca3e7345e68b98548075621c5c2b14ee8"}
Apr 22 17:35:21.691743 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:21.691685 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-p466n"
Apr 22 17:35:21.693094 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:21.693073 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wqnlm" event={"ID":"9b338a48-9f81-4b29-a9ed-ba400bd2e93d","Type":"ContainerStarted","Data":"b9a7d184093ccb3f0fcac776ca48e6c7a1dc073ebf3812c967619e325d93a291"}
Apr 22 17:35:21.693094 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:21.693098 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wqnlm" event={"ID":"9b338a48-9f81-4b29-a9ed-ba400bd2e93d","Type":"ContainerStarted","Data":"f4b1653b495845e598c5fdaeb3bb55d5939e157526d1b0622ef204c297374fb7"}
Apr 22 17:35:21.694578 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:21.694556 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gxzmr" event={"ID":"facfd808-28c1-41e8-9141-4dbe2c1b8c77","Type":"ContainerStarted","Data":"4c7e699dde9b62cd5d33700b9f5a3a0414778f75c811896e633aa75b97c3a42c"}
Apr 22 17:35:21.695666 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:21.695648 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vngqt" event={"ID":"f25f12d8-df40-42af-a646-0c95f9e24e70","Type":"ContainerStarted","Data":"dbd3051f2b7c7eb853a4566ea0dff57ed84d7d82d2c9279323c6d9925ca519ad"}
Apr 22 17:35:21.698237 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:21.698219 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k2j7l" event={"ID":"7df0412a-fc05-4359-8ed3-f1f7cc48a8eb","Type":"ContainerStarted","Data":"43df8950496f6144a287e993b16d0fab4be0d1e887a7eabdae09be4c3834b399"}
Apr 22 17:35:21.706644 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:21.706607 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-vs8nl" podStartSLOduration=34.281095158 podStartE2EDuration="38.706596484s" podCreationTimestamp="2026-04-22 17:34:43 +0000 UTC" firstStartedPulling="2026-04-22 17:35:16.610901637 +0000 UTC m=+53.747704990" lastFinishedPulling="2026-04-22 17:35:21.036402947 +0000 UTC m=+58.173206316" observedRunningTime="2026-04-22 17:35:21.704967682 +0000 UTC m=+58.841771056" watchObservedRunningTime="2026-04-22 17:35:21.706596484 +0000 UTC m=+58.843399860"
Apr 22 17:35:21.721237 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:21.721191 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-wqnlm" podStartSLOduration=34.316886135 podStartE2EDuration="38.721178776s" podCreationTimestamp="2026-04-22 17:34:43 +0000 UTC" firstStartedPulling="2026-04-22 17:35:16.621543877 +0000 UTC m=+53.758347230" lastFinishedPulling="2026-04-22 17:35:21.025836514 +0000 UTC m=+58.162639871" observedRunningTime="2026-04-22 17:35:21.720891917 +0000 UTC m=+58.857695292" watchObservedRunningTime="2026-04-22 17:35:21.721178776 +0000 UTC m=+58.857982187"
Apr 22 17:35:21.742852 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:21.742693 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-k2j7l" podStartSLOduration=6.939812098 podStartE2EDuration="38.742675401s" podCreationTimestamp="2026-04-22 17:34:43 +0000 UTC" firstStartedPulling="2026-04-22 17:34:44.65257905 +0000 UTC m=+21.789382404" lastFinishedPulling="2026-04-22 17:35:16.455442342 +0000 UTC m=+53.592245707" observedRunningTime="2026-04-22 17:35:21.740944044 +0000 UTC m=+58.877747418" watchObservedRunningTime="2026-04-22 17:35:21.742675401 +0000 UTC m=+58.879478780"
Apr 22 17:35:21.756018 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:21.755973 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-vngqt" podStartSLOduration=3.158779587 podStartE2EDuration="7.755959554s" podCreationTimestamp="2026-04-22 17:35:14 +0000 UTC" firstStartedPulling="2026-04-22 17:35:16.428712583 +0000 UTC m=+53.565515941" lastFinishedPulling="2026-04-22 17:35:21.025892552 +0000 UTC m=+58.162695908" observedRunningTime="2026-04-22 17:35:21.755329041 +0000 UTC m=+58.892132417" watchObservedRunningTime="2026-04-22 17:35:21.755959554 +0000 UTC m=+58.892762928"
Apr 22 17:35:21.772292 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:21.772229 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-p466n" podStartSLOduration=3.175418441 podStartE2EDuration="7.772210791s" podCreationTimestamp="2026-04-22 17:35:14 +0000 UTC" firstStartedPulling="2026-04-22 17:35:16.428743114 +0000 UTC m=+53.565546472" lastFinishedPulling="2026-04-22 17:35:21.025535457 +0000 UTC m=+58.162338822" observedRunningTime="2026-04-22 17:35:21.770877426 +0000 UTC m=+58.907680801" watchObservedRunningTime="2026-04-22 17:35:21.772210791 +0000 UTC m=+58.909014169"
Apr 22 17:35:22.615178 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:22.615151 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-vngqt_f25f12d8-df40-42af-a646-0c95f9e24e70/serve-healthcheck-canary/0.log"
Apr 22 17:35:23.705361 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:23.705324 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gxzmr" event={"ID":"facfd808-28c1-41e8-9141-4dbe2c1b8c77","Type":"ContainerStarted","Data":"c81c5857d5a26370ad0eda9510acdf699cd174c96502a7e6fafccf8a3341c84b"}
Apr 22 17:35:23.724299 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:23.724250 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-gxzmr" podStartSLOduration=3.042893086 podStartE2EDuration="9.724234023s" podCreationTimestamp="2026-04-22 17:35:14 +0000 UTC" firstStartedPulling="2026-04-22 17:35:16.577396284 +0000 UTC m=+53.714199656" lastFinishedPulling="2026-04-22 17:35:23.258737237 +0000 UTC m=+60.395540593" observedRunningTime="2026-04-22 17:35:23.723223304 +0000 UTC m=+60.860026678" watchObservedRunningTime="2026-04-22 17:35:23.724234023 +0000 UTC m=+60.861037413"
Apr 22 17:35:31.704057 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:31.704022 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-p466n"
Apr 22 17:35:38.662590 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:38.662558 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t45c7"
Apr 22 17:35:51.513538 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:51.513397 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-67869777f8-bx6q4"]
Apr 22 17:35:51.552872 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:51.552840 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-67869777f8-bx6q4"]
Apr 22 17:35:51.553009 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:51.552991 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-67869777f8-bx6q4"
Apr 22 17:35:51.555212 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:51.555189 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 22 17:35:51.555459 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:51.555442 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 22 17:35:51.555524 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:51.555467 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 22 17:35:51.555581 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:51.555551 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 22 17:35:51.555690 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:51.555675 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-j8cgl\""
Apr 22 17:35:51.555891 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:51.555877 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 22 17:35:51.560216 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:51.560097 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 22 17:35:51.697524 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:51.697487 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/14b27c34-ba3c-43e8-afb8-4257a7f3a2b9-federate-client-tls\") pod \"telemeter-client-67869777f8-bx6q4\" (UID: \"14b27c34-ba3c-43e8-afb8-4257a7f3a2b9\") " pod="openshift-monitoring/telemeter-client-67869777f8-bx6q4"
Apr 22 17:35:51.697524 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:51.697526 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14b27c34-ba3c-43e8-afb8-4257a7f3a2b9-metrics-client-ca\") pod \"telemeter-client-67869777f8-bx6q4\" (UID: \"14b27c34-ba3c-43e8-afb8-4257a7f3a2b9\") " pod="openshift-monitoring/telemeter-client-67869777f8-bx6q4"
Apr 22 17:35:51.697739 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:51.697550 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pkxs\" (UniqueName: \"kubernetes.io/projected/14b27c34-ba3c-43e8-afb8-4257a7f3a2b9-kube-api-access-5pkxs\") pod \"telemeter-client-67869777f8-bx6q4\" (UID: \"14b27c34-ba3c-43e8-afb8-4257a7f3a2b9\") " pod="openshift-monitoring/telemeter-client-67869777f8-bx6q4"
Apr 22 17:35:51.697739 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:51.697602 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/14b27c34-ba3c-43e8-afb8-4257a7f3a2b9-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-67869777f8-bx6q4\" (UID: \"14b27c34-ba3c-43e8-afb8-4257a7f3a2b9\") " pod="openshift-monitoring/telemeter-client-67869777f8-bx6q4"
Apr 22 17:35:51.697739 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:51.697630 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14b27c34-ba3c-43e8-afb8-4257a7f3a2b9-telemeter-trusted-ca-bundle\") pod \"telemeter-client-67869777f8-bx6q4\" (UID: \"14b27c34-ba3c-43e8-afb8-4257a7f3a2b9\") " pod="openshift-monitoring/telemeter-client-67869777f8-bx6q4"
Apr 22 17:35:51.697739 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:51.697668 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14b27c34-ba3c-43e8-afb8-4257a7f3a2b9-serving-certs-ca-bundle\") pod \"telemeter-client-67869777f8-bx6q4\" (UID: \"14b27c34-ba3c-43e8-afb8-4257a7f3a2b9\") " pod="openshift-monitoring/telemeter-client-67869777f8-bx6q4" Apr 22 17:35:51.697739 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:51.697700 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/14b27c34-ba3c-43e8-afb8-4257a7f3a2b9-telemeter-client-tls\") pod \"telemeter-client-67869777f8-bx6q4\" (UID: \"14b27c34-ba3c-43e8-afb8-4257a7f3a2b9\") " pod="openshift-monitoring/telemeter-client-67869777f8-bx6q4" Apr 22 17:35:51.697739 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:51.697721 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/14b27c34-ba3c-43e8-afb8-4257a7f3a2b9-secret-telemeter-client\") pod \"telemeter-client-67869777f8-bx6q4\" (UID: \"14b27c34-ba3c-43e8-afb8-4257a7f3a2b9\") " pod="openshift-monitoring/telemeter-client-67869777f8-bx6q4" Apr 22 17:35:51.798764 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:51.798674 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/14b27c34-ba3c-43e8-afb8-4257a7f3a2b9-telemeter-client-tls\") pod \"telemeter-client-67869777f8-bx6q4\" (UID: \"14b27c34-ba3c-43e8-afb8-4257a7f3a2b9\") " pod="openshift-monitoring/telemeter-client-67869777f8-bx6q4" Apr 22 17:35:51.798764 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:51.798723 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/14b27c34-ba3c-43e8-afb8-4257a7f3a2b9-secret-telemeter-client\") pod \"telemeter-client-67869777f8-bx6q4\" (UID: \"14b27c34-ba3c-43e8-afb8-4257a7f3a2b9\") " pod="openshift-monitoring/telemeter-client-67869777f8-bx6q4" Apr 22 17:35:51.798764 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:51.798751 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/14b27c34-ba3c-43e8-afb8-4257a7f3a2b9-federate-client-tls\") pod \"telemeter-client-67869777f8-bx6q4\" (UID: \"14b27c34-ba3c-43e8-afb8-4257a7f3a2b9\") " pod="openshift-monitoring/telemeter-client-67869777f8-bx6q4" Apr 22 17:35:51.799069 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:51.798772 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14b27c34-ba3c-43e8-afb8-4257a7f3a2b9-metrics-client-ca\") pod \"telemeter-client-67869777f8-bx6q4\" (UID: \"14b27c34-ba3c-43e8-afb8-4257a7f3a2b9\") " pod="openshift-monitoring/telemeter-client-67869777f8-bx6q4" Apr 22 17:35:51.799069 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:51.798820 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5pkxs\" (UniqueName: \"kubernetes.io/projected/14b27c34-ba3c-43e8-afb8-4257a7f3a2b9-kube-api-access-5pkxs\") pod \"telemeter-client-67869777f8-bx6q4\" (UID: \"14b27c34-ba3c-43e8-afb8-4257a7f3a2b9\") " pod="openshift-monitoring/telemeter-client-67869777f8-bx6q4" Apr 22 17:35:51.799069 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:51.798852 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/14b27c34-ba3c-43e8-afb8-4257a7f3a2b9-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-67869777f8-bx6q4\" (UID: 
\"14b27c34-ba3c-43e8-afb8-4257a7f3a2b9\") " pod="openshift-monitoring/telemeter-client-67869777f8-bx6q4" Apr 22 17:35:51.799069 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:51.798880 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14b27c34-ba3c-43e8-afb8-4257a7f3a2b9-telemeter-trusted-ca-bundle\") pod \"telemeter-client-67869777f8-bx6q4\" (UID: \"14b27c34-ba3c-43e8-afb8-4257a7f3a2b9\") " pod="openshift-monitoring/telemeter-client-67869777f8-bx6q4" Apr 22 17:35:51.799069 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:51.798912 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14b27c34-ba3c-43e8-afb8-4257a7f3a2b9-serving-certs-ca-bundle\") pod \"telemeter-client-67869777f8-bx6q4\" (UID: \"14b27c34-ba3c-43e8-afb8-4257a7f3a2b9\") " pod="openshift-monitoring/telemeter-client-67869777f8-bx6q4" Apr 22 17:35:51.799586 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:51.799561 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14b27c34-ba3c-43e8-afb8-4257a7f3a2b9-metrics-client-ca\") pod \"telemeter-client-67869777f8-bx6q4\" (UID: \"14b27c34-ba3c-43e8-afb8-4257a7f3a2b9\") " pod="openshift-monitoring/telemeter-client-67869777f8-bx6q4" Apr 22 17:35:51.799752 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:51.799651 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14b27c34-ba3c-43e8-afb8-4257a7f3a2b9-serving-certs-ca-bundle\") pod \"telemeter-client-67869777f8-bx6q4\" (UID: \"14b27c34-ba3c-43e8-afb8-4257a7f3a2b9\") " pod="openshift-monitoring/telemeter-client-67869777f8-bx6q4" Apr 22 17:35:51.800669 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:51.800631 2565 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14b27c34-ba3c-43e8-afb8-4257a7f3a2b9-telemeter-trusted-ca-bundle\") pod \"telemeter-client-67869777f8-bx6q4\" (UID: \"14b27c34-ba3c-43e8-afb8-4257a7f3a2b9\") " pod="openshift-monitoring/telemeter-client-67869777f8-bx6q4" Apr 22 17:35:51.813980 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:51.813951 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/14b27c34-ba3c-43e8-afb8-4257a7f3a2b9-secret-telemeter-client\") pod \"telemeter-client-67869777f8-bx6q4\" (UID: \"14b27c34-ba3c-43e8-afb8-4257a7f3a2b9\") " pod="openshift-monitoring/telemeter-client-67869777f8-bx6q4" Apr 22 17:35:51.813980 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:51.813970 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/14b27c34-ba3c-43e8-afb8-4257a7f3a2b9-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-67869777f8-bx6q4\" (UID: \"14b27c34-ba3c-43e8-afb8-4257a7f3a2b9\") " pod="openshift-monitoring/telemeter-client-67869777f8-bx6q4" Apr 22 17:35:51.814137 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:51.814018 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/14b27c34-ba3c-43e8-afb8-4257a7f3a2b9-telemeter-client-tls\") pod \"telemeter-client-67869777f8-bx6q4\" (UID: \"14b27c34-ba3c-43e8-afb8-4257a7f3a2b9\") " pod="openshift-monitoring/telemeter-client-67869777f8-bx6q4" Apr 22 17:35:51.814137 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:51.814056 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/14b27c34-ba3c-43e8-afb8-4257a7f3a2b9-federate-client-tls\") pod \"telemeter-client-67869777f8-bx6q4\" (UID: 
\"14b27c34-ba3c-43e8-afb8-4257a7f3a2b9\") " pod="openshift-monitoring/telemeter-client-67869777f8-bx6q4" Apr 22 17:35:51.815622 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:51.815605 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pkxs\" (UniqueName: \"kubernetes.io/projected/14b27c34-ba3c-43e8-afb8-4257a7f3a2b9-kube-api-access-5pkxs\") pod \"telemeter-client-67869777f8-bx6q4\" (UID: \"14b27c34-ba3c-43e8-afb8-4257a7f3a2b9\") " pod="openshift-monitoring/telemeter-client-67869777f8-bx6q4" Apr 22 17:35:51.862408 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:51.862368 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-67869777f8-bx6q4" Apr 22 17:35:51.990720 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:51.990684 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-67869777f8-bx6q4"] Apr 22 17:35:51.995441 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:35:51.995396 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14b27c34_ba3c_43e8_afb8_4257a7f3a2b9.slice/crio-54a3d99671d88c10fbed2ed5ec3ef9f77f80595ed42d12e795e5710873dd5d82 WatchSource:0}: Error finding container 54a3d99671d88c10fbed2ed5ec3ef9f77f80595ed42d12e795e5710873dd5d82: Status 404 returned error can't find the container with id 54a3d99671d88c10fbed2ed5ec3ef9f77f80595ed42d12e795e5710873dd5d82 Apr 22 17:35:52.704224 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:52.704197 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-vs8nl" Apr 22 17:35:52.785107 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:52.785071 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-67869777f8-bx6q4" 
event={"ID":"14b27c34-ba3c-43e8-afb8-4257a7f3a2b9","Type":"ContainerStarted","Data":"54a3d99671d88c10fbed2ed5ec3ef9f77f80595ed42d12e795e5710873dd5d82"} Apr 22 17:35:54.792475 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:54.792433 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-67869777f8-bx6q4" event={"ID":"14b27c34-ba3c-43e8-afb8-4257a7f3a2b9","Type":"ContainerStarted","Data":"f9b7942eefa83cf34f0377a39dbf07074257c53ec5e769b0ad6563f70e2deb29"} Apr 22 17:35:55.796891 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:55.796857 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-67869777f8-bx6q4" event={"ID":"14b27c34-ba3c-43e8-afb8-4257a7f3a2b9","Type":"ContainerStarted","Data":"a8d6df1937c84b162ef703b6fb1b6c24860eea74178ba694d594633fad8d491b"} Apr 22 17:35:55.796891 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:55.796894 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-67869777f8-bx6q4" event={"ID":"14b27c34-ba3c-43e8-afb8-4257a7f3a2b9","Type":"ContainerStarted","Data":"d6fbeac0358eff565faff496b936fe3843f94ab5b80dfdfd771cbb8f29a8f0a1"} Apr 22 17:35:55.819651 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:35:55.819465 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-67869777f8-bx6q4" podStartSLOduration=1.646608713 podStartE2EDuration="4.819445876s" podCreationTimestamp="2026-04-22 17:35:51 +0000 UTC" firstStartedPulling="2026-04-22 17:35:51.997180322 +0000 UTC m=+89.133983674" lastFinishedPulling="2026-04-22 17:35:55.170017485 +0000 UTC m=+92.306820837" observedRunningTime="2026-04-22 17:35:55.818236196 +0000 UTC m=+92.955039571" watchObservedRunningTime="2026-04-22 17:35:55.819445876 +0000 UTC m=+92.956249253" Apr 22 17:37:50.843161 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:37:50.843121 2565 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kube-system/global-pull-secret-syncer-xxwlx"] Apr 22 17:37:50.845912 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:37:50.845894 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xxwlx" Apr 22 17:37:50.847932 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:37:50.847910 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 17:37:50.853201 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:37:50.853178 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xxwlx"] Apr 22 17:37:50.975089 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:37:50.975050 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/65112848-ca51-45d5-bc0f-36033c7a2b83-original-pull-secret\") pod \"global-pull-secret-syncer-xxwlx\" (UID: \"65112848-ca51-45d5-bc0f-36033c7a2b83\") " pod="kube-system/global-pull-secret-syncer-xxwlx" Apr 22 17:37:50.975089 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:37:50.975095 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/65112848-ca51-45d5-bc0f-36033c7a2b83-dbus\") pod \"global-pull-secret-syncer-xxwlx\" (UID: \"65112848-ca51-45d5-bc0f-36033c7a2b83\") " pod="kube-system/global-pull-secret-syncer-xxwlx" Apr 22 17:37:50.975309 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:37:50.975190 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/65112848-ca51-45d5-bc0f-36033c7a2b83-kubelet-config\") pod \"global-pull-secret-syncer-xxwlx\" (UID: \"65112848-ca51-45d5-bc0f-36033c7a2b83\") " pod="kube-system/global-pull-secret-syncer-xxwlx" Apr 22 17:37:51.075813 ip-10-0-130-38 kubenswrapper[2565]: 
I0422 17:37:51.075776 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/65112848-ca51-45d5-bc0f-36033c7a2b83-kubelet-config\") pod \"global-pull-secret-syncer-xxwlx\" (UID: \"65112848-ca51-45d5-bc0f-36033c7a2b83\") " pod="kube-system/global-pull-secret-syncer-xxwlx" Apr 22 17:37:51.075947 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:37:51.075845 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/65112848-ca51-45d5-bc0f-36033c7a2b83-original-pull-secret\") pod \"global-pull-secret-syncer-xxwlx\" (UID: \"65112848-ca51-45d5-bc0f-36033c7a2b83\") " pod="kube-system/global-pull-secret-syncer-xxwlx" Apr 22 17:37:51.075947 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:37:51.075880 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/65112848-ca51-45d5-bc0f-36033c7a2b83-dbus\") pod \"global-pull-secret-syncer-xxwlx\" (UID: \"65112848-ca51-45d5-bc0f-36033c7a2b83\") " pod="kube-system/global-pull-secret-syncer-xxwlx" Apr 22 17:37:51.075947 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:37:51.075907 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/65112848-ca51-45d5-bc0f-36033c7a2b83-kubelet-config\") pod \"global-pull-secret-syncer-xxwlx\" (UID: \"65112848-ca51-45d5-bc0f-36033c7a2b83\") " pod="kube-system/global-pull-secret-syncer-xxwlx" Apr 22 17:37:51.076119 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:37:51.076029 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/65112848-ca51-45d5-bc0f-36033c7a2b83-dbus\") pod \"global-pull-secret-syncer-xxwlx\" (UID: \"65112848-ca51-45d5-bc0f-36033c7a2b83\") " pod="kube-system/global-pull-secret-syncer-xxwlx" Apr 22 17:37:51.078181 
ip-10-0-130-38 kubenswrapper[2565]: I0422 17:37:51.078155 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/65112848-ca51-45d5-bc0f-36033c7a2b83-original-pull-secret\") pod \"global-pull-secret-syncer-xxwlx\" (UID: \"65112848-ca51-45d5-bc0f-36033c7a2b83\") " pod="kube-system/global-pull-secret-syncer-xxwlx" Apr 22 17:37:51.154728 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:37:51.154698 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xxwlx" Apr 22 17:37:51.265839 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:37:51.265788 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xxwlx"] Apr 22 17:37:51.270380 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:37:51.270354 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65112848_ca51_45d5_bc0f_36033c7a2b83.slice/crio-9ee2fbfc0de858fa37f1df8f3e3a1020e17d984719d5a3a3884b8708d2ac99e1 WatchSource:0}: Error finding container 9ee2fbfc0de858fa37f1df8f3e3a1020e17d984719d5a3a3884b8708d2ac99e1: Status 404 returned error can't find the container with id 9ee2fbfc0de858fa37f1df8f3e3a1020e17d984719d5a3a3884b8708d2ac99e1 Apr 22 17:37:52.078784 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:37:52.078741 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xxwlx" event={"ID":"65112848-ca51-45d5-bc0f-36033c7a2b83","Type":"ContainerStarted","Data":"9ee2fbfc0de858fa37f1df8f3e3a1020e17d984719d5a3a3884b8708d2ac99e1"} Apr 22 17:37:56.090146 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:37:56.090108 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xxwlx" 
event={"ID":"65112848-ca51-45d5-bc0f-36033c7a2b83","Type":"ContainerStarted","Data":"62e8b54fe7d041256a4b583b45bf4b712de8340f6cbf3895f9778171629fd22c"} Apr 22 17:37:56.110533 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:37:56.110483 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-xxwlx" podStartSLOduration=2.333401645 podStartE2EDuration="6.110464999s" podCreationTimestamp="2026-04-22 17:37:50 +0000 UTC" firstStartedPulling="2026-04-22 17:37:51.271860443 +0000 UTC m=+208.408663797" lastFinishedPulling="2026-04-22 17:37:55.048923796 +0000 UTC m=+212.185727151" observedRunningTime="2026-04-22 17:37:56.108887372 +0000 UTC m=+213.245690748" watchObservedRunningTime="2026-04-22 17:37:56.110464999 +0000 UTC m=+213.247268378" Apr 22 17:38:03.084985 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.084953 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-784b7d68df-r6lgz"] Apr 22 17:38:03.088882 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.088864 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-784b7d68df-r6lgz" Apr 22 17:38:03.092470 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.092449 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 22 17:38:03.093402 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.093379 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 22 17:38:03.093402 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.093382 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 22 17:38:03.093557 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.093447 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 22 17:38:03.109923 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.109893 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-784b7d68df-r6lgz"] Apr 22 17:38:03.155514 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.155480 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/6584a8e3-ddf4-4c36-85d4-dd40cf4f26af-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-784b7d68df-r6lgz\" (UID: \"6584a8e3-ddf4-4c36-85d4-dd40cf4f26af\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-784b7d68df-r6lgz" Apr 22 17:38:03.155514 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.155515 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-frnpr\" (UniqueName: \"kubernetes.io/projected/6584a8e3-ddf4-4c36-85d4-dd40cf4f26af-kube-api-access-frnpr\") pod \"managed-serviceaccount-addon-agent-784b7d68df-r6lgz\" (UID: \"6584a8e3-ddf4-4c36-85d4-dd40cf4f26af\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-784b7d68df-r6lgz" Apr 22 17:38:03.186126 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.186095 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-669cb4f99d-59zd9"] Apr 22 17:38:03.190357 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.190339 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-669cb4f99d-59zd9" Apr 22 17:38:03.192743 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.192724 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 22 17:38:03.199003 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.198984 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-549659754-xjz4t"] Apr 22 17:38:03.203576 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.203559 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-669cb4f99d-59zd9"] Apr 22 17:38:03.203669 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.203658 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-549659754-xjz4t" Apr 22 17:38:03.206244 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.206228 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 22 17:38:03.206725 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.206706 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 22 17:38:03.206866 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.206737 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 22 17:38:03.206866 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.206708 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 22 17:38:03.214289 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.214267 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-549659754-xjz4t"] Apr 22 17:38:03.256505 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.256474 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a890df3b-4f93-45f6-a741-00411e68033c-tmp\") pod \"klusterlet-addon-workmgr-669cb4f99d-59zd9\" (UID: \"a890df3b-4f93-45f6-a741-00411e68033c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-669cb4f99d-59zd9" Apr 22 17:38:03.256672 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.256516 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/6584a8e3-ddf4-4c36-85d4-dd40cf4f26af-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-784b7d68df-r6lgz\" (UID: \"6584a8e3-ddf4-4c36-85d4-dd40cf4f26af\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-784b7d68df-r6lgz" Apr 22 17:38:03.256672 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.256545 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frnpr\" (UniqueName: \"kubernetes.io/projected/6584a8e3-ddf4-4c36-85d4-dd40cf4f26af-kube-api-access-frnpr\") pod \"managed-serviceaccount-addon-agent-784b7d68df-r6lgz\" (UID: \"6584a8e3-ddf4-4c36-85d4-dd40cf4f26af\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-784b7d68df-r6lgz" Apr 22 17:38:03.256672 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.256590 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/a890df3b-4f93-45f6-a741-00411e68033c-klusterlet-config\") pod \"klusterlet-addon-workmgr-669cb4f99d-59zd9\" (UID: \"a890df3b-4f93-45f6-a741-00411e68033c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-669cb4f99d-59zd9" Apr 22 17:38:03.256672 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.256616 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn87p\" (UniqueName: \"kubernetes.io/projected/a890df3b-4f93-45f6-a741-00411e68033c-kube-api-access-fn87p\") pod \"klusterlet-addon-workmgr-669cb4f99d-59zd9\" (UID: \"a890df3b-4f93-45f6-a741-00411e68033c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-669cb4f99d-59zd9" Apr 22 17:38:03.258966 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.258940 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/6584a8e3-ddf4-4c36-85d4-dd40cf4f26af-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-784b7d68df-r6lgz\" (UID: \"6584a8e3-ddf4-4c36-85d4-dd40cf4f26af\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-784b7d68df-r6lgz"
Apr 22 17:38:03.265010 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.264990 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-frnpr\" (UniqueName: \"kubernetes.io/projected/6584a8e3-ddf4-4c36-85d4-dd40cf4f26af-kube-api-access-frnpr\") pod \"managed-serviceaccount-addon-agent-784b7d68df-r6lgz\" (UID: \"6584a8e3-ddf4-4c36-85d4-dd40cf4f26af\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-784b7d68df-r6lgz"
Apr 22 17:38:03.356991 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.356881 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/c58161b9-9f38-43ea-aa22-bc6c878b5bbc-hub\") pod \"cluster-proxy-proxy-agent-549659754-xjz4t\" (UID: \"c58161b9-9f38-43ea-aa22-bc6c878b5bbc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-549659754-xjz4t"
Apr 22 17:38:03.356991 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.356946 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/a890df3b-4f93-45f6-a741-00411e68033c-klusterlet-config\") pod \"klusterlet-addon-workmgr-669cb4f99d-59zd9\" (UID: \"a890df3b-4f93-45f6-a741-00411e68033c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-669cb4f99d-59zd9"
Apr 22 17:38:03.356991 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.356977 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fn87p\" (UniqueName: \"kubernetes.io/projected/a890df3b-4f93-45f6-a741-00411e68033c-kube-api-access-fn87p\") pod \"klusterlet-addon-workmgr-669cb4f99d-59zd9\" (UID: \"a890df3b-4f93-45f6-a741-00411e68033c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-669cb4f99d-59zd9"
Apr 22 17:38:03.356991 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.357001 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/c58161b9-9f38-43ea-aa22-bc6c878b5bbc-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-549659754-xjz4t\" (UID: \"c58161b9-9f38-43ea-aa22-bc6c878b5bbc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-549659754-xjz4t"
Apr 22 17:38:03.357326 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.357023 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a890df3b-4f93-45f6-a741-00411e68033c-tmp\") pod \"klusterlet-addon-workmgr-669cb4f99d-59zd9\" (UID: \"a890df3b-4f93-45f6-a741-00411e68033c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-669cb4f99d-59zd9"
Apr 22 17:38:03.357326 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.357208 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/c58161b9-9f38-43ea-aa22-bc6c878b5bbc-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-549659754-xjz4t\" (UID: \"c58161b9-9f38-43ea-aa22-bc6c878b5bbc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-549659754-xjz4t"
Apr 22 17:38:03.357326 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.357255 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c58161b9-9f38-43ea-aa22-bc6c878b5bbc-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-549659754-xjz4t\" (UID: \"c58161b9-9f38-43ea-aa22-bc6c878b5bbc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-549659754-xjz4t"
Apr 22 17:38:03.357326 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.357298 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/c58161b9-9f38-43ea-aa22-bc6c878b5bbc-ca\") pod \"cluster-proxy-proxy-agent-549659754-xjz4t\" (UID: \"c58161b9-9f38-43ea-aa22-bc6c878b5bbc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-549659754-xjz4t"
Apr 22 17:38:03.357482 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.357332 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kl8p\" (UniqueName: \"kubernetes.io/projected/c58161b9-9f38-43ea-aa22-bc6c878b5bbc-kube-api-access-6kl8p\") pod \"cluster-proxy-proxy-agent-549659754-xjz4t\" (UID: \"c58161b9-9f38-43ea-aa22-bc6c878b5bbc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-549659754-xjz4t"
Apr 22 17:38:03.357482 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.357330 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a890df3b-4f93-45f6-a741-00411e68033c-tmp\") pod \"klusterlet-addon-workmgr-669cb4f99d-59zd9\" (UID: \"a890df3b-4f93-45f6-a741-00411e68033c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-669cb4f99d-59zd9"
Apr 22 17:38:03.359459 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.359442 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/a890df3b-4f93-45f6-a741-00411e68033c-klusterlet-config\") pod \"klusterlet-addon-workmgr-669cb4f99d-59zd9\" (UID: \"a890df3b-4f93-45f6-a741-00411e68033c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-669cb4f99d-59zd9"
Apr 22 17:38:03.365737 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.365703 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn87p\" (UniqueName: \"kubernetes.io/projected/a890df3b-4f93-45f6-a741-00411e68033c-kube-api-access-fn87p\") pod \"klusterlet-addon-workmgr-669cb4f99d-59zd9\" (UID: \"a890df3b-4f93-45f6-a741-00411e68033c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-669cb4f99d-59zd9"
Apr 22 17:38:03.405836 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.405793 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-784b7d68df-r6lgz"
Apr 22 17:38:03.458671 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.458637 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/c58161b9-9f38-43ea-aa22-bc6c878b5bbc-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-549659754-xjz4t\" (UID: \"c58161b9-9f38-43ea-aa22-bc6c878b5bbc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-549659754-xjz4t"
Apr 22 17:38:03.458839 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.458694 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/c58161b9-9f38-43ea-aa22-bc6c878b5bbc-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-549659754-xjz4t\" (UID: \"c58161b9-9f38-43ea-aa22-bc6c878b5bbc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-549659754-xjz4t"
Apr 22 17:38:03.458839 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.458727 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c58161b9-9f38-43ea-aa22-bc6c878b5bbc-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-549659754-xjz4t\" (UID: \"c58161b9-9f38-43ea-aa22-bc6c878b5bbc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-549659754-xjz4t"
Apr 22 17:38:03.458839 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.458768 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/c58161b9-9f38-43ea-aa22-bc6c878b5bbc-ca\") pod \"cluster-proxy-proxy-agent-549659754-xjz4t\" (UID: \"c58161b9-9f38-43ea-aa22-bc6c878b5bbc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-549659754-xjz4t"
Apr 22 17:38:03.458839 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.458813 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6kl8p\" (UniqueName: \"kubernetes.io/projected/c58161b9-9f38-43ea-aa22-bc6c878b5bbc-kube-api-access-6kl8p\") pod \"cluster-proxy-proxy-agent-549659754-xjz4t\" (UID: \"c58161b9-9f38-43ea-aa22-bc6c878b5bbc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-549659754-xjz4t"
Apr 22 17:38:03.459109 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.458849 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/c58161b9-9f38-43ea-aa22-bc6c878b5bbc-hub\") pod \"cluster-proxy-proxy-agent-549659754-xjz4t\" (UID: \"c58161b9-9f38-43ea-aa22-bc6c878b5bbc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-549659754-xjz4t"
Apr 22 17:38:03.460036 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.460005 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/c58161b9-9f38-43ea-aa22-bc6c878b5bbc-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-549659754-xjz4t\" (UID: \"c58161b9-9f38-43ea-aa22-bc6c878b5bbc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-549659754-xjz4t"
Apr 22 17:38:03.462080 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.462032 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c58161b9-9f38-43ea-aa22-bc6c878b5bbc-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-549659754-xjz4t\" (UID: \"c58161b9-9f38-43ea-aa22-bc6c878b5bbc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-549659754-xjz4t"
Apr 22 17:38:03.462257 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.462213 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/c58161b9-9f38-43ea-aa22-bc6c878b5bbc-hub\") pod \"cluster-proxy-proxy-agent-549659754-xjz4t\" (UID: \"c58161b9-9f38-43ea-aa22-bc6c878b5bbc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-549659754-xjz4t"
Apr 22 17:38:03.463350 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.463302 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/c58161b9-9f38-43ea-aa22-bc6c878b5bbc-ca\") pod \"cluster-proxy-proxy-agent-549659754-xjz4t\" (UID: \"c58161b9-9f38-43ea-aa22-bc6c878b5bbc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-549659754-xjz4t"
Apr 22 17:38:03.463350 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.463342 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/c58161b9-9f38-43ea-aa22-bc6c878b5bbc-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-549659754-xjz4t\" (UID: \"c58161b9-9f38-43ea-aa22-bc6c878b5bbc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-549659754-xjz4t"
Apr 22 17:38:03.469063 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.469043 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kl8p\" (UniqueName: \"kubernetes.io/projected/c58161b9-9f38-43ea-aa22-bc6c878b5bbc-kube-api-access-6kl8p\") pod \"cluster-proxy-proxy-agent-549659754-xjz4t\" (UID: \"c58161b9-9f38-43ea-aa22-bc6c878b5bbc\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-549659754-xjz4t"
Apr 22 17:38:03.499461 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.499435 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-669cb4f99d-59zd9"
Apr 22 17:38:03.512277 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.512247 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-549659754-xjz4t"
Apr 22 17:38:03.514994 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.514973 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-784b7d68df-r6lgz"]
Apr 22 17:38:03.519116 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:38:03.519080 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6584a8e3_ddf4_4c36_85d4_dd40cf4f26af.slice/crio-24e53102bf6e372253b428dda3e8610f15576d10e819044910d54523f26cc590 WatchSource:0}: Error finding container 24e53102bf6e372253b428dda3e8610f15576d10e819044910d54523f26cc590: Status 404 returned error can't find the container with id 24e53102bf6e372253b428dda3e8610f15576d10e819044910d54523f26cc590
Apr 22 17:38:03.622651 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.622627 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-669cb4f99d-59zd9"]
Apr 22 17:38:03.625470 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:38:03.625436 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda890df3b_4f93_45f6_a741_00411e68033c.slice/crio-814643985cb9b2135e8fa385dcc7540956024778504af9f65c7d3a1659dac08b WatchSource:0}: Error finding container 814643985cb9b2135e8fa385dcc7540956024778504af9f65c7d3a1659dac08b: Status 404 returned error can't find the container with id 814643985cb9b2135e8fa385dcc7540956024778504af9f65c7d3a1659dac08b
Apr 22 17:38:03.642741 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:03.642720 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-549659754-xjz4t"]
Apr 22 17:38:03.644745 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:38:03.644723 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc58161b9_9f38_43ea_aa22_bc6c878b5bbc.slice/crio-99c497a31e11ff7443f5725da4775ec9992edec6d32e07146ed98f938441d4d4 WatchSource:0}: Error finding container 99c497a31e11ff7443f5725da4775ec9992edec6d32e07146ed98f938441d4d4: Status 404 returned error can't find the container with id 99c497a31e11ff7443f5725da4775ec9992edec6d32e07146ed98f938441d4d4
Apr 22 17:38:04.111572 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:04.111527 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-549659754-xjz4t" event={"ID":"c58161b9-9f38-43ea-aa22-bc6c878b5bbc","Type":"ContainerStarted","Data":"99c497a31e11ff7443f5725da4775ec9992edec6d32e07146ed98f938441d4d4"}
Apr 22 17:38:04.112710 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:04.112684 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-784b7d68df-r6lgz" event={"ID":"6584a8e3-ddf4-4c36-85d4-dd40cf4f26af","Type":"ContainerStarted","Data":"24e53102bf6e372253b428dda3e8610f15576d10e819044910d54523f26cc590"}
Apr 22 17:38:04.113918 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:04.113882 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-669cb4f99d-59zd9" event={"ID":"a890df3b-4f93-45f6-a741-00411e68033c","Type":"ContainerStarted","Data":"814643985cb9b2135e8fa385dcc7540956024778504af9f65c7d3a1659dac08b"}
Apr 22 17:38:09.133063 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:09.133019 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-549659754-xjz4t" event={"ID":"c58161b9-9f38-43ea-aa22-bc6c878b5bbc","Type":"ContainerStarted","Data":"70c49b024ed1b01671af25b4ae1b38529a3ad6c1e6d01e48c9fc819ee8d01953"}
Apr 22 17:38:09.134414 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:09.134372 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-784b7d68df-r6lgz" event={"ID":"6584a8e3-ddf4-4c36-85d4-dd40cf4f26af","Type":"ContainerStarted","Data":"a53ae97daeca3f5d4e5b56e24a74f4a7f3bb5abdd8bf670127730ed1f9635e4c"}
Apr 22 17:38:09.135865 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:09.135833 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-669cb4f99d-59zd9" event={"ID":"a890df3b-4f93-45f6-a741-00411e68033c","Type":"ContainerStarted","Data":"350bea83bc2c879c5f894e63cb25ab9caef4fca72c3b64d172218e09b334887f"}
Apr 22 17:38:09.136449 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:09.136430 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-669cb4f99d-59zd9"
Apr 22 17:38:09.138035 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:09.138009 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-669cb4f99d-59zd9"
Apr 22 17:38:09.169016 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:09.168960 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-784b7d68df-r6lgz" podStartSLOduration=1.139635521 podStartE2EDuration="6.168941376s" podCreationTimestamp="2026-04-22 17:38:03 +0000 UTC" firstStartedPulling="2026-04-22 17:38:03.521025746 +0000 UTC m=+220.657829102" lastFinishedPulling="2026-04-22 17:38:08.550331589 +0000 UTC m=+225.687134957" observedRunningTime="2026-04-22 17:38:09.167322535 +0000 UTC m=+226.304125913" watchObservedRunningTime="2026-04-22 17:38:09.168941376 +0000 UTC m=+226.305744753"
Apr 22 17:38:09.187542 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:09.187488 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-669cb4f99d-59zd9" podStartSLOduration=1.252104525 podStartE2EDuration="6.187474888s" podCreationTimestamp="2026-04-22 17:38:03 +0000 UTC" firstStartedPulling="2026-04-22 17:38:03.627313903 +0000 UTC m=+220.764117257" lastFinishedPulling="2026-04-22 17:38:08.562684264 +0000 UTC m=+225.699487620" observedRunningTime="2026-04-22 17:38:09.186896592 +0000 UTC m=+226.323699968" watchObservedRunningTime="2026-04-22 17:38:09.187474888 +0000 UTC m=+226.324278264"
Apr 22 17:38:11.143825 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:11.143761 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-549659754-xjz4t" event={"ID":"c58161b9-9f38-43ea-aa22-bc6c878b5bbc","Type":"ContainerStarted","Data":"922a1b2469d33d0b65677843853dd0ece2f4a2be778750f016f8ca819d7a4e90"}
Apr 22 17:38:11.143825 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:11.143832 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-549659754-xjz4t" event={"ID":"c58161b9-9f38-43ea-aa22-bc6c878b5bbc","Type":"ContainerStarted","Data":"ffcad014f88a28bbf6bdde1b611163d671efd2222c19e9133fbd56b51e389449"}
Apr 22 17:38:11.161609 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:11.161560 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-549659754-xjz4t" podStartSLOduration=1.466307673 podStartE2EDuration="8.161545632s" podCreationTimestamp="2026-04-22 17:38:03 +0000 UTC" firstStartedPulling="2026-04-22 17:38:03.646400736 +0000 UTC m=+220.783204090" lastFinishedPulling="2026-04-22 17:38:10.341638693 +0000 UTC m=+227.478442049" observedRunningTime="2026-04-22 17:38:11.160386379 +0000 UTC m=+228.297189754" watchObservedRunningTime="2026-04-22 17:38:11.161545632 +0000 UTC m=+228.298349007"
Apr 22 17:38:36.393764 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:36.393731 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-6ktbg"]
Apr 22 17:38:36.397062 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:36.397046 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-6ktbg"
Apr 22 17:38:36.400135 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:36.400117 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 22 17:38:36.400245 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:36.400159 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 22 17:38:36.400688 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:36.400665 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 22 17:38:36.400688 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:36.400686 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 22 17:38:36.400859 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:36.400723 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 22 17:38:36.400859 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:36.400666 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-f77fc\""
Apr 22 17:38:36.405995 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:36.405977 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-6ktbg"]
Apr 22 17:38:36.497913 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:36.497871 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/c24024ee-26ef-43cf-8d99-9dcf8dc13813-cabundle0\") pod \"keda-operator-ffbb595cb-6ktbg\" (UID: \"c24024ee-26ef-43cf-8d99-9dcf8dc13813\") " pod="openshift-keda/keda-operator-ffbb595cb-6ktbg"
Apr 22 17:38:36.498075 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:36.497938 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phzg5\" (UniqueName: \"kubernetes.io/projected/c24024ee-26ef-43cf-8d99-9dcf8dc13813-kube-api-access-phzg5\") pod \"keda-operator-ffbb595cb-6ktbg\" (UID: \"c24024ee-26ef-43cf-8d99-9dcf8dc13813\") " pod="openshift-keda/keda-operator-ffbb595cb-6ktbg"
Apr 22 17:38:36.498075 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:36.497965 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c24024ee-26ef-43cf-8d99-9dcf8dc13813-certificates\") pod \"keda-operator-ffbb595cb-6ktbg\" (UID: \"c24024ee-26ef-43cf-8d99-9dcf8dc13813\") " pod="openshift-keda/keda-operator-ffbb595cb-6ktbg"
Apr 22 17:38:36.599204 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:36.599169 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-phzg5\" (UniqueName: \"kubernetes.io/projected/c24024ee-26ef-43cf-8d99-9dcf8dc13813-kube-api-access-phzg5\") pod \"keda-operator-ffbb595cb-6ktbg\" (UID: \"c24024ee-26ef-43cf-8d99-9dcf8dc13813\") " pod="openshift-keda/keda-operator-ffbb595cb-6ktbg"
Apr 22 17:38:36.599369 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:36.599217 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c24024ee-26ef-43cf-8d99-9dcf8dc13813-certificates\") pod \"keda-operator-ffbb595cb-6ktbg\" (UID: \"c24024ee-26ef-43cf-8d99-9dcf8dc13813\") " pod="openshift-keda/keda-operator-ffbb595cb-6ktbg"
Apr 22 17:38:36.599369 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:36.599252 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/c24024ee-26ef-43cf-8d99-9dcf8dc13813-cabundle0\") pod \"keda-operator-ffbb595cb-6ktbg\" (UID: \"c24024ee-26ef-43cf-8d99-9dcf8dc13813\") " pod="openshift-keda/keda-operator-ffbb595cb-6ktbg"
Apr 22 17:38:36.599369 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:38:36.599360 2565 secret.go:281] references non-existent secret key: ca.crt
Apr 22 17:38:36.599496 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:38:36.599377 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 22 17:38:36.599496 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:38:36.599386 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-6ktbg: references non-existent secret key: ca.crt
Apr 22 17:38:36.599496 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:38:36.599439 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c24024ee-26ef-43cf-8d99-9dcf8dc13813-certificates podName:c24024ee-26ef-43cf-8d99-9dcf8dc13813 nodeName:}" failed. No retries permitted until 2026-04-22 17:38:37.099420093 +0000 UTC m=+254.236223446 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c24024ee-26ef-43cf-8d99-9dcf8dc13813-certificates") pod "keda-operator-ffbb595cb-6ktbg" (UID: "c24024ee-26ef-43cf-8d99-9dcf8dc13813") : references non-existent secret key: ca.crt
Apr 22 17:38:36.599822 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:36.599784 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/c24024ee-26ef-43cf-8d99-9dcf8dc13813-cabundle0\") pod \"keda-operator-ffbb595cb-6ktbg\" (UID: \"c24024ee-26ef-43cf-8d99-9dcf8dc13813\") " pod="openshift-keda/keda-operator-ffbb595cb-6ktbg"
Apr 22 17:38:36.617328 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:36.617301 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-phzg5\" (UniqueName: \"kubernetes.io/projected/c24024ee-26ef-43cf-8d99-9dcf8dc13813-kube-api-access-phzg5\") pod \"keda-operator-ffbb595cb-6ktbg\" (UID: \"c24024ee-26ef-43cf-8d99-9dcf8dc13813\") " pod="openshift-keda/keda-operator-ffbb595cb-6ktbg"
Apr 22 17:38:36.962139 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:36.962102 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-7ppsl"]
Apr 22 17:38:36.965136 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:36.965120 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-7ppsl"
Apr 22 17:38:36.967339 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:36.967318 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 22 17:38:36.974975 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:36.974955 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-7ppsl"]
Apr 22 17:38:37.001788 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:37.001759 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/72456849-7dfc-4103-8c44-e04ccab724fb-certificates\") pod \"keda-admission-cf49989db-7ppsl\" (UID: \"72456849-7dfc-4103-8c44-e04ccab724fb\") " pod="openshift-keda/keda-admission-cf49989db-7ppsl"
Apr 22 17:38:37.001788 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:37.001789 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjjrb\" (UniqueName: \"kubernetes.io/projected/72456849-7dfc-4103-8c44-e04ccab724fb-kube-api-access-wjjrb\") pod \"keda-admission-cf49989db-7ppsl\" (UID: \"72456849-7dfc-4103-8c44-e04ccab724fb\") " pod="openshift-keda/keda-admission-cf49989db-7ppsl"
Apr 22 17:38:37.102459 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:37.102428 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c24024ee-26ef-43cf-8d99-9dcf8dc13813-certificates\") pod \"keda-operator-ffbb595cb-6ktbg\" (UID: \"c24024ee-26ef-43cf-8d99-9dcf8dc13813\") " pod="openshift-keda/keda-operator-ffbb595cb-6ktbg"
Apr 22 17:38:37.102624 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:37.102485 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/72456849-7dfc-4103-8c44-e04ccab724fb-certificates\") pod \"keda-admission-cf49989db-7ppsl\" (UID: \"72456849-7dfc-4103-8c44-e04ccab724fb\") " pod="openshift-keda/keda-admission-cf49989db-7ppsl"
Apr 22 17:38:37.102624 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:37.102508 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wjjrb\" (UniqueName: \"kubernetes.io/projected/72456849-7dfc-4103-8c44-e04ccab724fb-kube-api-access-wjjrb\") pod \"keda-admission-cf49989db-7ppsl\" (UID: \"72456849-7dfc-4103-8c44-e04ccab724fb\") " pod="openshift-keda/keda-admission-cf49989db-7ppsl"
Apr 22 17:38:37.102624 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:38:37.102575 2565 secret.go:281] references non-existent secret key: ca.crt
Apr 22 17:38:37.102624 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:38:37.102597 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 22 17:38:37.102624 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:38:37.102608 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-6ktbg: references non-existent secret key: ca.crt
Apr 22 17:38:37.102871 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:38:37.102668 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c24024ee-26ef-43cf-8d99-9dcf8dc13813-certificates podName:c24024ee-26ef-43cf-8d99-9dcf8dc13813 nodeName:}" failed. No retries permitted until 2026-04-22 17:38:38.102650408 +0000 UTC m=+255.239453761 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c24024ee-26ef-43cf-8d99-9dcf8dc13813-certificates") pod "keda-operator-ffbb595cb-6ktbg" (UID: "c24024ee-26ef-43cf-8d99-9dcf8dc13813") : references non-existent secret key: ca.crt
Apr 22 17:38:37.104851 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:37.104835 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/72456849-7dfc-4103-8c44-e04ccab724fb-certificates\") pod \"keda-admission-cf49989db-7ppsl\" (UID: \"72456849-7dfc-4103-8c44-e04ccab724fb\") " pod="openshift-keda/keda-admission-cf49989db-7ppsl"
Apr 22 17:38:37.110977 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:37.110956 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjjrb\" (UniqueName: \"kubernetes.io/projected/72456849-7dfc-4103-8c44-e04ccab724fb-kube-api-access-wjjrb\") pod \"keda-admission-cf49989db-7ppsl\" (UID: \"72456849-7dfc-4103-8c44-e04ccab724fb\") " pod="openshift-keda/keda-admission-cf49989db-7ppsl"
Apr 22 17:38:37.274888 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:37.274789 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-7ppsl"
Apr 22 17:38:37.389219 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:37.389185 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-7ppsl"]
Apr 22 17:38:37.392104 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:38:37.392080 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72456849_7dfc_4103_8c44_e04ccab724fb.slice/crio-ab2ee141ae5dbea763f0d83cffc66948ba26c2e4dc688cbb65badc3af5085a5d WatchSource:0}: Error finding container ab2ee141ae5dbea763f0d83cffc66948ba26c2e4dc688cbb65badc3af5085a5d: Status 404 returned error can't find the container with id ab2ee141ae5dbea763f0d83cffc66948ba26c2e4dc688cbb65badc3af5085a5d
Apr 22 17:38:38.110236 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:38.110187 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c24024ee-26ef-43cf-8d99-9dcf8dc13813-certificates\") pod \"keda-operator-ffbb595cb-6ktbg\" (UID: \"c24024ee-26ef-43cf-8d99-9dcf8dc13813\") " pod="openshift-keda/keda-operator-ffbb595cb-6ktbg"
Apr 22 17:38:38.110688 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:38:38.110335 2565 secret.go:281] references non-existent secret key: ca.crt
Apr 22 17:38:38.110688 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:38:38.110360 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 22 17:38:38.110688 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:38:38.110373 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-6ktbg: references non-existent secret key: ca.crt
Apr 22 17:38:38.110688 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:38:38.110454 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c24024ee-26ef-43cf-8d99-9dcf8dc13813-certificates podName:c24024ee-26ef-43cf-8d99-9dcf8dc13813 nodeName:}" failed. No retries permitted until 2026-04-22 17:38:40.110434415 +0000 UTC m=+257.247237769 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c24024ee-26ef-43cf-8d99-9dcf8dc13813-certificates") pod "keda-operator-ffbb595cb-6ktbg" (UID: "c24024ee-26ef-43cf-8d99-9dcf8dc13813") : references non-existent secret key: ca.crt
Apr 22 17:38:38.208904 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:38.208842 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-7ppsl" event={"ID":"72456849-7dfc-4103-8c44-e04ccab724fb","Type":"ContainerStarted","Data":"ab2ee141ae5dbea763f0d83cffc66948ba26c2e4dc688cbb65badc3af5085a5d"}
Apr 22 17:38:40.129184 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:40.129102 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c24024ee-26ef-43cf-8d99-9dcf8dc13813-certificates\") pod \"keda-operator-ffbb595cb-6ktbg\" (UID: \"c24024ee-26ef-43cf-8d99-9dcf8dc13813\") " pod="openshift-keda/keda-operator-ffbb595cb-6ktbg"
Apr 22 17:38:40.129510 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:38:40.129219 2565 secret.go:281] references non-existent secret key: ca.crt
Apr 22 17:38:40.129510 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:38:40.129231 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 22 17:38:40.129510 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:38:40.129239 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-6ktbg: references non-existent secret key: ca.crt
Apr 22 17:38:40.129510 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:38:40.129285 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c24024ee-26ef-43cf-8d99-9dcf8dc13813-certificates podName:c24024ee-26ef-43cf-8d99-9dcf8dc13813 nodeName:}" failed. No retries permitted until 2026-04-22 17:38:44.129272967 +0000 UTC m=+261.266076320 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c24024ee-26ef-43cf-8d99-9dcf8dc13813-certificates") pod "keda-operator-ffbb595cb-6ktbg" (UID: "c24024ee-26ef-43cf-8d99-9dcf8dc13813") : references non-existent secret key: ca.crt
Apr 22 17:38:40.215481 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:40.215443 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-7ppsl" event={"ID":"72456849-7dfc-4103-8c44-e04ccab724fb","Type":"ContainerStarted","Data":"f6bf9c19e9c223d100f99deef3319384d01b2d0f9c5c2051e77d348352709cf9"}
Apr 22 17:38:40.215718 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:40.215697 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-7ppsl"
Apr 22 17:38:40.232315 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:40.232270 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-7ppsl" podStartSLOduration=1.9067774910000002 podStartE2EDuration="4.232228305s" podCreationTimestamp="2026-04-22 17:38:36 +0000 UTC" firstStartedPulling="2026-04-22 17:38:37.39330921 +0000 UTC m=+254.530112563" lastFinishedPulling="2026-04-22 17:38:39.718760024 +0000 UTC m=+256.855563377" observedRunningTime="2026-04-22 17:38:40.23065624 +0000 UTC m=+257.367459615" watchObservedRunningTime="2026-04-22 17:38:40.232228305 +0000 UTC m=+257.369031682"
Apr 22 17:38:44.161883 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:44.161777 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c24024ee-26ef-43cf-8d99-9dcf8dc13813-certificates\") pod \"keda-operator-ffbb595cb-6ktbg\" (UID: \"c24024ee-26ef-43cf-8d99-9dcf8dc13813\") " pod="openshift-keda/keda-operator-ffbb595cb-6ktbg"
Apr 22 17:38:44.164272 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:44.164250 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c24024ee-26ef-43cf-8d99-9dcf8dc13813-certificates\") pod \"keda-operator-ffbb595cb-6ktbg\" (UID: \"c24024ee-26ef-43cf-8d99-9dcf8dc13813\") " pod="openshift-keda/keda-operator-ffbb595cb-6ktbg"
Apr 22 17:38:44.207534 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:44.207502 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-6ktbg"
Apr 22 17:38:44.326836 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:44.326811 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-6ktbg"]
Apr 22 17:38:44.329362 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:38:44.329330 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc24024ee_26ef_43cf_8d99_9dcf8dc13813.slice/crio-33443eef988ac4f414e57d864da2f0ed6b03cce36ad965d79872b9013fed2b63 WatchSource:0}: Error finding container 33443eef988ac4f414e57d864da2f0ed6b03cce36ad965d79872b9013fed2b63: Status 404 returned error can't find the container with id 33443eef988ac4f414e57d864da2f0ed6b03cce36ad965d79872b9013fed2b63
Apr 22 17:38:45.230379 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:45.230339 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-6ktbg" event={"ID":"c24024ee-26ef-43cf-8d99-9dcf8dc13813","Type":"ContainerStarted","Data":"33443eef988ac4f414e57d864da2f0ed6b03cce36ad965d79872b9013fed2b63"}
Apr 22 17:38:48.240124 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:48.240090 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-6ktbg" event={"ID":"c24024ee-26ef-43cf-8d99-9dcf8dc13813","Type":"ContainerStarted","Data":"83d2af837d39cf3f291d7ff994f644c29fb938f44a08bd6eb96bd3d3fdf930c7"}
Apr 22 17:38:48.240513 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:48.240208 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-6ktbg"
Apr 22 17:38:48.255874 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:38:48.255830 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-6ktbg" podStartSLOduration=9.264617458 podStartE2EDuration="12.255816993s" podCreationTimestamp="2026-04-22 17:38:36 +0000 UTC" firstStartedPulling="2026-04-22 17:38:44.331005014 +0000 UTC m=+261.467808367" lastFinishedPulling="2026-04-22 17:38:47.322204549 +0000 UTC m=+264.459007902" observedRunningTime="2026-04-22 17:38:48.254522916 +0000 UTC m=+265.391326291" watchObservedRunningTime="2026-04-22 17:38:48.255816993 +0000 UTC m=+265.392620358"
Apr 22 17:39:01.220347 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:39:01.220313 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-7ppsl"
Apr 22 17:39:09.244760 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:39:09.244726 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-6ktbg"
Apr 22 17:39:23.312638 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:39:23.312604 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log"
Apr 22 17:39:23.313303 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:39:23.313282 2565 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log" Apr 22 17:39:23.315103 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:39:23.315082 2565 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 17:39:43.298348 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:39:43.298313 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-2ccfv"] Apr 22 17:39:43.301758 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:39:43.301741 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2ccfv" Apr 22 17:39:43.304004 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:39:43.303976 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 17:39:43.304596 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:39:43.304576 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 22 17:39:43.304596 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:39:43.304591 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-kxvnz\"" Apr 22 17:39:43.304745 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:39:43.304650 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 17:39:43.310333 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:39:43.310315 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-2ccfv"] Apr 22 17:39:43.489334 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:39:43.489301 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d6f677ef-c35c-4768-b32e-dc21752cfe55-cert\") pod 
\"llmisvc-controller-manager-68cc5db7c4-2ccfv\" (UID: \"d6f677ef-c35c-4768-b32e-dc21752cfe55\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2ccfv" Apr 22 17:39:43.489510 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:39:43.489367 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdnww\" (UniqueName: \"kubernetes.io/projected/d6f677ef-c35c-4768-b32e-dc21752cfe55-kube-api-access-fdnww\") pod \"llmisvc-controller-manager-68cc5db7c4-2ccfv\" (UID: \"d6f677ef-c35c-4768-b32e-dc21752cfe55\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2ccfv" Apr 22 17:39:43.590440 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:39:43.590343 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fdnww\" (UniqueName: \"kubernetes.io/projected/d6f677ef-c35c-4768-b32e-dc21752cfe55-kube-api-access-fdnww\") pod \"llmisvc-controller-manager-68cc5db7c4-2ccfv\" (UID: \"d6f677ef-c35c-4768-b32e-dc21752cfe55\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2ccfv" Apr 22 17:39:43.590440 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:39:43.590399 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d6f677ef-c35c-4768-b32e-dc21752cfe55-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-2ccfv\" (UID: \"d6f677ef-c35c-4768-b32e-dc21752cfe55\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2ccfv" Apr 22 17:39:43.590670 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:39:43.590488 2565 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Apr 22 17:39:43.590670 ip-10-0-130-38 kubenswrapper[2565]: E0422 17:39:43.590547 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6f677ef-c35c-4768-b32e-dc21752cfe55-cert podName:d6f677ef-c35c-4768-b32e-dc21752cfe55 nodeName:}" failed. 
No retries permitted until 2026-04-22 17:39:44.090532209 +0000 UTC m=+321.227335562 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d6f677ef-c35c-4768-b32e-dc21752cfe55-cert") pod "llmisvc-controller-manager-68cc5db7c4-2ccfv" (UID: "d6f677ef-c35c-4768-b32e-dc21752cfe55") : secret "llmisvc-webhook-server-cert" not found Apr 22 17:39:43.604130 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:39:43.604098 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdnww\" (UniqueName: \"kubernetes.io/projected/d6f677ef-c35c-4768-b32e-dc21752cfe55-kube-api-access-fdnww\") pod \"llmisvc-controller-manager-68cc5db7c4-2ccfv\" (UID: \"d6f677ef-c35c-4768-b32e-dc21752cfe55\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2ccfv" Apr 22 17:39:44.093705 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:39:44.093670 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d6f677ef-c35c-4768-b32e-dc21752cfe55-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-2ccfv\" (UID: \"d6f677ef-c35c-4768-b32e-dc21752cfe55\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2ccfv" Apr 22 17:39:44.096024 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:39:44.095998 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d6f677ef-c35c-4768-b32e-dc21752cfe55-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-2ccfv\" (UID: \"d6f677ef-c35c-4768-b32e-dc21752cfe55\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2ccfv" Apr 22 17:39:44.211922 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:39:44.211892 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2ccfv" Apr 22 17:39:44.323661 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:39:44.323640 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-2ccfv"] Apr 22 17:39:44.326041 ip-10-0-130-38 kubenswrapper[2565]: W0422 17:39:44.326011 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd6f677ef_c35c_4768_b32e_dc21752cfe55.slice/crio-d7dca58de25653b41a948a9cbb37dafd88a1e91e65532add96d9bba8e7252bd1 WatchSource:0}: Error finding container d7dca58de25653b41a948a9cbb37dafd88a1e91e65532add96d9bba8e7252bd1: Status 404 returned error can't find the container with id d7dca58de25653b41a948a9cbb37dafd88a1e91e65532add96d9bba8e7252bd1 Apr 22 17:39:44.327256 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:39:44.327235 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 17:39:44.377927 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:39:44.377851 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2ccfv" event={"ID":"d6f677ef-c35c-4768-b32e-dc21752cfe55","Type":"ContainerStarted","Data":"d7dca58de25653b41a948a9cbb37dafd88a1e91e65532add96d9bba8e7252bd1"} Apr 22 17:39:46.384934 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:39:46.384896 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2ccfv" event={"ID":"d6f677ef-c35c-4768-b32e-dc21752cfe55","Type":"ContainerStarted","Data":"7b6e9ea9c7a0e6f28ff6a15347680201b07de7a40b15d76ebc6fc30795cf10a2"} Apr 22 17:39:46.385389 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:39:46.385013 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2ccfv" Apr 22 17:39:46.403794 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:39:46.400905 2565 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2ccfv" podStartSLOduration=1.554079005 podStartE2EDuration="3.400888468s" podCreationTimestamp="2026-04-22 17:39:43 +0000 UTC" firstStartedPulling="2026-04-22 17:39:44.327400932 +0000 UTC m=+321.464204284" lastFinishedPulling="2026-04-22 17:39:46.174210381 +0000 UTC m=+323.311013747" observedRunningTime="2026-04-22 17:39:46.399943516 +0000 UTC m=+323.536746893" watchObservedRunningTime="2026-04-22 17:39:46.400888468 +0000 UTC m=+323.537691844" Apr 22 17:40:17.390226 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:40:17.390146 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2ccfv" Apr 22 17:44:23.331749 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:44:23.331719 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log" Apr 22 17:44:23.333072 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:44:23.333051 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log" Apr 22 17:49:23.348237 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:49:23.348144 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log" Apr 22 17:49:23.351555 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:49:23.351535 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log" Apr 22 17:54:23.363514 ip-10-0-130-38 kubenswrapper[2565]: I0422 
17:54:23.363488 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log" Apr 22 17:54:23.367662 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:54:23.367644 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log" Apr 22 17:59:23.379326 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:59:23.379218 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log" Apr 22 17:59:23.384643 ip-10-0-130-38 kubenswrapper[2565]: I0422 17:59:23.384626 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log" Apr 22 18:04:23.396005 ip-10-0-130-38 kubenswrapper[2565]: I0422 18:04:23.395978 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log" Apr 22 18:04:23.401484 ip-10-0-130-38 kubenswrapper[2565]: I0422 18:04:23.401462 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log" Apr 22 18:09:23.412969 ip-10-0-130-38 kubenswrapper[2565]: I0422 18:09:23.412864 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log" Apr 22 18:09:23.418587 ip-10-0-130-38 
kubenswrapper[2565]: I0422 18:09:23.418569 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log" Apr 22 18:14:23.429291 ip-10-0-130-38 kubenswrapper[2565]: I0422 18:14:23.429172 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log" Apr 22 18:14:23.435107 ip-10-0-130-38 kubenswrapper[2565]: I0422 18:14:23.435091 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log" Apr 22 18:19:23.444705 ip-10-0-130-38 kubenswrapper[2565]: I0422 18:19:23.444574 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log" Apr 22 18:19:23.451565 ip-10-0-130-38 kubenswrapper[2565]: I0422 18:19:23.451545 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log" Apr 22 18:24:23.461868 ip-10-0-130-38 kubenswrapper[2565]: I0422 18:24:23.461703 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log" Apr 22 18:24:23.469164 ip-10-0-130-38 kubenswrapper[2565]: I0422 18:24:23.469143 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log" Apr 22 
18:29:23.477396 ip-10-0-130-38 kubenswrapper[2565]: I0422 18:29:23.477289 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log" Apr 22 18:29:23.485772 ip-10-0-130-38 kubenswrapper[2565]: I0422 18:29:23.485750 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log" Apr 22 18:34:23.492479 ip-10-0-130-38 kubenswrapper[2565]: I0422 18:34:23.492369 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log" Apr 22 18:34:23.501980 ip-10-0-130-38 kubenswrapper[2565]: I0422 18:34:23.501962 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log" Apr 22 18:39:23.508485 ip-10-0-130-38 kubenswrapper[2565]: I0422 18:39:23.508376 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log" Apr 22 18:39:23.520999 ip-10-0-130-38 kubenswrapper[2565]: I0422 18:39:23.520979 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log" Apr 22 18:44:23.524315 ip-10-0-130-38 kubenswrapper[2565]: I0422 18:44:23.524218 2565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log" Apr 22 18:44:23.539413 ip-10-0-130-38 kubenswrapper[2565]: I0422 18:44:23.539395 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log" Apr 22 18:49:23.539383 ip-10-0-130-38 kubenswrapper[2565]: I0422 18:49:23.539253 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log" Apr 22 18:49:23.555067 ip-10-0-130-38 kubenswrapper[2565]: I0422 18:49:23.555045 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log" Apr 22 18:54:23.554728 ip-10-0-130-38 kubenswrapper[2565]: I0422 18:54:23.554613 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log" Apr 22 18:54:23.571091 ip-10-0-130-38 kubenswrapper[2565]: I0422 18:54:23.571060 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log" Apr 22 18:59:23.570711 ip-10-0-130-38 kubenswrapper[2565]: I0422 18:59:23.570606 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log" Apr 22 18:59:23.587406 ip-10-0-130-38 kubenswrapper[2565]: I0422 18:59:23.587379 2565 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log" Apr 22 19:04:23.585658 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:04:23.585542 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log" Apr 22 19:04:23.603215 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:04:23.603192 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-130-38.ec2.internal_3aad222c1b8d9495d15f07d93f8a83ad/kube-rbac-proxy-crio/2.log" Apr 22 19:05:03.547582 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:03.547548 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-xxwlx_65112848-ca51-45d5-bc0f-36033c7a2b83/global-pull-secret-syncer/0.log" Apr 22 19:05:03.663090 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:03.663056 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-mtlqw_4c5994d9-36a8-4d98-a1d9-67f743ccd167/konnectivity-agent/0.log" Apr 22 19:05:03.742483 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:03.742454 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-38.ec2.internal_c98156a065fc898aadcfd0c780a265db/haproxy/0.log" Apr 22 19:05:07.793825 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:07.793776 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ztfbs_c7f9e817-7d88-4594-954b-87973d788ce2/node-exporter/0.log" Apr 22 19:05:07.814722 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:07.814693 2565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-ztfbs_c7f9e817-7d88-4594-954b-87973d788ce2/kube-rbac-proxy/0.log" Apr 22 19:05:07.835643 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:07.835619 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ztfbs_c7f9e817-7d88-4594-954b-87973d788ce2/init-textfile/0.log" Apr 22 19:05:08.214316 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:08.214281 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-67869777f8-bx6q4_14b27c34-ba3c-43e8-afb8-4257a7f3a2b9/telemeter-client/0.log" Apr 22 19:05:08.239497 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:08.239466 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-67869777f8-bx6q4_14b27c34-ba3c-43e8-afb8-4257a7f3a2b9/reload/0.log" Apr 22 19:05:08.260655 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:08.260624 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-67869777f8-bx6q4_14b27c34-ba3c-43e8-afb8-4257a7f3a2b9/kube-rbac-proxy/0.log" Apr 22 19:05:10.787143 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:10.787104 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b8m7m/perf-node-gather-daemonset-f2ph9"] Apr 22 19:05:10.790341 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:10.790319 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-f2ph9" Apr 22 19:05:10.792612 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:10.792580 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-b8m7m\"/\"openshift-service-ca.crt\"" Apr 22 19:05:10.792754 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:10.792640 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-b8m7m\"/\"kube-root-ca.crt\"" Apr 22 19:05:10.792754 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:10.792692 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-b8m7m\"/\"default-dockercfg-vckz5\"" Apr 22 19:05:10.800241 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:10.800214 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b8m7m/perf-node-gather-daemonset-f2ph9"] Apr 22 19:05:10.839650 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:10.839613 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2ee01759-8ec4-4a90-a0d2-c8ceaf0dbe2a-podres\") pod \"perf-node-gather-daemonset-f2ph9\" (UID: \"2ee01759-8ec4-4a90-a0d2-c8ceaf0dbe2a\") " pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-f2ph9" Apr 22 19:05:10.839650 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:10.839647 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kn6r\" (UniqueName: \"kubernetes.io/projected/2ee01759-8ec4-4a90-a0d2-c8ceaf0dbe2a-kube-api-access-8kn6r\") pod \"perf-node-gather-daemonset-f2ph9\" (UID: \"2ee01759-8ec4-4a90-a0d2-c8ceaf0dbe2a\") " pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-f2ph9" Apr 22 19:05:10.839894 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:10.839672 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2ee01759-8ec4-4a90-a0d2-c8ceaf0dbe2a-lib-modules\") pod \"perf-node-gather-daemonset-f2ph9\" (UID: \"2ee01759-8ec4-4a90-a0d2-c8ceaf0dbe2a\") " pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-f2ph9" Apr 22 19:05:10.839894 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:10.839727 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2ee01759-8ec4-4a90-a0d2-c8ceaf0dbe2a-proc\") pod \"perf-node-gather-daemonset-f2ph9\" (UID: \"2ee01759-8ec4-4a90-a0d2-c8ceaf0dbe2a\") " pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-f2ph9" Apr 22 19:05:10.839894 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:10.839747 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2ee01759-8ec4-4a90-a0d2-c8ceaf0dbe2a-sys\") pod \"perf-node-gather-daemonset-f2ph9\" (UID: \"2ee01759-8ec4-4a90-a0d2-c8ceaf0dbe2a\") " pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-f2ph9" Apr 22 19:05:10.940848 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:10.940789 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2ee01759-8ec4-4a90-a0d2-c8ceaf0dbe2a-podres\") pod \"perf-node-gather-daemonset-f2ph9\" (UID: \"2ee01759-8ec4-4a90-a0d2-c8ceaf0dbe2a\") " pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-f2ph9" Apr 22 19:05:10.940848 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:10.940852 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8kn6r\" (UniqueName: \"kubernetes.io/projected/2ee01759-8ec4-4a90-a0d2-c8ceaf0dbe2a-kube-api-access-8kn6r\") pod \"perf-node-gather-daemonset-f2ph9\" (UID: \"2ee01759-8ec4-4a90-a0d2-c8ceaf0dbe2a\") " 
pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-f2ph9"
Apr 22 19:05:10.941063 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:10.940875 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2ee01759-8ec4-4a90-a0d2-c8ceaf0dbe2a-lib-modules\") pod \"perf-node-gather-daemonset-f2ph9\" (UID: \"2ee01759-8ec4-4a90-a0d2-c8ceaf0dbe2a\") " pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-f2ph9"
Apr 22 19:05:10.941063 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:10.940913 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2ee01759-8ec4-4a90-a0d2-c8ceaf0dbe2a-podres\") pod \"perf-node-gather-daemonset-f2ph9\" (UID: \"2ee01759-8ec4-4a90-a0d2-c8ceaf0dbe2a\") " pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-f2ph9"
Apr 22 19:05:10.941063 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:10.940907 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2ee01759-8ec4-4a90-a0d2-c8ceaf0dbe2a-proc\") pod \"perf-node-gather-daemonset-f2ph9\" (UID: \"2ee01759-8ec4-4a90-a0d2-c8ceaf0dbe2a\") " pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-f2ph9"
Apr 22 19:05:10.941063 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:10.940973 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2ee01759-8ec4-4a90-a0d2-c8ceaf0dbe2a-proc\") pod \"perf-node-gather-daemonset-f2ph9\" (UID: \"2ee01759-8ec4-4a90-a0d2-c8ceaf0dbe2a\") " pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-f2ph9"
Apr 22 19:05:10.941063 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:10.940975 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2ee01759-8ec4-4a90-a0d2-c8ceaf0dbe2a-sys\") pod \"perf-node-gather-daemonset-f2ph9\" (UID: \"2ee01759-8ec4-4a90-a0d2-c8ceaf0dbe2a\") " pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-f2ph9"
Apr 22 19:05:10.941063 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:10.941016 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2ee01759-8ec4-4a90-a0d2-c8ceaf0dbe2a-sys\") pod \"perf-node-gather-daemonset-f2ph9\" (UID: \"2ee01759-8ec4-4a90-a0d2-c8ceaf0dbe2a\") " pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-f2ph9"
Apr 22 19:05:10.941063 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:10.941026 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2ee01759-8ec4-4a90-a0d2-c8ceaf0dbe2a-lib-modules\") pod \"perf-node-gather-daemonset-f2ph9\" (UID: \"2ee01759-8ec4-4a90-a0d2-c8ceaf0dbe2a\") " pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-f2ph9"
Apr 22 19:05:10.948390 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:10.948368 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kn6r\" (UniqueName: \"kubernetes.io/projected/2ee01759-8ec4-4a90-a0d2-c8ceaf0dbe2a-kube-api-access-8kn6r\") pod \"perf-node-gather-daemonset-f2ph9\" (UID: \"2ee01759-8ec4-4a90-a0d2-c8ceaf0dbe2a\") " pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-f2ph9"
Apr 22 19:05:11.101024 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:11.100931 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-f2ph9"
Apr 22 19:05:11.215362 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:11.215330 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b8m7m/perf-node-gather-daemonset-f2ph9"]
Apr 22 19:05:11.218456 ip-10-0-130-38 kubenswrapper[2565]: W0422 19:05:11.218424 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2ee01759_8ec4_4a90_a0d2_c8ceaf0dbe2a.slice/crio-d3360fa8e2eeee571c5fbfc138fe83e80fd55042e034c1d97f21fb49713ff88f WatchSource:0}: Error finding container d3360fa8e2eeee571c5fbfc138fe83e80fd55042e034c1d97f21fb49713ff88f: Status 404 returned error can't find the container with id d3360fa8e2eeee571c5fbfc138fe83e80fd55042e034c1d97f21fb49713ff88f
Apr 22 19:05:11.220060 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:11.220043 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:05:11.497483 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:11.497448 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-f2ph9" event={"ID":"2ee01759-8ec4-4a90-a0d2-c8ceaf0dbe2a","Type":"ContainerStarted","Data":"e967269792734e562c87b2f7cfa458ea9879dc0353e6905fd3f3fa655bcb00b5"}
Apr 22 19:05:11.497483 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:11.497490 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-f2ph9" event={"ID":"2ee01759-8ec4-4a90-a0d2-c8ceaf0dbe2a","Type":"ContainerStarted","Data":"d3360fa8e2eeee571c5fbfc138fe83e80fd55042e034c1d97f21fb49713ff88f"}
Apr 22 19:05:11.497698 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:11.497586 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-f2ph9"
Apr 22 19:05:11.501554 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:11.501532 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-p466n_522cdf45-6f50-4e49-9db4-3eef0a5bd8cc/dns/0.log"
Apr 22 19:05:11.514421 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:11.514372 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-f2ph9" podStartSLOduration=1.514354668 podStartE2EDuration="1.514354668s" podCreationTimestamp="2026-04-22 19:05:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:05:11.513516648 +0000 UTC m=+5448.650320022" watchObservedRunningTime="2026-04-22 19:05:11.514354668 +0000 UTC m=+5448.651158044"
Apr 22 19:05:11.531279 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:11.531252 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-p466n_522cdf45-6f50-4e49-9db4-3eef0a5bd8cc/kube-rbac-proxy/0.log"
Apr 22 19:05:11.618583 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:11.618552 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-kzckr_fc09482b-e40a-4db6-b3ae-ee8c7655cc83/dns-node-resolver/0.log"
Apr 22 19:05:12.140867 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:12.140838 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qkkrr_bf130287-76e7-42eb-abee-d3c9dae69a49/node-ca/0.log"
Apr 22 19:05:13.195427 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:13.195395 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-vngqt_f25f12d8-df40-42af-a646-0c95f9e24e70/serve-healthcheck-canary/0.log"
Apr 22 19:05:13.690457 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:13.690431 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gxzmr_facfd808-28c1-41e8-9141-4dbe2c1b8c77/kube-rbac-proxy/0.log"
Apr 22 19:05:13.709996 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:13.709970 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gxzmr_facfd808-28c1-41e8-9141-4dbe2c1b8c77/exporter/0.log"
Apr 22 19:05:13.732995 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:13.732975 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gxzmr_facfd808-28c1-41e8-9141-4dbe2c1b8c77/extractor/0.log"
Apr 22 19:05:15.625180 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:15.625154 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-2ccfv_d6f677ef-c35c-4768-b32e-dc21752cfe55/manager/0.log"
Apr 22 19:05:17.509459 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:17.509429 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-b8m7m/perf-node-gather-daemonset-f2ph9"
Apr 22 19:05:20.882087 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:20.882053 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9s5s7_78349f6c-83ce-40df-b593-af2886a0a4aa/kube-multus/0.log"
Apr 22 19:05:20.907707 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:20.907679 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k2j7l_7df0412a-fc05-4359-8ed3-f1f7cc48a8eb/kube-multus-additional-cni-plugins/0.log"
Apr 22 19:05:20.928485 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:20.928454 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k2j7l_7df0412a-fc05-4359-8ed3-f1f7cc48a8eb/egress-router-binary-copy/0.log"
Apr 22 19:05:20.948203 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:20.948178 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k2j7l_7df0412a-fc05-4359-8ed3-f1f7cc48a8eb/cni-plugins/0.log"
Apr 22 19:05:20.967906 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:20.967880 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k2j7l_7df0412a-fc05-4359-8ed3-f1f7cc48a8eb/bond-cni-plugin/0.log"
Apr 22 19:05:20.987899 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:20.987874 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k2j7l_7df0412a-fc05-4359-8ed3-f1f7cc48a8eb/routeoverride-cni/0.log"
Apr 22 19:05:21.006903 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:21.006878 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k2j7l_7df0412a-fc05-4359-8ed3-f1f7cc48a8eb/whereabouts-cni-bincopy/0.log"
Apr 22 19:05:21.028735 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:21.028709 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k2j7l_7df0412a-fc05-4359-8ed3-f1f7cc48a8eb/whereabouts-cni/0.log"
Apr 22 19:05:21.468747 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:21.468715 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-wqnlm_9b338a48-9f81-4b29-a9ed-ba400bd2e93d/network-metrics-daemon/0.log"
Apr 22 19:05:21.489206 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:21.489180 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-wqnlm_9b338a48-9f81-4b29-a9ed-ba400bd2e93d/kube-rbac-proxy/0.log"
Apr 22 19:05:22.957020 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:22.956992 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t45c7_0a131c7a-9e60-4af2-9909-c14202f6723e/ovn-controller/0.log"
Apr 22 19:05:22.998348 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:22.998320 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t45c7_0a131c7a-9e60-4af2-9909-c14202f6723e/ovn-acl-logging/0.log"
Apr 22 19:05:23.014092 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:23.014071 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t45c7_0a131c7a-9e60-4af2-9909-c14202f6723e/kube-rbac-proxy-node/0.log"
Apr 22 19:05:23.036425 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:23.036401 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t45c7_0a131c7a-9e60-4af2-9909-c14202f6723e/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 19:05:23.056561 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:23.056536 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t45c7_0a131c7a-9e60-4af2-9909-c14202f6723e/northd/0.log"
Apr 22 19:05:23.076228 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:23.076202 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t45c7_0a131c7a-9e60-4af2-9909-c14202f6723e/nbdb/0.log"
Apr 22 19:05:23.096333 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:23.096305 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t45c7_0a131c7a-9e60-4af2-9909-c14202f6723e/sbdb/0.log"
Apr 22 19:05:23.196308 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:23.196275 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t45c7_0a131c7a-9e60-4af2-9909-c14202f6723e/ovnkube-controller/0.log"
Apr 22 19:05:24.254477 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:24.254452 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-vs8nl_b99a8ad0-f04e-465e-8515-c98de7f3e43d/network-check-target-container/0.log"
Apr 22 19:05:25.118021 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:25.117990 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-r7w4r_963159e2-ff9f-47ac-9907-a22d03154537/iptables-alerter/0.log"
Apr 22 19:05:25.748614 ip-10-0-130-38 kubenswrapper[2565]: I0422 19:05:25.748586 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-tb9hn_8ce4426c-b706-426d-b90e-58d697c2b5b4/tuned/0.log"