Apr 20 11:39:23.357802 ip-10-0-135-68 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 20 11:39:23.357816 ip-10-0-135-68 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 20 11:39:23.357825 ip-10-0-135-68 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 20 11:39:23.358091 ip-10-0-135-68 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 20 11:39:33.404392 ip-10-0-135-68 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 20 11:39:33.404406 ip-10-0-135-68 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 779a5a4cf8d04cbe81ff4524a146ded5 --
Apr 20 11:42:04.077274 ip-10-0-135-68 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 11:42:04.472673 ip-10-0-135-68 kubenswrapper[2564]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 11:42:04.472673 ip-10-0-135-68 kubenswrapper[2564]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 11:42:04.472673 ip-10-0-135-68 kubenswrapper[2564]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 11:42:04.472673 ip-10-0-135-68 kubenswrapper[2564]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 11:42:04.472673 ip-10-0-135-68 kubenswrapper[2564]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 11:42:04.474071 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.473978 2564 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 11:42:04.476772 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476758 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 11:42:04.476808 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476774 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 11:42:04.476808 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476779 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 11:42:04.476808 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476782 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 11:42:04.476808 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476785 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 11:42:04.476808 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476789 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 11:42:04.476808 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476792 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 11:42:04.476808 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476795 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 11:42:04.476808 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476798 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 11:42:04.476808 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476801 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 11:42:04.476808 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476804 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 11:42:04.476808 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476807 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 11:42:04.476808 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476810 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 11:42:04.476808 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476813 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 11:42:04.477133 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476817 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 11:42:04.477133 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476819 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 11:42:04.477133 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476822 2564 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 11:42:04.477133 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476825 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 11:42:04.477133 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476828 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 11:42:04.477133 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476830 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 11:42:04.477133 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476833 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 11:42:04.477133 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476837 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 11:42:04.477133 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476839 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 11:42:04.477133 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476842 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 11:42:04.477133 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476845 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 11:42:04.477133 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476847 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 11:42:04.477133 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476850 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 11:42:04.477133 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476853 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 11:42:04.477133 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476855 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 11:42:04.477133 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476858 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 11:42:04.477133 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476860 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 11:42:04.477133 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476864 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 11:42:04.477133 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476869 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 11:42:04.477133 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476871 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 11:42:04.477612 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476874 2564 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 11:42:04.477612 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476876 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 11:42:04.477612 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476879 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 11:42:04.477612 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476881 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 11:42:04.477612 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476884 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 11:42:04.477612 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476886 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 11:42:04.477612 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476889 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 11:42:04.477612 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476891 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 11:42:04.477612 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476894 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 11:42:04.477612 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476896 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 11:42:04.477612 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476898 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 11:42:04.477612 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476901 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 11:42:04.477612 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476903 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 11:42:04.477612 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476907 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 11:42:04.477612 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476910 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 11:42:04.477612 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476913 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 11:42:04.477612 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476916 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 11:42:04.477612 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476919 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 11:42:04.477612 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476921 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 11:42:04.477612 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476924 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 11:42:04.478137 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476927 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 11:42:04.478137 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476929 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 11:42:04.478137 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476932 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 11:42:04.478137 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476935 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 11:42:04.478137 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476937 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 11:42:04.478137 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476940 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 11:42:04.478137 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476942 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 11:42:04.478137 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476945 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 11:42:04.478137 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476947 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 11:42:04.478137 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476949 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 11:42:04.478137 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476952 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 11:42:04.478137 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476956 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 11:42:04.478137 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476981 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 11:42:04.478137 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476984 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 11:42:04.478137 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476987 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 11:42:04.478137 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476990 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 11:42:04.478137 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476994 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 11:42:04.478137 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476997 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 11:42:04.478137 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.476999 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 11:42:04.478633 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.477002 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 11:42:04.478633 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.477005 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 11:42:04.478633 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.477008 2564 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 11:42:04.478633 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.477010 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 11:42:04.478633 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.477013 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 11:42:04.478633 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.477015 2564 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 11:42:04.478633 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.477019 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 11:42:04.478633 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.477022 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 11:42:04.478633 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.477025 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 11:42:04.478633 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.477028 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 11:42:04.478633 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.477030 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 11:42:04.478633 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.477033 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 11:42:04.478633 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.477036 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 11:42:04.479129 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479117 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 11:42:04.479129 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479128 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 11:42:04.479180 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479132 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 11:42:04.479180 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479135 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 11:42:04.479180 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479138 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 11:42:04.479180 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479141 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 11:42:04.479180 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479144 2564 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 11:42:04.479180 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479147 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 11:42:04.479180 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479149 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 11:42:04.479180 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479152 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 11:42:04.479180 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479154 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 11:42:04.479180 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479157 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 11:42:04.479180 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479159 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 11:42:04.479180 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479162 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 11:42:04.479180 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479165 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 11:42:04.479180 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479167 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 11:42:04.479180 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479170 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 11:42:04.479180 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479172 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 11:42:04.479180 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479175 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 11:42:04.479180 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479177 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 11:42:04.479180 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479181 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 11:42:04.479180 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479184 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 11:42:04.479657 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479187 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 11:42:04.479657 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479190 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 11:42:04.479657 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479193 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 11:42:04.479657 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479195 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 11:42:04.479657 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479198 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 11:42:04.479657 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479200 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 11:42:04.479657 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479203 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 11:42:04.479657 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479205 2564 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 11:42:04.479657 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479208 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 11:42:04.479657 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479211 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 11:42:04.479657 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479213 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 11:42:04.479657 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479217 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 11:42:04.479657 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479219 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 11:42:04.479657 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479221 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 11:42:04.479657 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479224 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 11:42:04.479657 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479227 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 11:42:04.479657 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479229 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 11:42:04.479657 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479232 2564 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 11:42:04.479657 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479234 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 11:42:04.479657 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479237 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 11:42:04.480169 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479239 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 11:42:04.480169 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479242 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 11:42:04.480169 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479244 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 11:42:04.480169 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479246 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 11:42:04.480169 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479249 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 11:42:04.480169 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479251 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 11:42:04.480169 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479254 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 11:42:04.480169 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479256 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 11:42:04.480169 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479259 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 11:42:04.480169 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479261 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 11:42:04.480169 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479264 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 11:42:04.480169 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479267 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 11:42:04.480169 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479270 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 11:42:04.480169 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479272 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 11:42:04.480169 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479276 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 11:42:04.480169 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479278 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 11:42:04.480169 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479281 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 11:42:04.480169 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479283 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 11:42:04.480169 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479286 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 11:42:04.480633 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479289 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 11:42:04.480633 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479291 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 11:42:04.480633 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479294 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 11:42:04.480633 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479296 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 11:42:04.480633 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479299 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 11:42:04.480633 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479301 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 11:42:04.480633 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479304 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 11:42:04.480633 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479306 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 11:42:04.480633 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479309 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 11:42:04.480633 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479312 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 11:42:04.480633 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479315 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 11:42:04.480633 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479317 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 11:42:04.480633 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479321 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 11:42:04.480633 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479325 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 11:42:04.480633 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479330 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 11:42:04.480633 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479333 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 11:42:04.480633 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479336 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 11:42:04.480633 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479339 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 11:42:04.480633 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479342 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 11:42:04.481121 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479345 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 11:42:04.481121 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479348 2564 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 11:42:04.481121 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479351 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 11:42:04.481121 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479354 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 11:42:04.481121 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479356 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 11:42:04.481121 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.479360 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 11:42:04.481121 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479423 2564 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 11:42:04.481121 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479430 2564 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 11:42:04.481121 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479438 2564 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 11:42:04.481121 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479442 2564 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 11:42:04.481121 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479447 2564 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 11:42:04.481121 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479451 2564 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 11:42:04.481121 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479456 2564 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 11:42:04.481121 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479460 2564 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 11:42:04.481121 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479464 2564 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 11:42:04.481121 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479467 2564 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 11:42:04.481121 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479470 2564 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 11:42:04.481121 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479474 2564 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 11:42:04.481121 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479477 2564 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 11:42:04.481121 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479480 2564 flags.go:64] FLAG: --cgroup-root=""
Apr 20 11:42:04.481121 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479483 2564 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 11:42:04.481121 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479487 2564 flags.go:64] FLAG: --client-ca-file=""
Apr 20 11:42:04.481121 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479490 2564 flags.go:64] FLAG: --cloud-config=""
Apr 20 11:42:04.481686 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479492 2564 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 11:42:04.481686 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479495 2564 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 11:42:04.481686 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479499 2564 flags.go:64] FLAG: --cluster-domain=""
Apr 20 11:42:04.481686 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479502 2564 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 11:42:04.481686 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479505 2564 flags.go:64] FLAG: --config-dir=""
Apr 20 11:42:04.481686 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479508 2564 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 11:42:04.481686 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479512 2564 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 11:42:04.481686 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479516 2564 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 11:42:04.481686 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479519 2564 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 11:42:04.481686 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479522 2564 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 11:42:04.481686 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479525 2564 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 11:42:04.481686 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479528 2564 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 11:42:04.481686 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479533 2564 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 11:42:04.481686 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479537 2564 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 11:42:04.481686 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479540 2564 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 11:42:04.481686 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479543 2564 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 11:42:04.481686 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479549 2564 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 11:42:04.481686 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479553 2564 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 11:42:04.481686 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479555 2564 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 11:42:04.481686 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479558 2564 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 11:42:04.481686 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479561 2564 flags.go:64] FLAG: --enable-server="true"
Apr 20 11:42:04.481686 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479564 2564 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 11:42:04.481686 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479569 2564 flags.go:64] FLAG: --event-burst="100"
Apr 20 11:42:04.481686 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479573 2564 flags.go:64] FLAG: --event-qps="50"
Apr 20 11:42:04.481686 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479576 2564 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 11:42:04.482312 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479579 2564 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 11:42:04.482312 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479582 2564 flags.go:64] FLAG: --eviction-hard=""
Apr 20 11:42:04.482312 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479586 2564 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 11:42:04.482312 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479589 2564 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 11:42:04.482312 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479593 2564 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 11:42:04.482312 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479596 2564 flags.go:64] FLAG: --eviction-soft=""
Apr 20 11:42:04.482312 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479599 2564 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 11:42:04.482312 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479602 2564 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 11:42:04.482312 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479605 2564 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 11:42:04.482312 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479608 2564 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 11:42:04.482312 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479611 2564 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 20 11:42:04.482312 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479614 2564 flags.go:64] FLAG: --fail-swap-on="true"
Apr 20 11:42:04.482312 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479616 2564 flags.go:64] FLAG: --feature-gates=""
Apr 20 11:42:04.482312 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479620 2564 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 20 11:42:04.482312 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479623 2564 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 20 11:42:04.482312 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479626 2564 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 20 11:42:04.482312 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479629 2564 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 20 11:42:04.482312 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479632 2564 flags.go:64] FLAG: --healthz-port="10248"
Apr 20 
11:42:04.482312 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479635 2564 flags.go:64] FLAG: --help="false" Apr 20 11:42:04.482312 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479640 2564 flags.go:64] FLAG: --hostname-override="ip-10-0-135-68.ec2.internal" Apr 20 11:42:04.482312 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479643 2564 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 11:42:04.482312 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479646 2564 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 11:42:04.482312 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479649 2564 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 11:42:04.482859 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479654 2564 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 11:42:04.482859 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479657 2564 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 11:42:04.482859 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479661 2564 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 11:42:04.482859 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479664 2564 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 11:42:04.482859 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479666 2564 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 11:42:04.482859 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479670 2564 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 11:42:04.482859 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479673 2564 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 11:42:04.482859 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479676 2564 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 11:42:04.482859 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479679 2564 
flags.go:64] FLAG: --kube-reserved="" Apr 20 11:42:04.482859 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479681 2564 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 11:42:04.482859 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479684 2564 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 11:42:04.482859 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479687 2564 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 11:42:04.482859 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479690 2564 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 11:42:04.482859 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479693 2564 flags.go:64] FLAG: --lock-file="" Apr 20 11:42:04.482859 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479697 2564 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 11:42:04.482859 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479700 2564 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 11:42:04.482859 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479702 2564 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 11:42:04.482859 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479708 2564 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 11:42:04.482859 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479711 2564 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 11:42:04.482859 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479713 2564 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 11:42:04.482859 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479716 2564 flags.go:64] FLAG: --logging-format="text" Apr 20 11:42:04.482859 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479719 2564 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 11:42:04.482859 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479722 2564 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 11:42:04.482859 ip-10-0-135-68 
kubenswrapper[2564]: I0420 11:42:04.479725 2564 flags.go:64] FLAG: --manifest-url="" Apr 20 11:42:04.483444 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479727 2564 flags.go:64] FLAG: --manifest-url-header="" Apr 20 11:42:04.483444 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479732 2564 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 11:42:04.483444 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479735 2564 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 11:42:04.483444 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479742 2564 flags.go:64] FLAG: --max-pods="110" Apr 20 11:42:04.483444 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479745 2564 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 11:42:04.483444 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479748 2564 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 11:42:04.483444 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479751 2564 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 11:42:04.483444 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479754 2564 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 11:42:04.483444 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479758 2564 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 11:42:04.483444 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479761 2564 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 11:42:04.483444 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479764 2564 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 11:42:04.483444 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479771 2564 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 11:42:04.483444 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479774 2564 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 11:42:04.483444 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479777 2564 flags.go:64] 
FLAG: --oom-score-adj="-999" Apr 20 11:42:04.483444 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479780 2564 flags.go:64] FLAG: --pod-cidr="" Apr 20 11:42:04.483444 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479783 2564 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 11:42:04.483444 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479789 2564 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 11:42:04.483444 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479792 2564 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 11:42:04.483444 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479795 2564 flags.go:64] FLAG: --pods-per-core="0" Apr 20 11:42:04.483444 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479798 2564 flags.go:64] FLAG: --port="10250" Apr 20 11:42:04.483444 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479801 2564 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 11:42:04.483444 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479804 2564 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e485d96b15009a2e" Apr 20 11:42:04.483444 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479807 2564 flags.go:64] FLAG: --qos-reserved="" Apr 20 11:42:04.483444 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479811 2564 flags.go:64] FLAG: --read-only-port="10255" Apr 20 11:42:04.484059 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479814 2564 flags.go:64] FLAG: --register-node="true" Apr 20 11:42:04.484059 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479817 2564 flags.go:64] FLAG: --register-schedulable="true" Apr 20 11:42:04.484059 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479820 2564 flags.go:64] FLAG: --register-with-taints="" Apr 20 11:42:04.484059 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479824 2564 flags.go:64] FLAG: --registry-burst="10" Apr 20 11:42:04.484059 
ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479826 2564 flags.go:64] FLAG: --registry-qps="5" Apr 20 11:42:04.484059 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479829 2564 flags.go:64] FLAG: --reserved-cpus="" Apr 20 11:42:04.484059 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479832 2564 flags.go:64] FLAG: --reserved-memory="" Apr 20 11:42:04.484059 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479836 2564 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 11:42:04.484059 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479839 2564 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 11:42:04.484059 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479842 2564 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 11:42:04.484059 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479844 2564 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 11:42:04.484059 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479847 2564 flags.go:64] FLAG: --runonce="false" Apr 20 11:42:04.484059 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479850 2564 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 11:42:04.484059 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479853 2564 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 11:42:04.484059 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479857 2564 flags.go:64] FLAG: --seccomp-default="false" Apr 20 11:42:04.484059 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479860 2564 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 11:42:04.484059 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479862 2564 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 11:42:04.484059 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479866 2564 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 11:42:04.484059 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479869 2564 flags.go:64] FLAG: 
--storage-driver-host="localhost:8086" Apr 20 11:42:04.484059 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479872 2564 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 11:42:04.484059 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479875 2564 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 11:42:04.484059 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479877 2564 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 11:42:04.484059 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479880 2564 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 11:42:04.484059 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479883 2564 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 11:42:04.484059 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479886 2564 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 11:42:04.484059 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479889 2564 flags.go:64] FLAG: --system-cgroups="" Apr 20 11:42:04.484694 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479892 2564 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 11:42:04.484694 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479897 2564 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 11:42:04.484694 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479901 2564 flags.go:64] FLAG: --tls-cert-file="" Apr 20 11:42:04.484694 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479903 2564 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 11:42:04.484694 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479908 2564 flags.go:64] FLAG: --tls-min-version="" Apr 20 11:42:04.484694 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479911 2564 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 11:42:04.484694 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479914 2564 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 11:42:04.484694 ip-10-0-135-68 kubenswrapper[2564]: 
I0420 11:42:04.479917 2564 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 11:42:04.484694 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479920 2564 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 11:42:04.484694 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479923 2564 flags.go:64] FLAG: --v="2" Apr 20 11:42:04.484694 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479927 2564 flags.go:64] FLAG: --version="false" Apr 20 11:42:04.484694 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479931 2564 flags.go:64] FLAG: --vmodule="" Apr 20 11:42:04.484694 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479935 2564 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 11:42:04.484694 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.479939 2564 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 11:42:04.484694 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480043 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 11:42:04.484694 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480047 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 11:42:04.484694 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480050 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 11:42:04.484694 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480053 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 11:42:04.484694 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480056 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 11:42:04.484694 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480059 2564 feature_gate.go:328] unrecognized feature gate: Example Apr 20 11:42:04.484694 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480062 2564 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 11:42:04.484694 ip-10-0-135-68 kubenswrapper[2564]: W0420 
11:42:04.480065 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 11:42:04.484694 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480068 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 11:42:04.485252 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480071 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 11:42:04.485252 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480073 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 11:42:04.485252 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480076 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 11:42:04.485252 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480078 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 11:42:04.485252 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480081 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 11:42:04.485252 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480083 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 11:42:04.485252 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480086 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 11:42:04.485252 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480088 2564 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 11:42:04.485252 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480091 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 11:42:04.485252 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480093 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 11:42:04.485252 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480096 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 11:42:04.485252 ip-10-0-135-68 kubenswrapper[2564]: W0420 
11:42:04.480099 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 11:42:04.485252 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480101 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 11:42:04.485252 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480104 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 11:42:04.485252 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480106 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 11:42:04.485252 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480109 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 11:42:04.485252 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480114 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 11:42:04.485252 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480116 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 11:42:04.485252 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480119 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 11:42:04.485252 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480122 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 11:42:04.485772 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480124 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 11:42:04.485772 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480127 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 11:42:04.485772 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480136 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 11:42:04.485772 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480139 2564 feature_gate.go:328] unrecognized feature 
gate: GCPCustomAPIEndpointsInstall Apr 20 11:42:04.485772 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480141 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 11:42:04.485772 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480144 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 11:42:04.485772 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480146 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 11:42:04.485772 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480149 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 11:42:04.485772 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480152 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 11:42:04.485772 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480154 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 11:42:04.485772 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480157 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 11:42:04.485772 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480159 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 11:42:04.485772 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480163 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 11:42:04.485772 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480166 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 11:42:04.485772 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480169 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 11:42:04.485772 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480172 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 11:42:04.485772 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480174 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 11:42:04.485772 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480177 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 11:42:04.485772 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480180 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 11:42:04.486484 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480183 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 11:42:04.486484 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480185 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 11:42:04.486484 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480187 2564 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 11:42:04.486484 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480190 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 11:42:04.486484 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480193 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 11:42:04.486484 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480195 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 11:42:04.486484 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480198 2564 feature_gate.go:328] 
unrecognized feature gate: AzureMultiDisk Apr 20 11:42:04.486484 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480201 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 11:42:04.486484 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480203 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 11:42:04.486484 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480210 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 11:42:04.486484 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480213 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 11:42:04.486484 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480216 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 11:42:04.486484 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480218 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 11:42:04.486484 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480221 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 11:42:04.486484 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480224 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 11:42:04.486484 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480226 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 11:42:04.486484 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480229 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 11:42:04.486484 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480231 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 11:42:04.486484 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480234 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 11:42:04.486484 ip-10-0-135-68 
kubenswrapper[2564]: W0420 11:42:04.480236 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 11:42:04.487219 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480239 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 11:42:04.487219 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480243 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 11:42:04.487219 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480245 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 11:42:04.487219 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480248 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 11:42:04.487219 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480251 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 11:42:04.487219 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480254 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 11:42:04.487219 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480256 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 11:42:04.487219 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480259 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 11:42:04.487219 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480261 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 11:42:04.487219 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480265 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 11:42:04.487219 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480269 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 11:42:04.487219 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480272 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 11:42:04.487219 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480275 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 11:42:04.487219 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480277 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 11:42:04.487219 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480280 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 11:42:04.487219 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480283 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 11:42:04.487219 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480285 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 11:42:04.487219 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.480288 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 11:42:04.487700 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.480876 2564 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 11:42:04.488383 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.488364 2564 server.go:530] "Kubelet version" 
kubeletVersion="v1.33.9" Apr 20 11:42:04.488425 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.488384 2564 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 20 11:42:04.488454 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488437 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 11:42:04.488454 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488443 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 11:42:04.488454 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488446 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 11:42:04.488454 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488449 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 11:42:04.488454 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488453 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 11:42:04.488454 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488455 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 11:42:04.488601 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488458 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 11:42:04.488601 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488462 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 11:42:04.488601 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488466 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 11:42:04.488601 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488469 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 11:42:04.488601 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488472 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 11:42:04.488601 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488474 2564 feature_gate.go:328] 
unrecognized feature gate: HighlyAvailableArbiter Apr 20 11:42:04.488601 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488477 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 11:42:04.488601 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488481 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 11:42:04.488601 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488486 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 11:42:04.488601 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488490 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 11:42:04.488601 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488492 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 11:42:04.488601 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488495 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 11:42:04.488601 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488498 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 11:42:04.488601 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488501 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 11:42:04.488601 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488506 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 11:42:04.488601 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488511 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 11:42:04.488601 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488515 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 11:42:04.488601 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488518 2564 feature_gate.go:328] unrecognized feature 
gate: DualReplica Apr 20 11:42:04.488601 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488521 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 11:42:04.489078 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488523 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 11:42:04.489078 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488526 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 11:42:04.489078 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488528 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 11:42:04.489078 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488531 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 11:42:04.489078 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488534 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 11:42:04.489078 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488536 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 11:42:04.489078 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488539 2564 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 11:42:04.489078 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488542 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 11:42:04.489078 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488544 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 11:42:04.489078 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488547 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 11:42:04.489078 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488549 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 11:42:04.489078 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488552 2564 
feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 11:42:04.489078 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488555 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 11:42:04.489078 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488557 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 11:42:04.489078 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488560 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 11:42:04.489078 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488563 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 11:42:04.489078 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488566 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 11:42:04.489078 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488568 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 11:42:04.489078 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488572 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 11:42:04.489078 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488575 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 11:42:04.489577 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488577 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 11:42:04.489577 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488581 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 11:42:04.489577 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488585 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 11:42:04.489577 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488590 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 11:42:04.489577 
ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488593 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 11:42:04.489577 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488596 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 11:42:04.489577 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488599 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 11:42:04.489577 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488601 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 11:42:04.489577 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488604 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 11:42:04.489577 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488606 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 11:42:04.489577 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488610 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 11:42:04.489577 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488612 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 11:42:04.489577 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488615 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 11:42:04.489577 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488617 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 11:42:04.489577 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488620 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 11:42:04.489577 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488622 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 11:42:04.489577 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488626 2564 
feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 11:42:04.489577 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488629 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 11:42:04.489577 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488631 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 11:42:04.490064 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488633 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 11:42:04.490064 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488636 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 11:42:04.490064 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488638 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 11:42:04.490064 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488641 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 11:42:04.490064 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488643 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 11:42:04.490064 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488647 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 11:42:04.490064 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488649 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 11:42:04.490064 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488653 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 11:42:04.490064 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488658 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 11:42:04.490064 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488663 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 11:42:04.490064 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488668 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 11:42:04.490064 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488671 2564 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 11:42:04.490064 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488675 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 11:42:04.490064 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488677 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 11:42:04.490064 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488680 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 11:42:04.490064 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488682 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 11:42:04.490064 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488685 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 11:42:04.490064 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488688 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 11:42:04.490064 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488690 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 11:42:04.490064 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488693 2564 feature_gate.go:328] unrecognized feature gate: Example Apr 20 11:42:04.490556 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488695 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall 
Apr 20 11:42:04.490556 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488698 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 11:42:04.490556 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.488703 2564 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 11:42:04.490556 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488824 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 11:42:04.490556 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488830 2564 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 11:42:04.490556 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488833 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 11:42:04.490556 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488836 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 11:42:04.490556 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488839 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 11:42:04.490556 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488841 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 11:42:04.490556 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488844 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 11:42:04.490556 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488847 2564 feature_gate.go:328] unrecognized feature gate: 
SigstoreImageVerificationPKI Apr 20 11:42:04.490556 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488849 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 11:42:04.490556 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488852 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 11:42:04.490556 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488854 2564 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 11:42:04.490556 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488857 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 11:42:04.490556 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488859 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 11:42:04.490951 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488862 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 11:42:04.490951 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488864 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 11:42:04.490951 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488867 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 11:42:04.490951 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488869 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 11:42:04.490951 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488872 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 11:42:04.490951 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488875 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 11:42:04.490951 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488878 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 11:42:04.490951 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488880 2564 
feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 11:42:04.490951 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488882 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 11:42:04.490951 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488886 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 11:42:04.490951 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488891 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 11:42:04.490951 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488895 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 11:42:04.490951 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488899 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 11:42:04.490951 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488902 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 11:42:04.490951 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488905 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 11:42:04.490951 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488907 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 11:42:04.490951 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488910 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 11:42:04.490951 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488913 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 11:42:04.490951 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488916 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 11:42:04.491425 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488918 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 
20 11:42:04.491425 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488921 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 11:42:04.491425 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488925 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 11:42:04.491425 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488929 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 11:42:04.491425 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488932 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 11:42:04.491425 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488934 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 11:42:04.491425 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488938 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 11:42:04.491425 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488940 2564 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 11:42:04.491425 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488943 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 11:42:04.491425 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488946 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 11:42:04.491425 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488948 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 11:42:04.491425 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488950 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 11:42:04.491425 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488953 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 11:42:04.491425 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488955 2564 feature_gate.go:328] 
unrecognized feature gate: PinnedImages Apr 20 11:42:04.491425 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488972 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 11:42:04.491425 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488977 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 11:42:04.491425 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488981 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 11:42:04.491425 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488983 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 11:42:04.491425 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488987 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 11:42:04.491877 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488991 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 11:42:04.491877 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488994 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 11:42:04.491877 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488996 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 11:42:04.491877 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.488999 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 11:42:04.491877 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.489003 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 11:42:04.491877 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.489006 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 11:42:04.491877 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.489009 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 11:42:04.491877 ip-10-0-135-68 
kubenswrapper[2564]: W0420 11:42:04.489011 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 11:42:04.491877 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.489014 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 11:42:04.491877 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.489017 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 11:42:04.491877 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.489020 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 11:42:04.491877 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.489022 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 11:42:04.491877 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.489026 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 11:42:04.491877 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.489029 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 11:42:04.491877 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.489031 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 11:42:04.491877 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.489034 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 11:42:04.491877 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.489037 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 11:42:04.491877 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.489039 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 11:42:04.491877 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.489042 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 11:42:04.491877 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.489045 2564 feature_gate.go:328] unrecognized feature gate: 
OpenShiftPodSecurityAdmission Apr 20 11:42:04.492458 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.489050 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 11:42:04.492458 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.489054 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 11:42:04.492458 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.489058 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 11:42:04.492458 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.489061 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 11:42:04.492458 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.489063 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 11:42:04.492458 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.489066 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 11:42:04.492458 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.489069 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 11:42:04.492458 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.489071 2564 feature_gate.go:328] unrecognized feature gate: Example Apr 20 11:42:04.492458 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.489073 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 11:42:04.492458 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.489076 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 11:42:04.492458 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.489078 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 11:42:04.492458 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.489081 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 11:42:04.492458 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.489084 2564 
feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 11:42:04.492458 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.489086 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 11:42:04.492458 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:04.489089 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 11:42:04.492818 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.489094 2564 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 11:42:04.492818 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.490039 2564 server.go:962] "Client rotation is on, will bootstrap in background" Apr 20 11:42:04.493331 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.493318 2564 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 20 11:42:04.494190 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.494178 2564 server.go:1019] "Starting client certificate rotation" Apr 20 11:42:04.494288 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.494272 2564 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 11:42:04.494321 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.494316 2564 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 11:42:04.516791 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.516774 2564 
dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 11:42:04.519449 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.519080 2564 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 11:42:04.536194 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.536163 2564 log.go:25] "Validated CRI v1 runtime API" Apr 20 11:42:04.542010 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.541993 2564 log.go:25] "Validated CRI v1 image API" Apr 20 11:42:04.543262 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.543243 2564 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 20 11:42:04.548298 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.548278 2564 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 862e1d6f-74a2-4052-907d-65cead1fcaac:/dev/nvme0n1p4 b4e90ccf-256e-4327-a8b8-b0141faab35f:/dev/nvme0n1p3] Apr 20 11:42:04.548364 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.548298 2564 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 20 11:42:04.551095 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.551079 2564 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 11:42:04.554320 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.554212 2564 manager.go:217] Machine: {Timestamp:2026-04-20 11:42:04.55235793 +0000 UTC 
m=+0.369620043 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3087284 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2c4d471dd403d7e799b8eae214623b SystemUUID:ec2c4d47-1dd4-03d7-e799-b8eae214623b BootID:779a5a4c-f8d0-4cbe-81ff-4524a146ded5 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:1c:b5:c6:dd:8f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:1c:b5:c6:dd:8f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:c6:ff:e2:43:48:3d Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 11:42:04.554320 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.554312 2564 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 11:42:04.554451 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.554393 2564 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 11:42:04.555306 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.555281 2564 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 11:42:04.555440 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.555308 2564 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-68.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 11:42:04.555482 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.555449 2564 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 11:42:04.555482 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.555456 2564 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 11:42:04.555482 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.555469 2564 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 11:42:04.556451 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.556441 2564 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 11:42:04.557372 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.557361 2564 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 11:42:04.557486 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.557477 2564 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 20 11:42:04.559476 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.559467 2564 kubelet.go:491] "Attempting to sync node with API server"
Apr 20 11:42:04.559513 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.559485 2564 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 20 11:42:04.559513 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.559496 2564 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 20 11:42:04.559513 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.559505 2564 kubelet.go:397] "Adding apiserver pod source"
Apr 20 11:42:04.559636 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.559516 2564 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 20 11:42:04.560487 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.560476 2564 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 11:42:04.560531 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.560496 2564 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 11:42:04.564040 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.564020 2564 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 20 11:42:04.565430 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.565418 2564 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 20 11:42:04.567288 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.567276 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 20 11:42:04.567343 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.567293 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 20 11:42:04.567343 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.567299 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 20 11:42:04.567343 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.567304 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 20 11:42:04.567343 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.567310 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 20 11:42:04.567343 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.567316 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 20 11:42:04.567343 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.567322 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 20 11:42:04.567343 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.567327 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 20 11:42:04.567343 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.567339 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 20 11:42:04.567343 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.567346 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 20 11:42:04.567570 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.567354 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 20 11:42:04.567570 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.567363 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 20 11:42:04.568185 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.568176 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 20 11:42:04.568185 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.568185 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 20 11:42:04.571492 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.571463 2564 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-lqndl"
Apr 20 11:42:04.571698 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.571687 2564 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 20 11:42:04.571758 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.571724 2564 server.go:1295] "Started kubelet"
Apr 20 11:42:04.571820 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.571784 2564 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 20 11:42:04.571906 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.571861 2564 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 20 11:42:04.571943 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.571930 2564 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 20 11:42:04.572533 ip-10-0-135-68 systemd[1]: Started Kubernetes Kubelet.
Apr 20 11:42:04.573130 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.572989 2564 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 20 11:42:04.573457 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:04.573379 2564 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 20 11:42:04.573457 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:04.573413 2564 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-68.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 20 11:42:04.573531 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.573493 2564 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-135-68.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 20 11:42:04.574367 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.574355 2564 server.go:317] "Adding debug handlers to kubelet server"
Apr 20 11:42:04.581042 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:04.577927 2564 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-68.ec2.internal.18a80de3b0bee9dc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-68.ec2.internal,UID:ip-10-0-135-68.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-135-68.ec2.internal,},FirstTimestamp:2026-04-20 11:42:04.571699676 +0000 UTC m=+0.388961788,LastTimestamp:2026-04-20 11:42:04.571699676 +0000 UTC m=+0.388961788,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-68.ec2.internal,}"
Apr 20 11:42:04.581243 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.581219 2564 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-lqndl"
Apr 20 11:42:04.582112 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:04.582089 2564 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 20 11:42:04.583602 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.583583 2564 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 20 11:42:04.583694 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.583626 2564 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 20 11:42:04.584617 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.584602 2564 factory.go:55] Registering systemd factory
Apr 20 11:42:04.584675 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.584628 2564 factory.go:223] Registration of the systemd container factory successfully
Apr 20 11:42:04.584770 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.584755 2564 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 20 11:42:04.584829 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:04.584790 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-68.ec2.internal\" not found"
Apr 20 11:42:04.584876 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.584828 2564 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 20 11:42:04.584876 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.584844 2564 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 20 11:42:04.584995 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.584917 2564 reconstruct.go:97] "Volume reconstruction finished"
Apr 20 11:42:04.584995 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.584927 2564 reconciler.go:26] "Reconciler: start to sync state"
Apr 20 11:42:04.585089 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.585066 2564 factory.go:153] Registering CRI-O factory
Apr 20 11:42:04.585089 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.585080 2564 factory.go:223] Registration of the crio container factory successfully
Apr 20 11:42:04.585179 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.585134 2564 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 20 11:42:04.585179 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.585159 2564 factory.go:103] Registering Raw factory
Apr 20 11:42:04.585179 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.585174 2564 manager.go:1196] Started watching for new ooms in manager
Apr 20 11:42:04.585801 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.585787 2564 manager.go:319] Starting recovery of all containers
Apr 20 11:42:04.586070 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.586053 2564 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 11:42:04.589083 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:04.589060 2564 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-135-68.ec2.internal\" not found" node="ip-10-0-135-68.ec2.internal"
Apr 20 11:42:04.594927 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.594781 2564 manager.go:324] Recovery completed
Apr 20 11:42:04.598729 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.598716 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 11:42:04.601378 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.601363 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-68.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 11:42:04.601438 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.601391 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-68.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 11:42:04.601438 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.601405 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-68.ec2.internal" event="NodeHasSufficientPID"
Apr 20 11:42:04.602276 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.602264 2564 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 20 11:42:04.602276 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.602275 2564 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 20 11:42:04.602349 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.602290 2564 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 11:42:04.605696 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.605684 2564 policy_none.go:49] "None policy: Start"
Apr 20 11:42:04.605732 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.605699 2564 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 20 11:42:04.605732 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.605709 2564 state_mem.go:35] "Initializing new in-memory state store"
Apr 20 11:42:04.648494 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.648480 2564 manager.go:341] "Starting Device Plugin manager"
Apr 20 11:42:04.668260 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:04.648521 2564 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 20 11:42:04.668260 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.648534 2564 server.go:85] "Starting device plugin registration server"
Apr 20 11:42:04.668260 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.648809 2564 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 20 11:42:04.668260 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.648824 2564 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 20 11:42:04.668260 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.649072 2564 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 20 11:42:04.668260 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.649167 2564 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 20 11:42:04.668260 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.649175 2564 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 20 11:42:04.668260 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:04.649656 2564 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 20 11:42:04.668260 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:04.649689 2564 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-68.ec2.internal\" not found"
Apr 20 11:42:04.703447 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.703424 2564 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 20 11:42:04.704569 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.704551 2564 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 20 11:42:04.704638 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.704573 2564 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 20 11:42:04.704638 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.704596 2564 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 20 11:42:04.704638 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.704602 2564 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 20 11:42:04.704638 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:04.704631 2564 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 20 11:42:04.707776 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.707757 2564 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 11:42:04.749519 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.749479 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 11:42:04.750255 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.750242 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-68.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 11:42:04.750312 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.750273 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-68.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 11:42:04.750312 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.750287 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-68.ec2.internal" event="NodeHasSufficientPID"
Apr 20 11:42:04.750312 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.750309 2564 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-68.ec2.internal"
Apr 20 11:42:04.758188 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.758175 2564 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-68.ec2.internal"
Apr 20 11:42:04.758235 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:04.758196 2564 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-68.ec2.internal\": node \"ip-10-0-135-68.ec2.internal\" not found"
Apr 20 11:42:04.772969 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:04.772948 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-68.ec2.internal\" not found"
Apr 20 11:42:04.805206 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.805170 2564 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-68.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-135-68.ec2.internal"]
Apr 20 11:42:04.805281 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.805255 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 11:42:04.805976 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.805948 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-68.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 11:42:04.806054 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.805994 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-68.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 11:42:04.806054 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.806011 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-68.ec2.internal" event="NodeHasSufficientPID"
Apr 20 11:42:04.808359 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.808344 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 11:42:04.808490 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.808476 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-68.ec2.internal"
Apr 20 11:42:04.808541 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.808504 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 11:42:04.808992 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.808978 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-68.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 11:42:04.809073 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.808997 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-68.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 11:42:04.809073 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.809005 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-68.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 11:42:04.809073 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.809017 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-68.ec2.internal" event="NodeHasSufficientPID"
Apr 20 11:42:04.809073 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.809021 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-68.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 11:42:04.809073 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.809033 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-68.ec2.internal" event="NodeHasSufficientPID"
Apr 20 11:42:04.811286 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.811269 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-68.ec2.internal"
Apr 20 11:42:04.811358 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.811303 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 11:42:04.812560 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.812545 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-68.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 11:42:04.812609 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.812573 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-68.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 11:42:04.812609 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.812583 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-68.ec2.internal" event="NodeHasSufficientPID"
Apr 20 11:42:04.831502 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:04.831485 2564 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-68.ec2.internal\" not found" node="ip-10-0-135-68.ec2.internal"
Apr 20 11:42:04.835707 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:04.835691 2564 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-68.ec2.internal\" not found" node="ip-10-0-135-68.ec2.internal"
Apr 20 11:42:04.873061 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:04.873032 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-68.ec2.internal\" not found"
Apr 20 11:42:04.886602 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.886585 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/308db96245908b799112e2b3fae81b6e-config\") pod \"kube-apiserver-proxy-ip-10-0-135-68.ec2.internal\" (UID: \"308db96245908b799112e2b3fae81b6e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-68.ec2.internal"
Apr 20 11:42:04.886670 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.886606 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7e0a247a66f4fbd6a4f2672923dfb347-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-68.ec2.internal\" (UID: \"7e0a247a66f4fbd6a4f2672923dfb347\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-68.ec2.internal"
Apr 20 11:42:04.886670 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.886625 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e0a247a66f4fbd6a4f2672923dfb347-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-68.ec2.internal\" (UID: \"7e0a247a66f4fbd6a4f2672923dfb347\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-68.ec2.internal"
Apr 20 11:42:04.973794 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:04.973774 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-68.ec2.internal\" not found"
Apr 20 11:42:04.987179 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.987161 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/308db96245908b799112e2b3fae81b6e-config\") pod \"kube-apiserver-proxy-ip-10-0-135-68.ec2.internal\" (UID: \"308db96245908b799112e2b3fae81b6e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-68.ec2.internal"
Apr 20 11:42:04.987250 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.987185 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7e0a247a66f4fbd6a4f2672923dfb347-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-68.ec2.internal\" (UID: \"7e0a247a66f4fbd6a4f2672923dfb347\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-68.ec2.internal"
Apr 20 11:42:04.987250 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.987201 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e0a247a66f4fbd6a4f2672923dfb347-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-68.ec2.internal\" (UID: \"7e0a247a66f4fbd6a4f2672923dfb347\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-68.ec2.internal"
Apr 20 11:42:04.987250 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.987226 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e0a247a66f4fbd6a4f2672923dfb347-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-68.ec2.internal\" (UID: \"7e0a247a66f4fbd6a4f2672923dfb347\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-68.ec2.internal"
Apr 20 11:42:04.987359 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.987271 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7e0a247a66f4fbd6a4f2672923dfb347-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-68.ec2.internal\" (UID: \"7e0a247a66f4fbd6a4f2672923dfb347\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-68.ec2.internal"
Apr 20 11:42:04.987359 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:04.987269 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/308db96245908b799112e2b3fae81b6e-config\") pod \"kube-apiserver-proxy-ip-10-0-135-68.ec2.internal\" (UID: \"308db96245908b799112e2b3fae81b6e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-68.ec2.internal"
Apr 20 11:42:05.074551 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:05.074490 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-68.ec2.internal\" not found"
Apr 20 11:42:05.133054 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:05.133022 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-68.ec2.internal"
Apr 20 11:42:05.138629 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:05.138612 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-68.ec2.internal"
Apr 20 11:42:05.175386 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:05.175366 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-68.ec2.internal\" not found"
Apr 20 11:42:05.275977 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:05.275926 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-68.ec2.internal\" not found"
Apr 20 11:42:05.376545 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:05.376475 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-68.ec2.internal\" not found"
Apr 20 11:42:05.477205 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:05.477169 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-68.ec2.internal\" not found"
Apr 20 11:42:05.494701 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:05.494679 2564 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 20 11:42:05.494864 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:05.494839 2564 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 11:42:05.494927 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:05.494845 2564 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 11:42:05.577402 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:05.577373 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-68.ec2.internal\" not found"
Apr 20 11:42:05.584124 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:05.584095 2564 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 11:37:04 +0000 UTC" deadline="2027-12-18 10:20:38.879218411 +0000 UTC"
Apr 20 11:42:05.584124 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:05.584118 2564 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14566h38m33.295103031s"
Apr 20 11:42:05.584311 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:05.584153 2564 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 20 11:42:05.593924 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:05.593889 2564 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 11:42:05.620242 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:05.620199 2564 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-cwbwh"
Apr 20 11:42:05.629090 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:05.628997 2564 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-cwbwh"
Apr 20 11:42:05.677685 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:05.677654 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-68.ec2.internal\" not found"
Apr 20 11:42:05.728106 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:05.728056 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod308db96245908b799112e2b3fae81b6e.slice/crio-3dd27a45c31e75e0fb290990a96da547ec70ae5105f32edbee4a794cef2fbbcc WatchSource:0}: Error finding container 3dd27a45c31e75e0fb290990a96da547ec70ae5105f32edbee4a794cef2fbbcc: Status 404 returned error can't find the container with id 3dd27a45c31e75e0fb290990a96da547ec70ae5105f32edbee4a794cef2fbbcc
Apr 20 11:42:05.728679 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:05.728654 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e0a247a66f4fbd6a4f2672923dfb347.slice/crio-8c9168719ab6ac64f6c32f1f28aa02cb8481c03a300e4f92bede2867e3235b88 WatchSource:0}: Error finding container 8c9168719ab6ac64f6c32f1f28aa02cb8481c03a300e4f92bede2867e3235b88: Status 404 returned error can't find the container with id 8c9168719ab6ac64f6c32f1f28aa02cb8481c03a300e4f92bede2867e3235b88
Apr 20 11:42:05.734026 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:05.734006 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 11:42:05.777994 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:05.777943 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-68.ec2.internal\" not found"
Apr 20 11:42:05.878540 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:05.878502 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-68.ec2.internal\" not found"
Apr 20 11:42:05.979105 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:05.979078 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-68.ec2.internal\" not found"
Apr 20 11:42:06.050499 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.050464 2564 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 11:42:06.079185 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.079151 2564 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 11:42:06.084484 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.084455 2564 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-68.ec2.internal"
Apr 20 11:42:06.096875 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.096846 2564 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 11:42:06.097740 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.097721 2564 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-68.ec2.internal"
Apr 20 11:42:06.106482 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.106446 2564 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 11:42:06.444684 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.444603 2564 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 11:42:06.560405 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.560378 2564 apiserver.go:52] "Watching apiserver"
Apr 20 11:42:06.568653 ip-10-0-135-68 
kubenswrapper[2564]: I0420 11:42:06.568627 2564 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 11:42:06.570503 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.570476 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-494p8","openshift-multus/multus-clqlc","openshift-network-diagnostics/network-check-target-dtzrg","kube-system/konnectivity-agent-vs4h8","openshift-image-registry/node-ca-rr7qt","openshift-multus/network-metrics-daemon-2x94v","openshift-network-operator/iptables-alerter-kh22r","openshift-ovn-kubernetes/ovnkube-node-h966s","kube-system/kube-apiserver-proxy-ip-10-0-135-68.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-flxvk","openshift-cluster-node-tuning-operator/tuned-pjsl6","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-68.ec2.internal"] Apr 20 11:42:06.573084 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.573058 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-494p8" Apr 20 11:42:06.575214 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.575194 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.576532 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.576166 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-hzhl5\"" Apr 20 11:42:06.576532 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.576002 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 11:42:06.576678 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.576635 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 11:42:06.576730 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.576681 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 11:42:06.576883 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.576864 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 11:42:06.577050 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.577027 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 11:42:06.578440 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.577668 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dtzrg" Apr 20 11:42:06.578440 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:06.577748 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dtzrg" podUID="5ba5cc41-47d4-412c-9ba2-40b85e919af7" Apr 20 11:42:06.578440 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.578089 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 11:42:06.579581 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.579057 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-xnlkj\"" Apr 20 11:42:06.581054 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.580882 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vs4h8" Apr 20 11:42:06.583201 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.583181 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rr7qt" Apr 20 11:42:06.583335 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.583286 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 11:42:06.583627 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.583611 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-s669b\"" Apr 20 11:42:06.583722 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.583708 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 11:42:06.585475 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.585458 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x94v" Apr 20 11:42:06.585562 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:06.585526 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x94v" podUID="e519aca4-e5e7-4760-91b9-8488165051df" Apr 20 11:42:06.585789 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.585773 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 11:42:06.585923 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.585892 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 11:42:06.585923 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.585911 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-ttmbb\"" Apr 20 11:42:06.586570 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.586552 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 11:42:06.587881 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.587853 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-kh22r" Apr 20 11:42:06.590315 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.590294 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-gdngf\"" Apr 20 11:42:06.590581 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.590563 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 11:42:06.590674 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.590566 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 11:42:06.590674 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.590607 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 11:42:06.592643 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.592622 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.592729 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.592666 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-flxvk" Apr 20 11:42:06.595050 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.595012 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 11:42:06.595145 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.595121 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.596658 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.596611 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-host-var-lib-cni-bin\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.596739 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.596698 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 11:42:06.596739 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.596674 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-hostroot\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.596824 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.596746 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c72s5\" (UniqueName: \"kubernetes.io/projected/5ba5cc41-47d4-412c-9ba2-40b85e919af7-kube-api-access-c72s5\") pod \"network-check-target-dtzrg\" (UID: \"5ba5cc41-47d4-412c-9ba2-40b85e919af7\") " pod="openshift-network-diagnostics/network-check-target-dtzrg" Apr 20 11:42:06.596824 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.596782 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e519aca4-e5e7-4760-91b9-8488165051df-metrics-certs\") pod \"network-metrics-daemon-2x94v\" (UID: \"e519aca4-e5e7-4760-91b9-8488165051df\") " pod="openshift-multus/network-metrics-daemon-2x94v" Apr 
20 11:42:06.596824 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.596820 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 11:42:06.596935 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.596825 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm9mx\" (UniqueName: \"kubernetes.io/projected/e519aca4-e5e7-4760-91b9-8488165051df-kube-api-access-nm9mx\") pod \"network-metrics-daemon-2x94v\" (UID: \"e519aca4-e5e7-4760-91b9-8488165051df\") " pod="openshift-multus/network-metrics-daemon-2x94v" Apr 20 11:42:06.596935 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.596848 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-etc-kubernetes\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.596935 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.596883 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7c5f2fae-93fd-4644-bbf1-784c37431e5c-agent-certs\") pod \"konnectivity-agent-vs4h8\" (UID: \"7c5f2fae-93fd-4644-bbf1-784c37431e5c\") " pod="kube-system/konnectivity-agent-vs4h8" Apr 20 11:42:06.596935 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.596904 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7c5f2fae-93fd-4644-bbf1-784c37431e5c-konnectivity-ca\") pod \"konnectivity-agent-vs4h8\" (UID: \"7c5f2fae-93fd-4644-bbf1-784c37431e5c\") " pod="kube-system/konnectivity-agent-vs4h8" Apr 20 11:42:06.597139 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.596948 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqgrm\" (UniqueName: \"kubernetes.io/projected/62b2a4c2-1ab3-4cf8-8d8d-10675c84e9d4-kube-api-access-zqgrm\") pod \"node-ca-rr7qt\" (UID: \"62b2a4c2-1ab3-4cf8-8d8d-10675c84e9d4\") " pod="openshift-image-registry/node-ca-rr7qt" Apr 20 11:42:06.597139 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.597009 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/17e25242-c8c6-4d80-a45e-22b47b30d8f0-cnibin\") pod \"multus-additional-cni-plugins-494p8\" (UID: \"17e25242-c8c6-4d80-a45e-22b47b30d8f0\") " pod="openshift-multus/multus-additional-cni-plugins-494p8" Apr 20 11:42:06.597139 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.597048 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-cni-binary-copy\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.597139 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.597091 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/17e25242-c8c6-4d80-a45e-22b47b30d8f0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-494p8\" (UID: \"17e25242-c8c6-4d80-a45e-22b47b30d8f0\") " pod="openshift-multus/multus-additional-cni-plugins-494p8" Apr 20 11:42:06.597381 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.597139 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/17e25242-c8c6-4d80-a45e-22b47b30d8f0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-494p8\" (UID: 
\"17e25242-c8c6-4d80-a45e-22b47b30d8f0\") " pod="openshift-multus/multus-additional-cni-plugins-494p8" Apr 20 11:42:06.597381 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.597196 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-multus-cni-dir\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.597381 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.597222 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-cnibin\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.597381 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.597240 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-os-release\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.597381 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.597282 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-multus-daemon-config\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.597381 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.597313 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/62b2a4c2-1ab3-4cf8-8d8d-10675c84e9d4-host\") pod \"node-ca-rr7qt\" (UID: 
\"62b2a4c2-1ab3-4cf8-8d8d-10675c84e9d4\") " pod="openshift-image-registry/node-ca-rr7qt" Apr 20 11:42:06.597381 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.597336 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/17e25242-c8c6-4d80-a45e-22b47b30d8f0-system-cni-dir\") pod \"multus-additional-cni-plugins-494p8\" (UID: \"17e25242-c8c6-4d80-a45e-22b47b30d8f0\") " pod="openshift-multus/multus-additional-cni-plugins-494p8" Apr 20 11:42:06.597381 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.597371 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/17e25242-c8c6-4d80-a45e-22b47b30d8f0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-494p8\" (UID: \"17e25242-c8c6-4d80-a45e-22b47b30d8f0\") " pod="openshift-multus/multus-additional-cni-plugins-494p8" Apr 20 11:42:06.597694 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.597397 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pl2j\" (UniqueName: \"kubernetes.io/projected/17e25242-c8c6-4d80-a45e-22b47b30d8f0-kube-api-access-4pl2j\") pod \"multus-additional-cni-plugins-494p8\" (UID: \"17e25242-c8c6-4d80-a45e-22b47b30d8f0\") " pod="openshift-multus/multus-additional-cni-plugins-494p8" Apr 20 11:42:06.597694 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.597415 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-host-var-lib-cni-multus\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.597694 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.597437 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/17e25242-c8c6-4d80-a45e-22b47b30d8f0-cni-binary-copy\") pod \"multus-additional-cni-plugins-494p8\" (UID: \"17e25242-c8c6-4d80-a45e-22b47b30d8f0\") " pod="openshift-multus/multus-additional-cni-plugins-494p8" Apr 20 11:42:06.597694 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.597471 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-multus-socket-dir-parent\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.597694 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.597541 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-host-run-k8s-cni-cncf-io\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.597694 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.597566 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-host-var-lib-kubelet\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.597694 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.597586 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r678\" (UniqueName: \"kubernetes.io/projected/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-kube-api-access-5r678\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " 
pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.597694 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.597633 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/62b2a4c2-1ab3-4cf8-8d8d-10675c84e9d4-serviceca\") pod \"node-ca-rr7qt\" (UID: \"62b2a4c2-1ab3-4cf8-8d8d-10675c84e9d4\") " pod="openshift-image-registry/node-ca-rr7qt" Apr 20 11:42:06.597694 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.597645 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-sgnmv\"" Apr 20 11:42:06.597694 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.597670 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-system-cni-dir\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.597694 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.597694 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-host-run-netns\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.598160 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.597711 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-multus-conf-dir\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.598160 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.597737 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-host-run-multus-certs\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.598160 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.597782 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/17e25242-c8c6-4d80-a45e-22b47b30d8f0-os-release\") pod \"multus-additional-cni-plugins-494p8\" (UID: \"17e25242-c8c6-4d80-a45e-22b47b30d8f0\") " pod="openshift-multus/multus-additional-cni-plugins-494p8" Apr 20 11:42:06.598160 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.598122 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 11:42:06.599168 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.598858 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 11:42:06.599600 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.599581 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 11:42:06.599692 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.599668 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 11:42:06.599754 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.599694 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 11:42:06.599885 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.599850 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 11:42:06.599885 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.599862 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-z5tr9\"" Apr 20 11:42:06.600416 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.600276 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 11:42:06.600416 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.600299 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-gbzm5\"" Apr 20 11:42:06.602582 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.602563 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 11:42:06.629650 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.629606 2564 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 11:37:05 +0000 UTC" deadline="2028-01-12 02:45:57.848233808 +0000 UTC" Apr 20 11:42:06.629650 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.629649 2564 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15159h3m51.218589582s" Apr 20 11:42:06.685560 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.685525 2564 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 11:42:06.698119 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.698030 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-tmp\") pod \"tuned-pjsl6\" (UID: 
\"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.698119 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.698067 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/62b2a4c2-1ab3-4cf8-8d8d-10675c84e9d4-serviceca\") pod \"node-ca-rr7qt\" (UID: \"62b2a4c2-1ab3-4cf8-8d8d-10675c84e9d4\") " pod="openshift-image-registry/node-ca-rr7qt" Apr 20 11:42:06.698119 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.698097 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-system-cni-dir\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.698363 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.698123 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-host-run-netns\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.698363 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.698147 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-multus-conf-dir\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.698363 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.698179 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-host-run-netns\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.698363 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.698203 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-sys\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.698363 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.698200 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-system-cni-dir\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.698363 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.698235 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-host-run-netns\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.698363 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.698248 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/17e25242-c8c6-4d80-a45e-22b47b30d8f0-os-release\") pod \"multus-additional-cni-plugins-494p8\" (UID: \"17e25242-c8c6-4d80-a45e-22b47b30d8f0\") " pod="openshift-multus/multus-additional-cni-plugins-494p8" Apr 20 11:42:06.698363 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.698246 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-multus-conf-dir\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 
20 11:42:06.698363 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.698294 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-node-log\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.698363 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.698297 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/17e25242-c8c6-4d80-a45e-22b47b30d8f0-os-release\") pod \"multus-additional-cni-plugins-494p8\" (UID: \"17e25242-c8c6-4d80-a45e-22b47b30d8f0\") " pod="openshift-multus/multus-additional-cni-plugins-494p8" Apr 20 11:42:06.698363 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.698328 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-hostroot\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.698363 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.698361 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-hostroot\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.698798 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.698373 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fb6400e5-c705-43eb-867b-42fa8fc8a750-ovnkube-config\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.698798 
ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.698406 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b0190d50-a6ca-42f6-b113-2acc79ed7c3e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-flxvk\" (UID: \"b0190d50-a6ca-42f6-b113-2acc79ed7c3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-flxvk" Apr 20 11:42:06.698798 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.698444 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b0190d50-a6ca-42f6-b113-2acc79ed7c3e-socket-dir\") pod \"aws-ebs-csi-driver-node-flxvk\" (UID: \"b0190d50-a6ca-42f6-b113-2acc79ed7c3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-flxvk" Apr 20 11:42:06.698798 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.698474 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-var-lib-kubelet\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.698798 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.698499 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-log-socket\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.698798 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.698529 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e519aca4-e5e7-4760-91b9-8488165051df-metrics-certs\") pod 
\"network-metrics-daemon-2x94v\" (UID: \"e519aca4-e5e7-4760-91b9-8488165051df\") " pod="openshift-multus/network-metrics-daemon-2x94v" Apr 20 11:42:06.698798 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.698556 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nm9mx\" (UniqueName: \"kubernetes.io/projected/e519aca4-e5e7-4760-91b9-8488165051df-kube-api-access-nm9mx\") pod \"network-metrics-daemon-2x94v\" (UID: \"e519aca4-e5e7-4760-91b9-8488165051df\") " pod="openshift-multus/network-metrics-daemon-2x94v" Apr 20 11:42:06.698798 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.698581 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7c5f2fae-93fd-4644-bbf1-784c37431e5c-konnectivity-ca\") pod \"konnectivity-agent-vs4h8\" (UID: \"7c5f2fae-93fd-4644-bbf1-784c37431e5c\") " pod="kube-system/konnectivity-agent-vs4h8" Apr 20 11:42:06.698798 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.698608 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-etc-openvswitch\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.698798 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.698622 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/62b2a4c2-1ab3-4cf8-8d8d-10675c84e9d4-serviceca\") pod \"node-ca-rr7qt\" (UID: \"62b2a4c2-1ab3-4cf8-8d8d-10675c84e9d4\") " pod="openshift-image-registry/node-ca-rr7qt" Apr 20 11:42:06.698798 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.698644 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-run-ovn\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.698798 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.698686 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-etc-sysctl-d\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.699273 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:06.698849 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 11:42:06.699273 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:06.699005 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e519aca4-e5e7-4760-91b9-8488165051df-metrics-certs podName:e519aca4-e5e7-4760-91b9-8488165051df nodeName:}" failed. No retries permitted until 2026-04-20 11:42:07.198956236 +0000 UTC m=+3.016218343 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e519aca4-e5e7-4760-91b9-8488165051df-metrics-certs") pod "network-metrics-daemon-2x94v" (UID: "e519aca4-e5e7-4760-91b9-8488165051df") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 11:42:06.699273 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.699113 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwvz8\" (UniqueName: \"kubernetes.io/projected/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-kube-api-access-fwvz8\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.699273 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.699152 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7c5f2fae-93fd-4644-bbf1-784c37431e5c-konnectivity-ca\") pod \"konnectivity-agent-vs4h8\" (UID: \"7c5f2fae-93fd-4644-bbf1-784c37431e5c\") " pod="kube-system/konnectivity-agent-vs4h8" Apr 20 11:42:06.699273 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.699169 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-host-cni-netd\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.699273 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.699194 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a66e511e-e6c2-4ef0-b401-588d90851a5e-host-slash\") pod \"iptables-alerter-kh22r\" (UID: \"a66e511e-e6c2-4ef0-b401-588d90851a5e\") " pod="openshift-network-operator/iptables-alerter-kh22r" Apr 20 11:42:06.699273 
ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.699225 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/17e25242-c8c6-4d80-a45e-22b47b30d8f0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-494p8\" (UID: \"17e25242-c8c6-4d80-a45e-22b47b30d8f0\") " pod="openshift-multus/multus-additional-cni-plugins-494p8" Apr 20 11:42:06.699273 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.699271 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-multus-cni-dir\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.699658 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.699336 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-multus-cni-dir\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.699658 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.699337 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-cnibin\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.699658 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.699371 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-os-release\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.699658 ip-10-0-135-68 kubenswrapper[2564]: I0420 
11:42:06.699392 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-multus-daemon-config\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.699658 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.699398 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-cnibin\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.699658 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.699409 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b0190d50-a6ca-42f6-b113-2acc79ed7c3e-etc-selinux\") pod \"aws-ebs-csi-driver-node-flxvk\" (UID: \"b0190d50-a6ca-42f6-b113-2acc79ed7c3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-flxvk" Apr 20 11:42:06.699658 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.699436 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-etc-systemd\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.699658 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.699466 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-run\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.699658 ip-10-0-135-68 
kubenswrapper[2564]: I0420 11:42:06.699495 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/17e25242-c8c6-4d80-a45e-22b47b30d8f0-system-cni-dir\") pod \"multus-additional-cni-plugins-494p8\" (UID: \"17e25242-c8c6-4d80-a45e-22b47b30d8f0\") " pod="openshift-multus/multus-additional-cni-plugins-494p8" Apr 20 11:42:06.699658 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.699523 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4pl2j\" (UniqueName: \"kubernetes.io/projected/17e25242-c8c6-4d80-a45e-22b47b30d8f0-kube-api-access-4pl2j\") pod \"multus-additional-cni-plugins-494p8\" (UID: \"17e25242-c8c6-4d80-a45e-22b47b30d8f0\") " pod="openshift-multus/multus-additional-cni-plugins-494p8" Apr 20 11:42:06.699658 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.699551 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-run-openvswitch\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.699658 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.699575 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-host-cni-bin\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.699658 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.699600 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fb6400e5-c705-43eb-867b-42fa8fc8a750-env-overrides\") pod \"ovnkube-node-h966s\" (UID: 
\"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.699658 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.699623 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b0190d50-a6ca-42f6-b113-2acc79ed7c3e-sys-fs\") pod \"aws-ebs-csi-driver-node-flxvk\" (UID: \"b0190d50-a6ca-42f6-b113-2acc79ed7c3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-flxvk" Apr 20 11:42:06.699658 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.699637 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/17e25242-c8c6-4d80-a45e-22b47b30d8f0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-494p8\" (UID: \"17e25242-c8c6-4d80-a45e-22b47b30d8f0\") " pod="openshift-multus/multus-additional-cni-plugins-494p8" Apr 20 11:42:06.699658 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.699647 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4vv8\" (UniqueName: \"kubernetes.io/projected/b0190d50-a6ca-42f6-b113-2acc79ed7c3e-kube-api-access-l4vv8\") pod \"aws-ebs-csi-driver-node-flxvk\" (UID: \"b0190d50-a6ca-42f6-b113-2acc79ed7c3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-flxvk" Apr 20 11:42:06.700319 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.699679 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-etc-sysctl-conf\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.700319 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.699710 2564 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-multus-socket-dir-parent\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.700319 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.699737 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-host-kubelet\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.700319 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.699765 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-var-lib-openvswitch\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.700319 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.699795 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.700319 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.699820 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-etc-tuned\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " 
pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.700319 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.699847 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-host-run-multus-certs\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.700319 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.699850 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-multus-daemon-config\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.700319 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.699859 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-os-release\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.700319 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.699893 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/17e25242-c8c6-4d80-a45e-22b47b30d8f0-system-cni-dir\") pod \"multus-additional-cni-plugins-494p8\" (UID: \"17e25242-c8c6-4d80-a45e-22b47b30d8f0\") " pod="openshift-multus/multus-additional-cni-plugins-494p8" Apr 20 11:42:06.700319 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.699911 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-lib-modules\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " 
pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.700319 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.699948 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-host-run-multus-certs\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.700319 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.700009 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-host-var-lib-cni-bin\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.700319 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.700010 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-multus-socket-dir-parent\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.700319 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.700063 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c72s5\" (UniqueName: \"kubernetes.io/projected/5ba5cc41-47d4-412c-9ba2-40b85e919af7-kube-api-access-c72s5\") pod \"network-check-target-dtzrg\" (UID: \"5ba5cc41-47d4-412c-9ba2-40b85e919af7\") " pod="openshift-network-diagnostics/network-check-target-dtzrg" Apr 20 11:42:06.700319 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.700109 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-host-var-lib-cni-bin\") pod \"multus-clqlc\" (UID: 
\"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.700319 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.700115 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-run-systemd\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.700929 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.700240 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgjcf\" (UniqueName: \"kubernetes.io/projected/fb6400e5-c705-43eb-867b-42fa8fc8a750-kube-api-access-mgjcf\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.700929 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.700277 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b0190d50-a6ca-42f6-b113-2acc79ed7c3e-registration-dir\") pod \"aws-ebs-csi-driver-node-flxvk\" (UID: \"b0190d50-a6ca-42f6-b113-2acc79ed7c3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-flxvk" Apr 20 11:42:06.700929 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.700301 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-host\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.700929 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.700329 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-host-run-ovn-kubernetes\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.700929 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.700359 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-etc-kubernetes\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.700929 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.700378 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7c5f2fae-93fd-4644-bbf1-784c37431e5c-agent-certs\") pod \"konnectivity-agent-vs4h8\" (UID: \"7c5f2fae-93fd-4644-bbf1-784c37431e5c\") " pod="kube-system/konnectivity-agent-vs4h8" Apr 20 11:42:06.700929 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.700424 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-systemd-units\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.700929 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.700431 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-etc-kubernetes\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.700929 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.700456 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zhxhm\" (UniqueName: \"kubernetes.io/projected/a66e511e-e6c2-4ef0-b401-588d90851a5e-kube-api-access-zhxhm\") pod \"iptables-alerter-kh22r\" (UID: \"a66e511e-e6c2-4ef0-b401-588d90851a5e\") " pod="openshift-network-operator/iptables-alerter-kh22r" Apr 20 11:42:06.700929 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.700499 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-etc-modprobe-d\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.700929 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.700562 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zqgrm\" (UniqueName: \"kubernetes.io/projected/62b2a4c2-1ab3-4cf8-8d8d-10675c84e9d4-kube-api-access-zqgrm\") pod \"node-ca-rr7qt\" (UID: \"62b2a4c2-1ab3-4cf8-8d8d-10675c84e9d4\") " pod="openshift-image-registry/node-ca-rr7qt" Apr 20 11:42:06.700929 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.700602 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/17e25242-c8c6-4d80-a45e-22b47b30d8f0-cnibin\") pod \"multus-additional-cni-plugins-494p8\" (UID: \"17e25242-c8c6-4d80-a45e-22b47b30d8f0\") " pod="openshift-multus/multus-additional-cni-plugins-494p8" Apr 20 11:42:06.700929 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.700625 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-cni-binary-copy\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.700929 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.700657 2564 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/17e25242-c8c6-4d80-a45e-22b47b30d8f0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-494p8\" (UID: \"17e25242-c8c6-4d80-a45e-22b47b30d8f0\") " pod="openshift-multus/multus-additional-cni-plugins-494p8" Apr 20 11:42:06.700929 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.700732 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-host-slash\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.700929 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.700744 2564 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 11:42:06.700929 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.700762 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/62b2a4c2-1ab3-4cf8-8d8d-10675c84e9d4-host\") pod \"node-ca-rr7qt\" (UID: \"62b2a4c2-1ab3-4cf8-8d8d-10675c84e9d4\") " pod="openshift-image-registry/node-ca-rr7qt" Apr 20 11:42:06.701608 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.700792 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/17e25242-c8c6-4d80-a45e-22b47b30d8f0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-494p8\" (UID: \"17e25242-c8c6-4d80-a45e-22b47b30d8f0\") " pod="openshift-multus/multus-additional-cni-plugins-494p8" Apr 20 11:42:06.701608 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.700845 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-host-var-lib-cni-multus\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.701608 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.700851 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/62b2a4c2-1ab3-4cf8-8d8d-10675c84e9d4-host\") pod \"node-ca-rr7qt\" (UID: \"62b2a4c2-1ab3-4cf8-8d8d-10675c84e9d4\") " pod="openshift-image-registry/node-ca-rr7qt" Apr 20 11:42:06.701608 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.700873 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fb6400e5-c705-43eb-867b-42fa8fc8a750-ovn-node-metrics-cert\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.701608 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.700889 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/17e25242-c8c6-4d80-a45e-22b47b30d8f0-cnibin\") pod \"multus-additional-cni-plugins-494p8\" (UID: \"17e25242-c8c6-4d80-a45e-22b47b30d8f0\") " pod="openshift-multus/multus-additional-cni-plugins-494p8" Apr 20 11:42:06.701608 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.700900 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fb6400e5-c705-43eb-867b-42fa8fc8a750-ovnkube-script-lib\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.701608 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.701002 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a66e511e-e6c2-4ef0-b401-588d90851a5e-iptables-alerter-script\") pod \"iptables-alerter-kh22r\" (UID: \"a66e511e-e6c2-4ef0-b401-588d90851a5e\") " pod="openshift-network-operator/iptables-alerter-kh22r" Apr 20 11:42:06.701608 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.701030 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b0190d50-a6ca-42f6-b113-2acc79ed7c3e-device-dir\") pod \"aws-ebs-csi-driver-node-flxvk\" (UID: \"b0190d50-a6ca-42f6-b113-2acc79ed7c3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-flxvk" Apr 20 11:42:06.701608 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.701057 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-etc-kubernetes\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.701608 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.701057 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/17e25242-c8c6-4d80-a45e-22b47b30d8f0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-494p8\" (UID: \"17e25242-c8c6-4d80-a45e-22b47b30d8f0\") " pod="openshift-multus/multus-additional-cni-plugins-494p8" Apr 20 11:42:06.701608 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.701067 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-host-var-lib-cni-multus\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " 
pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.701608 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.701125 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/17e25242-c8c6-4d80-a45e-22b47b30d8f0-cni-binary-copy\") pod \"multus-additional-cni-plugins-494p8\" (UID: \"17e25242-c8c6-4d80-a45e-22b47b30d8f0\") " pod="openshift-multus/multus-additional-cni-plugins-494p8" Apr 20 11:42:06.701608 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.701192 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-host-run-k8s-cni-cncf-io\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.701608 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.701221 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-host-var-lib-kubelet\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.701608 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.701205 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-cni-binary-copy\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.701608 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.701248 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5r678\" (UniqueName: \"kubernetes.io/projected/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-kube-api-access-5r678\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " 
pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.701608 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.701303 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-host-var-lib-kubelet\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.702390 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.701348 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-host-run-k8s-cni-cncf-io\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.702390 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.701390 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-etc-sysconfig\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.702390 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.701595 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/17e25242-c8c6-4d80-a45e-22b47b30d8f0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-494p8\" (UID: \"17e25242-c8c6-4d80-a45e-22b47b30d8f0\") " pod="openshift-multus/multus-additional-cni-plugins-494p8" Apr 20 11:42:06.702390 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.701706 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/17e25242-c8c6-4d80-a45e-22b47b30d8f0-cni-binary-copy\") pod \"multus-additional-cni-plugins-494p8\" (UID: 
\"17e25242-c8c6-4d80-a45e-22b47b30d8f0\") " pod="openshift-multus/multus-additional-cni-plugins-494p8" Apr 20 11:42:06.704441 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.704421 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7c5f2fae-93fd-4644-bbf1-784c37431e5c-agent-certs\") pod \"konnectivity-agent-vs4h8\" (UID: \"7c5f2fae-93fd-4644-bbf1-784c37431e5c\") " pod="kube-system/konnectivity-agent-vs4h8" Apr 20 11:42:06.706583 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:06.706556 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 11:42:06.706583 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:06.706580 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 11:42:06.706741 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:06.706593 2564 projected.go:194] Error preparing data for projected volume kube-api-access-c72s5 for pod openshift-network-diagnostics/network-check-target-dtzrg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 11:42:06.706741 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:06.706648 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ba5cc41-47d4-412c-9ba2-40b85e919af7-kube-api-access-c72s5 podName:5ba5cc41-47d4-412c-9ba2-40b85e919af7 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:07.206630666 +0000 UTC m=+3.023892772 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-c72s5" (UniqueName: "kubernetes.io/projected/5ba5cc41-47d4-412c-9ba2-40b85e919af7-kube-api-access-c72s5") pod "network-check-target-dtzrg" (UID: "5ba5cc41-47d4-412c-9ba2-40b85e919af7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 11:42:06.709123 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.709095 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pl2j\" (UniqueName: \"kubernetes.io/projected/17e25242-c8c6-4d80-a45e-22b47b30d8f0-kube-api-access-4pl2j\") pod \"multus-additional-cni-plugins-494p8\" (UID: \"17e25242-c8c6-4d80-a45e-22b47b30d8f0\") " pod="openshift-multus/multus-additional-cni-plugins-494p8" Apr 20 11:42:06.709243 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.709188 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-68.ec2.internal" event={"ID":"308db96245908b799112e2b3fae81b6e","Type":"ContainerStarted","Data":"3dd27a45c31e75e0fb290990a96da547ec70ae5105f32edbee4a794cef2fbbcc"} Apr 20 11:42:06.709505 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.709483 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm9mx\" (UniqueName: \"kubernetes.io/projected/e519aca4-e5e7-4760-91b9-8488165051df-kube-api-access-nm9mx\") pod \"network-metrics-daemon-2x94v\" (UID: \"e519aca4-e5e7-4760-91b9-8488165051df\") " pod="openshift-multus/network-metrics-daemon-2x94v" Apr 20 11:42:06.710418 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.710396 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-68.ec2.internal" event={"ID":"7e0a247a66f4fbd6a4f2672923dfb347","Type":"ContainerStarted","Data":"8c9168719ab6ac64f6c32f1f28aa02cb8481c03a300e4f92bede2867e3235b88"} Apr 20 
11:42:06.715278 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.715257 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqgrm\" (UniqueName: \"kubernetes.io/projected/62b2a4c2-1ab3-4cf8-8d8d-10675c84e9d4-kube-api-access-zqgrm\") pod \"node-ca-rr7qt\" (UID: \"62b2a4c2-1ab3-4cf8-8d8d-10675c84e9d4\") " pod="openshift-image-registry/node-ca-rr7qt" Apr 20 11:42:06.715365 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.715313 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r678\" (UniqueName: \"kubernetes.io/projected/a7b767da-cf50-4bbd-ac5b-c35078f5cc35-kube-api-access-5r678\") pod \"multus-clqlc\" (UID: \"a7b767da-cf50-4bbd-ac5b-c35078f5cc35\") " pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.722525 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.722505 2564 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 11:42:06.802210 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802165 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fb6400e5-c705-43eb-867b-42fa8fc8a750-ovnkube-config\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.802210 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802215 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b0190d50-a6ca-42f6-b113-2acc79ed7c3e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-flxvk\" (UID: \"b0190d50-a6ca-42f6-b113-2acc79ed7c3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-flxvk" Apr 20 11:42:06.802448 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802240 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/b0190d50-a6ca-42f6-b113-2acc79ed7c3e-socket-dir\") pod \"aws-ebs-csi-driver-node-flxvk\" (UID: \"b0190d50-a6ca-42f6-b113-2acc79ed7c3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-flxvk" Apr 20 11:42:06.802448 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802264 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-var-lib-kubelet\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.802448 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802292 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-log-socket\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.802448 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802318 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b0190d50-a6ca-42f6-b113-2acc79ed7c3e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-flxvk\" (UID: \"b0190d50-a6ca-42f6-b113-2acc79ed7c3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-flxvk" Apr 20 11:42:06.802448 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802335 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-etc-openvswitch\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.802448 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802422 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-var-lib-kubelet\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.802716 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802458 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-run-ovn\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.802716 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802463 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b0190d50-a6ca-42f6-b113-2acc79ed7c3e-socket-dir\") pod \"aws-ebs-csi-driver-node-flxvk\" (UID: \"b0190d50-a6ca-42f6-b113-2acc79ed7c3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-flxvk" Apr 20 11:42:06.802716 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802483 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-etc-sysctl-d\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.802716 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802486 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-log-socket\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.802716 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802489 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-etc-openvswitch\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.802716 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802501 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fwvz8\" (UniqueName: \"kubernetes.io/projected/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-kube-api-access-fwvz8\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.802716 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802506 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-run-ovn\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.802716 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802525 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-host-cni-netd\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.802716 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802563 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a66e511e-e6c2-4ef0-b401-588d90851a5e-host-slash\") pod \"iptables-alerter-kh22r\" (UID: \"a66e511e-e6c2-4ef0-b401-588d90851a5e\") " pod="openshift-network-operator/iptables-alerter-kh22r" Apr 20 11:42:06.802716 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802589 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b0190d50-a6ca-42f6-b113-2acc79ed7c3e-etc-selinux\") pod \"aws-ebs-csi-driver-node-flxvk\" (UID: \"b0190d50-a6ca-42f6-b113-2acc79ed7c3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-flxvk" Apr 20 11:42:06.802716 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802613 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-etc-systemd\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.802716 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802621 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a66e511e-e6c2-4ef0-b401-588d90851a5e-host-slash\") pod \"iptables-alerter-kh22r\" (UID: \"a66e511e-e6c2-4ef0-b401-588d90851a5e\") " pod="openshift-network-operator/iptables-alerter-kh22r" Apr 20 11:42:06.802716 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802628 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-etc-sysctl-d\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.802716 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802638 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-run\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.802716 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802667 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-run-openvswitch\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.802716 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802676 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-etc-systemd\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.802716 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802678 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-run\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.802716 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802612 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-host-cni-netd\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.803379 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802695 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-host-cni-bin\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.803379 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802703 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: 
\"kubernetes.io/host-path/b0190d50-a6ca-42f6-b113-2acc79ed7c3e-etc-selinux\") pod \"aws-ebs-csi-driver-node-flxvk\" (UID: \"b0190d50-a6ca-42f6-b113-2acc79ed7c3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-flxvk" Apr 20 11:42:06.803379 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802719 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fb6400e5-c705-43eb-867b-42fa8fc8a750-env-overrides\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.803379 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802720 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-run-openvswitch\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.803379 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802735 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b0190d50-a6ca-42f6-b113-2acc79ed7c3e-sys-fs\") pod \"aws-ebs-csi-driver-node-flxvk\" (UID: \"b0190d50-a6ca-42f6-b113-2acc79ed7c3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-flxvk" Apr 20 11:42:06.803379 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802747 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-host-cni-bin\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.803379 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802769 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-l4vv8\" (UniqueName: \"kubernetes.io/projected/b0190d50-a6ca-42f6-b113-2acc79ed7c3e-kube-api-access-l4vv8\") pod \"aws-ebs-csi-driver-node-flxvk\" (UID: \"b0190d50-a6ca-42f6-b113-2acc79ed7c3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-flxvk" Apr 20 11:42:06.803379 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802797 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-etc-sysctl-conf\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.803379 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802806 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fb6400e5-c705-43eb-867b-42fa8fc8a750-ovnkube-config\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.803379 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802836 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-host-kubelet\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.803379 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802875 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b0190d50-a6ca-42f6-b113-2acc79ed7c3e-sys-fs\") pod \"aws-ebs-csi-driver-node-flxvk\" (UID: \"b0190d50-a6ca-42f6-b113-2acc79ed7c3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-flxvk" Apr 20 11:42:06.803379 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802875 2564 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-var-lib-openvswitch\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.803379 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802904 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-var-lib-openvswitch\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.803379 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802891 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-host-kubelet\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.803379 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802915 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.803379 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802946 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 
20 11:42:06.803379 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802947 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-etc-tuned\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.804156 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802979 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-etc-sysctl-conf\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.804156 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.802990 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-lib-modules\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.804156 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.803026 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-run-systemd\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.804156 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.803064 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-run-systemd\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.804156 
ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.803091 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mgjcf\" (UniqueName: \"kubernetes.io/projected/fb6400e5-c705-43eb-867b-42fa8fc8a750-kube-api-access-mgjcf\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.804156 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.803103 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-lib-modules\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.804156 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.803115 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b0190d50-a6ca-42f6-b113-2acc79ed7c3e-registration-dir\") pod \"aws-ebs-csi-driver-node-flxvk\" (UID: \"b0190d50-a6ca-42f6-b113-2acc79ed7c3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-flxvk" Apr 20 11:42:06.804156 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.803138 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-host\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.804156 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.803164 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-host-run-ovn-kubernetes\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.804156 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.803189 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-systemd-units\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.804156 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.803196 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b0190d50-a6ca-42f6-b113-2acc79ed7c3e-registration-dir\") pod \"aws-ebs-csi-driver-node-flxvk\" (UID: \"b0190d50-a6ca-42f6-b113-2acc79ed7c3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-flxvk" Apr 20 11:42:06.804156 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.803199 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fb6400e5-c705-43eb-867b-42fa8fc8a750-env-overrides\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.804156 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.803217 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhxhm\" (UniqueName: \"kubernetes.io/projected/a66e511e-e6c2-4ef0-b401-588d90851a5e-kube-api-access-zhxhm\") pod \"iptables-alerter-kh22r\" (UID: \"a66e511e-e6c2-4ef0-b401-588d90851a5e\") " pod="openshift-network-operator/iptables-alerter-kh22r" Apr 20 11:42:06.804156 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.803237 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-etc-modprobe-d\") pod \"tuned-pjsl6\" (UID: 
\"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.804156 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.803239 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-host\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.804156 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.803258 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-host-slash\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.804156 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.803255 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-host-run-ovn-kubernetes\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.804911 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.803297 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-systemd-units\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.804911 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.803306 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-host-slash\") pod \"ovnkube-node-h966s\" (UID: 
\"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.804911 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.803324 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fb6400e5-c705-43eb-867b-42fa8fc8a750-ovn-node-metrics-cert\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.804911 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.803328 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-etc-modprobe-d\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.804911 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.803366 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fb6400e5-c705-43eb-867b-42fa8fc8a750-ovnkube-script-lib\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.804911 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.803393 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a66e511e-e6c2-4ef0-b401-588d90851a5e-iptables-alerter-script\") pod \"iptables-alerter-kh22r\" (UID: \"a66e511e-e6c2-4ef0-b401-588d90851a5e\") " pod="openshift-network-operator/iptables-alerter-kh22r" Apr 20 11:42:06.804911 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.803429 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/b0190d50-a6ca-42f6-b113-2acc79ed7c3e-device-dir\") pod \"aws-ebs-csi-driver-node-flxvk\" (UID: \"b0190d50-a6ca-42f6-b113-2acc79ed7c3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-flxvk" Apr 20 11:42:06.804911 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.803452 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-etc-kubernetes\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.804911 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.803482 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-etc-sysconfig\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.804911 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.803487 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b0190d50-a6ca-42f6-b113-2acc79ed7c3e-device-dir\") pod \"aws-ebs-csi-driver-node-flxvk\" (UID: \"b0190d50-a6ca-42f6-b113-2acc79ed7c3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-flxvk" Apr 20 11:42:06.804911 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.803505 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-tmp\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.804911 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.803535 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-host-run-netns\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.804911 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.803538 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-etc-kubernetes\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.804911 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.803557 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-sys\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.804911 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.803616 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-sys\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.804911 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.803661 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-host-run-netns\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.804911 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.803708 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-etc-sysconfig\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.804911 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.803817 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-node-log\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.805736 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.803882 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fb6400e5-c705-43eb-867b-42fa8fc8a750-node-log\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.805736 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.803908 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a66e511e-e6c2-4ef0-b401-588d90851a5e-iptables-alerter-script\") pod \"iptables-alerter-kh22r\" (UID: \"a66e511e-e6c2-4ef0-b401-588d90851a5e\") " pod="openshift-network-operator/iptables-alerter-kh22r" Apr 20 11:42:06.805736 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.803905 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fb6400e5-c705-43eb-867b-42fa8fc8a750-ovnkube-script-lib\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.805914 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.805868 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/fb6400e5-c705-43eb-867b-42fa8fc8a750-ovn-node-metrics-cert\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.805955 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.805914 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-etc-tuned\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.806028 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.805956 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-tmp\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.812028 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.811973 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhxhm\" (UniqueName: \"kubernetes.io/projected/a66e511e-e6c2-4ef0-b401-588d90851a5e-kube-api-access-zhxhm\") pod \"iptables-alerter-kh22r\" (UID: \"a66e511e-e6c2-4ef0-b401-588d90851a5e\") " pod="openshift-network-operator/iptables-alerter-kh22r" Apr 20 11:42:06.812384 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.812348 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwvz8\" (UniqueName: \"kubernetes.io/projected/4ca5c7b9-b35a-4b83-a47b-3463b5058f23-kube-api-access-fwvz8\") pod \"tuned-pjsl6\" (UID: \"4ca5c7b9-b35a-4b83-a47b-3463b5058f23\") " pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:06.812498 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.812430 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4vv8\" (UniqueName: 
\"kubernetes.io/projected/b0190d50-a6ca-42f6-b113-2acc79ed7c3e-kube-api-access-l4vv8\") pod \"aws-ebs-csi-driver-node-flxvk\" (UID: \"b0190d50-a6ca-42f6-b113-2acc79ed7c3e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-flxvk" Apr 20 11:42:06.812730 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.812709 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgjcf\" (UniqueName: \"kubernetes.io/projected/fb6400e5-c705-43eb-867b-42fa8fc8a750-kube-api-access-mgjcf\") pod \"ovnkube-node-h966s\" (UID: \"fb6400e5-c705-43eb-867b-42fa8fc8a750\") " pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.885535 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.885502 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-494p8" Apr 20 11:42:06.894693 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.894661 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-clqlc" Apr 20 11:42:06.904393 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.904361 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vs4h8" Apr 20 11:42:06.910922 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.910902 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rr7qt" Apr 20 11:42:06.920553 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.920526 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-kh22r" Apr 20 11:42:06.927206 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.927181 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:06.932873 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.932848 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-flxvk" Apr 20 11:42:06.938518 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:06.938490 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" Apr 20 11:42:07.206334 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:07.206299 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e519aca4-e5e7-4760-91b9-8488165051df-metrics-certs\") pod \"network-metrics-daemon-2x94v\" (UID: \"e519aca4-e5e7-4760-91b9-8488165051df\") " pod="openshift-multus/network-metrics-daemon-2x94v" Apr 20 11:42:07.206538 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:07.206414 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 11:42:07.206538 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:07.206479 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e519aca4-e5e7-4760-91b9-8488165051df-metrics-certs podName:e519aca4-e5e7-4760-91b9-8488165051df nodeName:}" failed. No retries permitted until 2026-04-20 11:42:08.206459426 +0000 UTC m=+4.023721539 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e519aca4-e5e7-4760-91b9-8488165051df-metrics-certs") pod "network-metrics-daemon-2x94v" (UID: "e519aca4-e5e7-4760-91b9-8488165051df") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 11:42:07.307308 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:07.307260 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c72s5\" (UniqueName: \"kubernetes.io/projected/5ba5cc41-47d4-412c-9ba2-40b85e919af7-kube-api-access-c72s5\") pod \"network-check-target-dtzrg\" (UID: \"5ba5cc41-47d4-412c-9ba2-40b85e919af7\") " pod="openshift-network-diagnostics/network-check-target-dtzrg" Apr 20 11:42:07.307461 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:07.307434 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 11:42:07.307461 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:07.307459 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 11:42:07.307547 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:07.307474 2564 projected.go:194] Error preparing data for projected volume kube-api-access-c72s5 for pod openshift-network-diagnostics/network-check-target-dtzrg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 11:42:07.307595 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:07.307568 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ba5cc41-47d4-412c-9ba2-40b85e919af7-kube-api-access-c72s5 podName:5ba5cc41-47d4-412c-9ba2-40b85e919af7 nodeName:}" failed. 
No retries permitted until 2026-04-20 11:42:08.307547652 +0000 UTC m=+4.124809777 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-c72s5" (UniqueName: "kubernetes.io/projected/5ba5cc41-47d4-412c-9ba2-40b85e919af7-kube-api-access-c72s5") pod "network-check-target-dtzrg" (UID: "5ba5cc41-47d4-412c-9ba2-40b85e919af7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 11:42:07.389438 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:07.389406 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62b2a4c2_1ab3_4cf8_8d8d_10675c84e9d4.slice/crio-ae60efdad14136e84197932e4c6baa13f1fe80b30fc0819ad3bfa9dab8649e49 WatchSource:0}: Error finding container ae60efdad14136e84197932e4c6baa13f1fe80b30fc0819ad3bfa9dab8649e49: Status 404 returned error can't find the container with id ae60efdad14136e84197932e4c6baa13f1fe80b30fc0819ad3bfa9dab8649e49 Apr 20 11:42:07.391760 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:07.391634 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17e25242_c8c6_4d80_a45e_22b47b30d8f0.slice/crio-1971ab7fc0a5983350b0104488f062e6c0486da7dc6bc41a76c3faaf2158264e WatchSource:0}: Error finding container 1971ab7fc0a5983350b0104488f062e6c0486da7dc6bc41a76c3faaf2158264e: Status 404 returned error can't find the container with id 1971ab7fc0a5983350b0104488f062e6c0486da7dc6bc41a76c3faaf2158264e Apr 20 11:42:07.393912 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:07.393813 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda66e511e_e6c2_4ef0_b401_588d90851a5e.slice/crio-16cec55ca233e0fd153115b6b8ce9ef63580e434816cfc725ff37ec0c077b53f WatchSource:0}: Error finding container 
16cec55ca233e0fd153115b6b8ce9ef63580e434816cfc725ff37ec0c077b53f: Status 404 returned error can't find the container with id 16cec55ca233e0fd153115b6b8ce9ef63580e434816cfc725ff37ec0c077b53f Apr 20 11:42:07.395193 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:07.395156 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb6400e5_c705_43eb_867b_42fa8fc8a750.slice/crio-639e01c2a55a0cb0e523af1d079869857d5768488ed682cfad6736fc5076f42f WatchSource:0}: Error finding container 639e01c2a55a0cb0e523af1d079869857d5768488ed682cfad6736fc5076f42f: Status 404 returned error can't find the container with id 639e01c2a55a0cb0e523af1d079869857d5768488ed682cfad6736fc5076f42f Apr 20 11:42:07.396204 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:07.396181 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ca5c7b9_b35a_4b83_a47b_3463b5058f23.slice/crio-151f2c357f2d94cde7fa17e1d40dfa4ad541e810e24cfd24c0cf55a53cbdb180 WatchSource:0}: Error finding container 151f2c357f2d94cde7fa17e1d40dfa4ad541e810e24cfd24c0cf55a53cbdb180: Status 404 returned error can't find the container with id 151f2c357f2d94cde7fa17e1d40dfa4ad541e810e24cfd24c0cf55a53cbdb180 Apr 20 11:42:07.397084 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:07.397046 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c5f2fae_93fd_4644_bbf1_784c37431e5c.slice/crio-8a339ad7ef08a5c7113d6507607e6761a817717e938b8e37846c26d4242a6ddd WatchSource:0}: Error finding container 8a339ad7ef08a5c7113d6507607e6761a817717e938b8e37846c26d4242a6ddd: Status 404 returned error can't find the container with id 8a339ad7ef08a5c7113d6507607e6761a817717e938b8e37846c26d4242a6ddd Apr 20 11:42:07.398568 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:07.398127 2564 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7b767da_cf50_4bbd_ac5b_c35078f5cc35.slice/crio-181c491a2d921afc3301d5f718723811d962976f77c1f493d4d71f99557d5a21 WatchSource:0}: Error finding container 181c491a2d921afc3301d5f718723811d962976f77c1f493d4d71f99557d5a21: Status 404 returned error can't find the container with id 181c491a2d921afc3301d5f718723811d962976f77c1f493d4d71f99557d5a21 Apr 20 11:42:07.399053 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:07.399037 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0190d50_a6ca_42f6_b113_2acc79ed7c3e.slice/crio-e526d7d3cd48eaea03109076881072179f3445a38f6cb3d861741735529e9e92 WatchSource:0}: Error finding container e526d7d3cd48eaea03109076881072179f3445a38f6cb3d861741735529e9e92: Status 404 returned error can't find the container with id e526d7d3cd48eaea03109076881072179f3445a38f6cb3d861741735529e9e92 Apr 20 11:42:07.630896 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:07.630716 2564 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 11:37:05 +0000 UTC" deadline="2027-12-25 14:37:29.461257711 +0000 UTC" Apr 20 11:42:07.630896 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:07.630889 2564 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14738h55m21.830371495s" Apr 20 11:42:07.712983 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:07.712926 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-flxvk" event={"ID":"b0190d50-a6ca-42f6-b113-2acc79ed7c3e","Type":"ContainerStarted","Data":"e526d7d3cd48eaea03109076881072179f3445a38f6cb3d861741735529e9e92"} Apr 20 11:42:07.714228 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:07.714188 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-h966s" event={"ID":"fb6400e5-c705-43eb-867b-42fa8fc8a750","Type":"ContainerStarted","Data":"639e01c2a55a0cb0e523af1d079869857d5768488ed682cfad6736fc5076f42f"} Apr 20 11:42:07.716383 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:07.716190 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-kh22r" event={"ID":"a66e511e-e6c2-4ef0-b401-588d90851a5e","Type":"ContainerStarted","Data":"16cec55ca233e0fd153115b6b8ce9ef63580e434816cfc725ff37ec0c077b53f"} Apr 20 11:42:07.717377 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:07.717301 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-494p8" event={"ID":"17e25242-c8c6-4d80-a45e-22b47b30d8f0","Type":"ContainerStarted","Data":"1971ab7fc0a5983350b0104488f062e6c0486da7dc6bc41a76c3faaf2158264e"} Apr 20 11:42:07.720536 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:07.720508 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-68.ec2.internal" event={"ID":"308db96245908b799112e2b3fae81b6e","Type":"ContainerStarted","Data":"b7181bf042ec23e2ab7551091e192b2a34f7e1c503e66196232dc4a554df4941"} Apr 20 11:42:07.722722 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:07.722695 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-clqlc" event={"ID":"a7b767da-cf50-4bbd-ac5b-c35078f5cc35","Type":"ContainerStarted","Data":"181c491a2d921afc3301d5f718723811d962976f77c1f493d4d71f99557d5a21"} Apr 20 11:42:07.724454 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:07.724371 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vs4h8" event={"ID":"7c5f2fae-93fd-4644-bbf1-784c37431e5c","Type":"ContainerStarted","Data":"8a339ad7ef08a5c7113d6507607e6761a817717e938b8e37846c26d4242a6ddd"} Apr 20 11:42:07.726362 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:07.726340 2564 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" event={"ID":"4ca5c7b9-b35a-4b83-a47b-3463b5058f23","Type":"ContainerStarted","Data":"151f2c357f2d94cde7fa17e1d40dfa4ad541e810e24cfd24c0cf55a53cbdb180"} Apr 20 11:42:07.727581 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:07.727556 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rr7qt" event={"ID":"62b2a4c2-1ab3-4cf8-8d8d-10675c84e9d4","Type":"ContainerStarted","Data":"ae60efdad14136e84197932e4c6baa13f1fe80b30fc0819ad3bfa9dab8649e49"} Apr 20 11:42:08.214362 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:08.214276 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e519aca4-e5e7-4760-91b9-8488165051df-metrics-certs\") pod \"network-metrics-daemon-2x94v\" (UID: \"e519aca4-e5e7-4760-91b9-8488165051df\") " pod="openshift-multus/network-metrics-daemon-2x94v" Apr 20 11:42:08.214521 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:08.214457 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 11:42:08.214578 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:08.214526 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e519aca4-e5e7-4760-91b9-8488165051df-metrics-certs podName:e519aca4-e5e7-4760-91b9-8488165051df nodeName:}" failed. No retries permitted until 2026-04-20 11:42:10.214505752 +0000 UTC m=+6.031767865 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e519aca4-e5e7-4760-91b9-8488165051df-metrics-certs") pod "network-metrics-daemon-2x94v" (UID: "e519aca4-e5e7-4760-91b9-8488165051df") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 11:42:08.315684 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:08.315648 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c72s5\" (UniqueName: \"kubernetes.io/projected/5ba5cc41-47d4-412c-9ba2-40b85e919af7-kube-api-access-c72s5\") pod \"network-check-target-dtzrg\" (UID: \"5ba5cc41-47d4-412c-9ba2-40b85e919af7\") " pod="openshift-network-diagnostics/network-check-target-dtzrg" Apr 20 11:42:08.315825 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:08.315805 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 11:42:08.315825 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:08.315823 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 11:42:08.315934 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:08.315835 2564 projected.go:194] Error preparing data for projected volume kube-api-access-c72s5 for pod openshift-network-diagnostics/network-check-target-dtzrg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 11:42:08.315934 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:08.315891 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ba5cc41-47d4-412c-9ba2-40b85e919af7-kube-api-access-c72s5 podName:5ba5cc41-47d4-412c-9ba2-40b85e919af7 nodeName:}" failed. 
No retries permitted until 2026-04-20 11:42:10.315871549 +0000 UTC m=+6.133133652 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-c72s5" (UniqueName: "kubernetes.io/projected/5ba5cc41-47d4-412c-9ba2-40b85e919af7-kube-api-access-c72s5") pod "network-check-target-dtzrg" (UID: "5ba5cc41-47d4-412c-9ba2-40b85e919af7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 11:42:08.706578 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:08.706114 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dtzrg" Apr 20 11:42:08.706578 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:08.706237 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dtzrg" podUID="5ba5cc41-47d4-412c-9ba2-40b85e919af7" Apr 20 11:42:08.706578 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:08.706506 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x94v" Apr 20 11:42:08.707161 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:08.706606 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2x94v" podUID="e519aca4-e5e7-4760-91b9-8488165051df" Apr 20 11:42:08.737088 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:08.737051 2564 generic.go:358] "Generic (PLEG): container finished" podID="7e0a247a66f4fbd6a4f2672923dfb347" containerID="e0a24ff27878e853fbcba69650a89cda87f7ed3df050209ae39eaab47f5e0883" exitCode=0 Apr 20 11:42:08.737607 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:08.737581 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-68.ec2.internal" event={"ID":"7e0a247a66f4fbd6a4f2672923dfb347","Type":"ContainerDied","Data":"e0a24ff27878e853fbcba69650a89cda87f7ed3df050209ae39eaab47f5e0883"} Apr 20 11:42:08.752190 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:08.752139 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-68.ec2.internal" podStartSLOduration=2.75211974 podStartE2EDuration="2.75211974s" podCreationTimestamp="2026-04-20 11:42:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 11:42:07.734005451 +0000 UTC m=+3.551267573" watchObservedRunningTime="2026-04-20 11:42:08.75211974 +0000 UTC m=+4.569381864" Apr 20 11:42:09.746103 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:09.746066 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-68.ec2.internal" event={"ID":"7e0a247a66f4fbd6a4f2672923dfb347","Type":"ContainerStarted","Data":"4ea64469ef7da7476fcb4230680af110a0a971c951d35d242585c66e7b37a5af"} Apr 20 11:42:10.232920 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:10.232374 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e519aca4-e5e7-4760-91b9-8488165051df-metrics-certs\") pod 
\"network-metrics-daemon-2x94v\" (UID: \"e519aca4-e5e7-4760-91b9-8488165051df\") " pod="openshift-multus/network-metrics-daemon-2x94v" Apr 20 11:42:10.232920 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:10.232553 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 11:42:10.232920 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:10.232618 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e519aca4-e5e7-4760-91b9-8488165051df-metrics-certs podName:e519aca4-e5e7-4760-91b9-8488165051df nodeName:}" failed. No retries permitted until 2026-04-20 11:42:14.232603931 +0000 UTC m=+10.049866036 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e519aca4-e5e7-4760-91b9-8488165051df-metrics-certs") pod "network-metrics-daemon-2x94v" (UID: "e519aca4-e5e7-4760-91b9-8488165051df") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 11:42:10.333323 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:10.333279 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c72s5\" (UniqueName: \"kubernetes.io/projected/5ba5cc41-47d4-412c-9ba2-40b85e919af7-kube-api-access-c72s5\") pod \"network-check-target-dtzrg\" (UID: \"5ba5cc41-47d4-412c-9ba2-40b85e919af7\") " pod="openshift-network-diagnostics/network-check-target-dtzrg" Apr 20 11:42:10.333497 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:10.333480 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 11:42:10.333579 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:10.333504 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 
20 11:42:10.333579 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:10.333518 2564 projected.go:194] Error preparing data for projected volume kube-api-access-c72s5 for pod openshift-network-diagnostics/network-check-target-dtzrg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 11:42:10.333689 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:10.333582 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ba5cc41-47d4-412c-9ba2-40b85e919af7-kube-api-access-c72s5 podName:5ba5cc41-47d4-412c-9ba2-40b85e919af7 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:14.333563279 +0000 UTC m=+10.150825393 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-c72s5" (UniqueName: "kubernetes.io/projected/5ba5cc41-47d4-412c-9ba2-40b85e919af7-kube-api-access-c72s5") pod "network-check-target-dtzrg" (UID: "5ba5cc41-47d4-412c-9ba2-40b85e919af7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 11:42:10.705551 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:10.705471 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dtzrg" Apr 20 11:42:10.707893 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:10.706771 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dtzrg" podUID="5ba5cc41-47d4-412c-9ba2-40b85e919af7" Apr 20 11:42:10.707893 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:10.707305 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x94v" Apr 20 11:42:10.707893 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:10.707472 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x94v" podUID="e519aca4-e5e7-4760-91b9-8488165051df" Apr 20 11:42:12.705345 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:12.705309 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x94v" Apr 20 11:42:12.705899 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:12.705317 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dtzrg" Apr 20 11:42:12.705899 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:12.705453 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2x94v" podUID="e519aca4-e5e7-4760-91b9-8488165051df" Apr 20 11:42:12.705899 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:12.705516 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dtzrg" podUID="5ba5cc41-47d4-412c-9ba2-40b85e919af7" Apr 20 11:42:14.264518 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:14.264475 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e519aca4-e5e7-4760-91b9-8488165051df-metrics-certs\") pod \"network-metrics-daemon-2x94v\" (UID: \"e519aca4-e5e7-4760-91b9-8488165051df\") " pod="openshift-multus/network-metrics-daemon-2x94v" Apr 20 11:42:14.265078 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:14.264647 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 11:42:14.265078 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:14.264722 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e519aca4-e5e7-4760-91b9-8488165051df-metrics-certs podName:e519aca4-e5e7-4760-91b9-8488165051df nodeName:}" failed. No retries permitted until 2026-04-20 11:42:22.264701172 +0000 UTC m=+18.081963285 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e519aca4-e5e7-4760-91b9-8488165051df-metrics-certs") pod "network-metrics-daemon-2x94v" (UID: "e519aca4-e5e7-4760-91b9-8488165051df") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 11:42:14.365219 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:14.365168 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c72s5\" (UniqueName: \"kubernetes.io/projected/5ba5cc41-47d4-412c-9ba2-40b85e919af7-kube-api-access-c72s5\") pod \"network-check-target-dtzrg\" (UID: \"5ba5cc41-47d4-412c-9ba2-40b85e919af7\") " pod="openshift-network-diagnostics/network-check-target-dtzrg" Apr 20 11:42:14.365411 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:14.365386 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 11:42:14.365411 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:14.365410 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 11:42:14.365601 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:14.365424 2564 projected.go:194] Error preparing data for projected volume kube-api-access-c72s5 for pod openshift-network-diagnostics/network-check-target-dtzrg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 11:42:14.365601 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:14.365494 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ba5cc41-47d4-412c-9ba2-40b85e919af7-kube-api-access-c72s5 podName:5ba5cc41-47d4-412c-9ba2-40b85e919af7 nodeName:}" failed. 
No retries permitted until 2026-04-20 11:42:22.365472442 +0000 UTC m=+18.182734542 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-c72s5" (UniqueName: "kubernetes.io/projected/5ba5cc41-47d4-412c-9ba2-40b85e919af7-kube-api-access-c72s5") pod "network-check-target-dtzrg" (UID: "5ba5cc41-47d4-412c-9ba2-40b85e919af7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 11:42:14.591700 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:14.591601 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-68.ec2.internal" podStartSLOduration=8.591582159 podStartE2EDuration="8.591582159s" podCreationTimestamp="2026-04-20 11:42:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 11:42:09.763865661 +0000 UTC m=+5.581127785" watchObservedRunningTime="2026-04-20 11:42:14.591582159 +0000 UTC m=+10.408844282" Apr 20 11:42:14.592216 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:14.592031 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-229kk"] Apr 20 11:42:14.595150 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:14.595126 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-229kk" Apr 20 11:42:14.598309 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:14.598284 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-v6bf2\"" Apr 20 11:42:14.598430 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:14.598290 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 11:42:14.598492 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:14.598435 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 11:42:14.667851 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:14.667809 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/216d056e-aca9-4614-b69d-9c50c5631c89-hosts-file\") pod \"node-resolver-229kk\" (UID: \"216d056e-aca9-4614-b69d-9c50c5631c89\") " pod="openshift-dns/node-resolver-229kk" Apr 20 11:42:14.668035 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:14.667857 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frh5r\" (UniqueName: \"kubernetes.io/projected/216d056e-aca9-4614-b69d-9c50c5631c89-kube-api-access-frh5r\") pod \"node-resolver-229kk\" (UID: \"216d056e-aca9-4614-b69d-9c50c5631c89\") " pod="openshift-dns/node-resolver-229kk" Apr 20 11:42:14.668035 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:14.667894 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/216d056e-aca9-4614-b69d-9c50c5631c89-tmp-dir\") pod \"node-resolver-229kk\" (UID: \"216d056e-aca9-4614-b69d-9c50c5631c89\") " pod="openshift-dns/node-resolver-229kk" Apr 20 11:42:14.706564 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:14.705956 
2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x94v" Apr 20 11:42:14.706564 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:14.706092 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x94v" podUID="e519aca4-e5e7-4760-91b9-8488165051df" Apr 20 11:42:14.706564 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:14.706435 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dtzrg" Apr 20 11:42:14.706564 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:14.706522 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dtzrg" podUID="5ba5cc41-47d4-412c-9ba2-40b85e919af7" Apr 20 11:42:14.769043 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:14.768489 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/216d056e-aca9-4614-b69d-9c50c5631c89-hosts-file\") pod \"node-resolver-229kk\" (UID: \"216d056e-aca9-4614-b69d-9c50c5631c89\") " pod="openshift-dns/node-resolver-229kk" Apr 20 11:42:14.769043 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:14.768541 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frh5r\" (UniqueName: \"kubernetes.io/projected/216d056e-aca9-4614-b69d-9c50c5631c89-kube-api-access-frh5r\") pod \"node-resolver-229kk\" (UID: \"216d056e-aca9-4614-b69d-9c50c5631c89\") " pod="openshift-dns/node-resolver-229kk" Apr 20 11:42:14.769043 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:14.768582 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/216d056e-aca9-4614-b69d-9c50c5631c89-tmp-dir\") pod \"node-resolver-229kk\" (UID: \"216d056e-aca9-4614-b69d-9c50c5631c89\") " pod="openshift-dns/node-resolver-229kk" Apr 20 11:42:14.769043 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:14.768797 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/216d056e-aca9-4614-b69d-9c50c5631c89-hosts-file\") pod \"node-resolver-229kk\" (UID: \"216d056e-aca9-4614-b69d-9c50c5631c89\") " pod="openshift-dns/node-resolver-229kk" Apr 20 11:42:14.769043 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:14.769001 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/216d056e-aca9-4614-b69d-9c50c5631c89-tmp-dir\") pod \"node-resolver-229kk\" (UID: \"216d056e-aca9-4614-b69d-9c50c5631c89\") " 
pod="openshift-dns/node-resolver-229kk" Apr 20 11:42:14.779744 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:14.779714 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-frh5r\" (UniqueName: \"kubernetes.io/projected/216d056e-aca9-4614-b69d-9c50c5631c89-kube-api-access-frh5r\") pod \"node-resolver-229kk\" (UID: \"216d056e-aca9-4614-b69d-9c50c5631c89\") " pod="openshift-dns/node-resolver-229kk" Apr 20 11:42:14.908037 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:14.907725 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-229kk" Apr 20 11:42:16.705307 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:16.705267 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x94v" Apr 20 11:42:16.705765 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:16.705267 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dtzrg" Apr 20 11:42:16.705765 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:16.705435 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x94v" podUID="e519aca4-e5e7-4760-91b9-8488165051df" Apr 20 11:42:16.705765 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:16.705481 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dtzrg" podUID="5ba5cc41-47d4-412c-9ba2-40b85e919af7" Apr 20 11:42:18.705462 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:18.705422 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dtzrg" Apr 20 11:42:18.705927 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:18.705422 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x94v" Apr 20 11:42:18.705927 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:18.705538 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dtzrg" podUID="5ba5cc41-47d4-412c-9ba2-40b85e919af7" Apr 20 11:42:18.705927 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:18.705671 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x94v" podUID="e519aca4-e5e7-4760-91b9-8488165051df" Apr 20 11:42:20.705654 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:20.705622 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dtzrg" Apr 20 11:42:20.706114 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:20.705632 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x94v" Apr 20 11:42:20.706114 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:20.705761 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dtzrg" podUID="5ba5cc41-47d4-412c-9ba2-40b85e919af7" Apr 20 11:42:20.706114 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:20.705819 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x94v" podUID="e519aca4-e5e7-4760-91b9-8488165051df" Apr 20 11:42:22.326021 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:22.325984 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e519aca4-e5e7-4760-91b9-8488165051df-metrics-certs\") pod \"network-metrics-daemon-2x94v\" (UID: \"e519aca4-e5e7-4760-91b9-8488165051df\") " pod="openshift-multus/network-metrics-daemon-2x94v" Apr 20 11:42:22.326468 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:22.326113 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 11:42:22.326468 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:22.326184 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e519aca4-e5e7-4760-91b9-8488165051df-metrics-certs podName:e519aca4-e5e7-4760-91b9-8488165051df nodeName:}" failed. 
No retries permitted until 2026-04-20 11:42:38.326163591 +0000 UTC m=+34.143425705 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e519aca4-e5e7-4760-91b9-8488165051df-metrics-certs") pod "network-metrics-daemon-2x94v" (UID: "e519aca4-e5e7-4760-91b9-8488165051df") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 11:42:22.426492 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:22.426454 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c72s5\" (UniqueName: \"kubernetes.io/projected/5ba5cc41-47d4-412c-9ba2-40b85e919af7-kube-api-access-c72s5\") pod \"network-check-target-dtzrg\" (UID: \"5ba5cc41-47d4-412c-9ba2-40b85e919af7\") " pod="openshift-network-diagnostics/network-check-target-dtzrg" Apr 20 11:42:22.426700 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:22.426627 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 11:42:22.426700 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:22.426651 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 11:42:22.426700 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:22.426661 2564 projected.go:194] Error preparing data for projected volume kube-api-access-c72s5 for pod openshift-network-diagnostics/network-check-target-dtzrg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 11:42:22.426869 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:22.426720 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ba5cc41-47d4-412c-9ba2-40b85e919af7-kube-api-access-c72s5 
podName:5ba5cc41-47d4-412c-9ba2-40b85e919af7 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:38.426702322 +0000 UTC m=+34.243964422 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-c72s5" (UniqueName: "kubernetes.io/projected/5ba5cc41-47d4-412c-9ba2-40b85e919af7-kube-api-access-c72s5") pod "network-check-target-dtzrg" (UID: "5ba5cc41-47d4-412c-9ba2-40b85e919af7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 11:42:22.705778 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:22.705695 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x94v" Apr 20 11:42:22.705931 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:22.705694 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dtzrg" Apr 20 11:42:22.705931 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:22.705815 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x94v" podUID="e519aca4-e5e7-4760-91b9-8488165051df" Apr 20 11:42:22.705931 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:22.705908 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dtzrg" podUID="5ba5cc41-47d4-412c-9ba2-40b85e919af7" Apr 20 11:42:24.706678 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:24.706251 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x94v" Apr 20 11:42:24.707318 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:24.706290 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dtzrg" Apr 20 11:42:24.707318 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:24.706813 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x94v" podUID="e519aca4-e5e7-4760-91b9-8488165051df" Apr 20 11:42:24.707318 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:24.706925 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dtzrg" podUID="5ba5cc41-47d4-412c-9ba2-40b85e919af7" Apr 20 11:42:24.774244 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:24.774212 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-flxvk" event={"ID":"b0190d50-a6ca-42f6-b113-2acc79ed7c3e","Type":"ContainerStarted","Data":"7c2c30ec3ae62dc80de4363c31ef361dd073774fdcd0e0ff3b1a19dae2a05976"} Apr 20 11:42:24.775578 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:24.775561 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h966s_fb6400e5-c705-43eb-867b-42fa8fc8a750/ovn-acl-logging/0.log" Apr 20 11:42:24.775841 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:24.775822 2564 generic.go:358] "Generic (PLEG): container finished" podID="fb6400e5-c705-43eb-867b-42fa8fc8a750" containerID="f0d61f9e855f347d9c98e738a3e5a15ab620b87c5d3d77cbaaee57b3abebecff" exitCode=1 Apr 20 11:42:24.775925 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:24.775884 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h966s" event={"ID":"fb6400e5-c705-43eb-867b-42fa8fc8a750","Type":"ContainerStarted","Data":"fcb48e87203b000761755260f549ffb4fb6eb004e9ea6547787f75abc7c77357"} Apr 20 11:42:24.775925 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:24.775909 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h966s" event={"ID":"fb6400e5-c705-43eb-867b-42fa8fc8a750","Type":"ContainerDied","Data":"f0d61f9e855f347d9c98e738a3e5a15ab620b87c5d3d77cbaaee57b3abebecff"} Apr 20 11:42:24.776073 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:24.775926 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h966s" event={"ID":"fb6400e5-c705-43eb-867b-42fa8fc8a750","Type":"ContainerStarted","Data":"83f627a78b76b1fd42a7410ccef07b8295345e529c3c752bffda8d16c0739131"} 
Apr 20 11:42:24.777148 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:24.777128 2564 generic.go:358] "Generic (PLEG): container finished" podID="17e25242-c8c6-4d80-a45e-22b47b30d8f0" containerID="0072f5a84515932990bb6ee4bfcdb0aceecff22f6c5155b77cd99f15ed39aa36" exitCode=0 Apr 20 11:42:24.777226 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:24.777194 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-494p8" event={"ID":"17e25242-c8c6-4d80-a45e-22b47b30d8f0","Type":"ContainerDied","Data":"0072f5a84515932990bb6ee4bfcdb0aceecff22f6c5155b77cd99f15ed39aa36"} Apr 20 11:42:24.778485 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:24.778459 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-clqlc" event={"ID":"a7b767da-cf50-4bbd-ac5b-c35078f5cc35","Type":"ContainerStarted","Data":"daef0af97411e70b11c7c23219a3323e7951226f34ce4c756ef667a8a78892ef"} Apr 20 11:42:24.779641 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:24.779616 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vs4h8" event={"ID":"7c5f2fae-93fd-4644-bbf1-784c37431e5c","Type":"ContainerStarted","Data":"aea4cdaf30eb02e1748c52ad13b8f5cb6c5a54d264a8cb1834128417b628b261"} Apr 20 11:42:24.780871 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:24.780852 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" event={"ID":"4ca5c7b9-b35a-4b83-a47b-3463b5058f23","Type":"ContainerStarted","Data":"086f6352d9b1b248b5d2d6614e5d964f2ca8677008259d6ef62e3f04aa670098"} Apr 20 11:42:24.782308 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:24.782290 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rr7qt" event={"ID":"62b2a4c2-1ab3-4cf8-8d8d-10675c84e9d4","Type":"ContainerStarted","Data":"3b90ab6b081f646584d1528e8a853dd861508b6479551c813e8b5888930ab361"} Apr 20 11:42:24.783527 ip-10-0-135-68 
kubenswrapper[2564]: I0420 11:42:24.783510 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-229kk" event={"ID":"216d056e-aca9-4614-b69d-9c50c5631c89","Type":"ContainerStarted","Data":"0ec2f05c3e9ff071cd7be308145b642532ff16712f21d5265ab524491bdc6f1b"} Apr 20 11:42:24.783591 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:24.783531 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-229kk" event={"ID":"216d056e-aca9-4614-b69d-9c50c5631c89","Type":"ContainerStarted","Data":"2edd65ed58419d782128ff8ddcdc572ef7341bf25672307bf0e89ab91c71af52"} Apr 20 11:42:24.823473 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:24.823426 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-pjsl6" podStartSLOduration=4.268398631 podStartE2EDuration="20.8234132s" podCreationTimestamp="2026-04-20 11:42:04 +0000 UTC" firstStartedPulling="2026-04-20 11:42:07.398328861 +0000 UTC m=+3.215590961" lastFinishedPulling="2026-04-20 11:42:23.953343412 +0000 UTC m=+19.770605530" observedRunningTime="2026-04-20 11:42:24.823405857 +0000 UTC m=+20.640667979" watchObservedRunningTime="2026-04-20 11:42:24.8234132 +0000 UTC m=+20.640675322" Apr 20 11:42:24.839767 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:24.839727 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-rr7qt" podStartSLOduration=4.278161499 podStartE2EDuration="20.839710462s" podCreationTimestamp="2026-04-20 11:42:04 +0000 UTC" firstStartedPulling="2026-04-20 11:42:07.391884736 +0000 UTC m=+3.209146855" lastFinishedPulling="2026-04-20 11:42:23.953433704 +0000 UTC m=+19.770695818" observedRunningTime="2026-04-20 11:42:24.839417465 +0000 UTC m=+20.656679586" watchObservedRunningTime="2026-04-20 11:42:24.839710462 +0000 UTC m=+20.656972584" Apr 20 11:42:24.853945 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:24.853905 2564 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-229kk" podStartSLOduration=10.853892081 podStartE2EDuration="10.853892081s" podCreationTimestamp="2026-04-20 11:42:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 11:42:24.853667258 +0000 UTC m=+20.670929380" watchObservedRunningTime="2026-04-20 11:42:24.853892081 +0000 UTC m=+20.671154202" Apr 20 11:42:24.873628 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:24.873581 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-clqlc" podStartSLOduration=4.170096992 podStartE2EDuration="20.873570871s" podCreationTimestamp="2026-04-20 11:42:04 +0000 UTC" firstStartedPulling="2026-04-20 11:42:07.424403028 +0000 UTC m=+3.241665140" lastFinishedPulling="2026-04-20 11:42:24.127876915 +0000 UTC m=+19.945139019" observedRunningTime="2026-04-20 11:42:24.873166664 +0000 UTC m=+20.690428786" watchObservedRunningTime="2026-04-20 11:42:24.873570871 +0000 UTC m=+20.690832992" Apr 20 11:42:24.891228 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:24.891006 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-vs4h8" podStartSLOduration=4.361955615 podStartE2EDuration="20.890994418s" podCreationTimestamp="2026-04-20 11:42:04 +0000 UTC" firstStartedPulling="2026-04-20 11:42:07.424306729 +0000 UTC m=+3.241568832" lastFinishedPulling="2026-04-20 11:42:23.953345524 +0000 UTC m=+19.770607635" observedRunningTime="2026-04-20 11:42:24.890825835 +0000 UTC m=+20.708087956" watchObservedRunningTime="2026-04-20 11:42:24.890994418 +0000 UTC m=+20.708256530" Apr 20 11:42:25.696751 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:25.696720 2564 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" 
Apr 20 11:42:25.787423 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:25.787389 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-flxvk" event={"ID":"b0190d50-a6ca-42f6-b113-2acc79ed7c3e","Type":"ContainerStarted","Data":"9e7a2e4088ab752668f7970cc38441801b5d9ac0a313f17fce22e0220f6e65cb"} Apr 20 11:42:25.790274 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:25.790250 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h966s_fb6400e5-c705-43eb-867b-42fa8fc8a750/ovn-acl-logging/0.log" Apr 20 11:42:25.790737 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:25.790705 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h966s" event={"ID":"fb6400e5-c705-43eb-867b-42fa8fc8a750","Type":"ContainerStarted","Data":"312b098850621f7b2184b2f7a21d302f3eec023c71cd92f3f4e64d4a44fb3854"} Apr 20 11:42:25.790812 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:25.790746 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h966s" event={"ID":"fb6400e5-c705-43eb-867b-42fa8fc8a750","Type":"ContainerStarted","Data":"2bd95c4f38fd626569db8bfde56c5eceba0e5ca500cdabe84d4c7c11092340cf"} Apr 20 11:42:25.790812 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:25.790761 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h966s" event={"ID":"fb6400e5-c705-43eb-867b-42fa8fc8a750","Type":"ContainerStarted","Data":"67827aa194f81653d225d685c6544455988e9e3f3ab7c71eccba5a6b62e8c1d9"} Apr 20 11:42:25.792202 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:25.792169 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-kh22r" event={"ID":"a66e511e-e6c2-4ef0-b401-588d90851a5e","Type":"ContainerStarted","Data":"99fe6ed20f5df816bbc9fc209b8379976268189534d47086f20e25d8c1760094"} Apr 20 11:42:25.806839 ip-10-0-135-68 
kubenswrapper[2564]: I0420 11:42:25.806795 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-kh22r" podStartSLOduration=5.09208859 podStartE2EDuration="21.806782288s" podCreationTimestamp="2026-04-20 11:42:04 +0000 UTC" firstStartedPulling="2026-04-20 11:42:07.3960936 +0000 UTC m=+3.213355715" lastFinishedPulling="2026-04-20 11:42:24.110787309 +0000 UTC m=+19.928049413" observedRunningTime="2026-04-20 11:42:25.806493195 +0000 UTC m=+21.623755318" watchObservedRunningTime="2026-04-20 11:42:25.806782288 +0000 UTC m=+21.624044410" Apr 20 11:42:26.658855 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:26.658739 2564 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T11:42:25.696744005Z","UUID":"61eebc53-9d1f-4ade-b15a-fbc77513213f","Handler":null,"Name":"","Endpoint":""} Apr 20 11:42:26.662092 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:26.662068 2564 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 11:42:26.662092 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:26.662099 2564 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 11:42:26.705393 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:26.705360 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dtzrg" Apr 20 11:42:26.705576 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:26.705366 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x94v" Apr 20 11:42:26.705576 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:26.705470 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dtzrg" podUID="5ba5cc41-47d4-412c-9ba2-40b85e919af7" Apr 20 11:42:26.705690 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:26.705592 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x94v" podUID="e519aca4-e5e7-4760-91b9-8488165051df" Apr 20 11:42:27.798632 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:27.798351 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-flxvk" event={"ID":"b0190d50-a6ca-42f6-b113-2acc79ed7c3e","Type":"ContainerStarted","Data":"021c9bbaa7fffda3240390c39bd7e68a40ba518f0230d8259eeb3dcdee3c03c6"} Apr 20 11:42:27.801151 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:27.801126 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h966s_fb6400e5-c705-43eb-867b-42fa8fc8a750/ovn-acl-logging/0.log" Apr 20 11:42:27.801476 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:27.801455 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h966s" event={"ID":"fb6400e5-c705-43eb-867b-42fa8fc8a750","Type":"ContainerStarted","Data":"ad17fc868d933e8806f1f00f2ad646c5e5c05b3d35c1a8b6b3746317658a26c9"} Apr 20 
11:42:27.817624 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:27.817576 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-flxvk" podStartSLOduration=4.416954268 podStartE2EDuration="23.817560605s" podCreationTimestamp="2026-04-20 11:42:04 +0000 UTC" firstStartedPulling="2026-04-20 11:42:07.424325204 +0000 UTC m=+3.241587306" lastFinishedPulling="2026-04-20 11:42:26.824931529 +0000 UTC m=+22.642193643" observedRunningTime="2026-04-20 11:42:27.817328376 +0000 UTC m=+23.634590497" watchObservedRunningTime="2026-04-20 11:42:27.817560605 +0000 UTC m=+23.634822727" Apr 20 11:42:27.949908 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:27.949870 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-vs4h8" Apr 20 11:42:27.950773 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:27.950751 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-vs4h8" Apr 20 11:42:28.704979 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:28.704942 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dtzrg" Apr 20 11:42:28.705152 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:28.705055 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dtzrg" podUID="5ba5cc41-47d4-412c-9ba2-40b85e919af7" Apr 20 11:42:28.705152 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:28.705097 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x94v" Apr 20 11:42:28.705152 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:28.705150 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x94v" podUID="e519aca4-e5e7-4760-91b9-8488165051df" Apr 20 11:42:28.803220 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:28.803185 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-vs4h8" Apr 20 11:42:28.803738 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:28.803574 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-vs4h8" Apr 20 11:42:29.807222 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:29.807056 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h966s_fb6400e5-c705-43eb-867b-42fa8fc8a750/ovn-acl-logging/0.log" Apr 20 11:42:29.807918 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:29.807514 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h966s" event={"ID":"fb6400e5-c705-43eb-867b-42fa8fc8a750","Type":"ContainerStarted","Data":"a71d7016987f99e3ddc345f24ea2843c28f19a58dd18353dfd93c5b99fc143e4"} Apr 20 11:42:29.807918 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:29.807887 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:29.807918 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:29.807910 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:29.808086 ip-10-0-135-68 
kubenswrapper[2564]: I0420 11:42:29.807989 2564 scope.go:117] "RemoveContainer" containerID="f0d61f9e855f347d9c98e738a3e5a15ab620b87c5d3d77cbaaee57b3abebecff" Apr 20 11:42:29.809152 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:29.809125 2564 generic.go:358] "Generic (PLEG): container finished" podID="17e25242-c8c6-4d80-a45e-22b47b30d8f0" containerID="8145a50eccace22b8c49ce63b7247cdc91d7ca7c36a5b02489d553485e763d05" exitCode=0 Apr 20 11:42:29.809241 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:29.809202 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-494p8" event={"ID":"17e25242-c8c6-4d80-a45e-22b47b30d8f0","Type":"ContainerDied","Data":"8145a50eccace22b8c49ce63b7247cdc91d7ca7c36a5b02489d553485e763d05"} Apr 20 11:42:29.823429 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:29.823409 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:30.707737 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:30.707706 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dtzrg" Apr 20 11:42:30.707881 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:30.707706 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x94v" Apr 20 11:42:30.707881 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:30.707815 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dtzrg" podUID="5ba5cc41-47d4-412c-9ba2-40b85e919af7" Apr 20 11:42:30.707996 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:30.707880 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x94v" podUID="e519aca4-e5e7-4760-91b9-8488165051df" Apr 20 11:42:30.815311 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:30.815288 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h966s_fb6400e5-c705-43eb-867b-42fa8fc8a750/ovn-acl-logging/0.log" Apr 20 11:42:30.815703 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:30.815640 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h966s" event={"ID":"fb6400e5-c705-43eb-867b-42fa8fc8a750","Type":"ContainerStarted","Data":"6702bcc9808e3dc4e2e7c43ded5a2be10bb8bf0772420481a20d1e690ee1e142"} Apr 20 11:42:30.816075 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:30.816047 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:30.829989 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:30.829951 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-h966s" Apr 20 11:42:30.883347 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:30.883302 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-h966s" podStartSLOduration=10.123844513 podStartE2EDuration="26.883288032s" podCreationTimestamp="2026-04-20 11:42:04 +0000 UTC" firstStartedPulling="2026-04-20 11:42:07.397584477 +0000 UTC m=+3.214846577" 
lastFinishedPulling="2026-04-20 11:42:24.157027984 +0000 UTC m=+19.974290096" observedRunningTime="2026-04-20 11:42:30.852656342 +0000 UTC m=+26.669918464" watchObservedRunningTime="2026-04-20 11:42:30.883288032 +0000 UTC m=+26.700550166" Apr 20 11:42:31.269779 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:31.269749 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-dtzrg"] Apr 20 11:42:31.269991 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:31.269883 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dtzrg" Apr 20 11:42:31.270054 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:31.269998 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dtzrg" podUID="5ba5cc41-47d4-412c-9ba2-40b85e919af7" Apr 20 11:42:31.270452 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:31.270387 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2x94v"] Apr 20 11:42:31.270566 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:31.270510 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x94v" Apr 20 11:42:31.270640 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:31.270617 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2x94v" podUID="e519aca4-e5e7-4760-91b9-8488165051df" Apr 20 11:42:31.819264 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:31.819229 2564 generic.go:358] "Generic (PLEG): container finished" podID="17e25242-c8c6-4d80-a45e-22b47b30d8f0" containerID="e9d6c46236bdd6b69bab360ba1312e26c2fbecd1100471fe4a6b479be6b5f0cf" exitCode=0 Apr 20 11:42:31.819676 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:31.819318 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-494p8" event={"ID":"17e25242-c8c6-4d80-a45e-22b47b30d8f0","Type":"ContainerDied","Data":"e9d6c46236bdd6b69bab360ba1312e26c2fbecd1100471fe4a6b479be6b5f0cf"} Apr 20 11:42:32.707498 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:32.707320 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x94v" Apr 20 11:42:32.707652 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:32.707365 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dtzrg" Apr 20 11:42:32.707652 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:32.707589 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x94v" podUID="e519aca4-e5e7-4760-91b9-8488165051df" Apr 20 11:42:32.707652 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:32.707622 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dtzrg" podUID="5ba5cc41-47d4-412c-9ba2-40b85e919af7" Apr 20 11:42:33.824411 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:33.824370 2564 generic.go:358] "Generic (PLEG): container finished" podID="17e25242-c8c6-4d80-a45e-22b47b30d8f0" containerID="3ce61b397e54c844cf6ec6758eddcd93098f682b1504e151a1716c62c84bb9cc" exitCode=0 Apr 20 11:42:33.824736 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:33.824414 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-494p8" event={"ID":"17e25242-c8c6-4d80-a45e-22b47b30d8f0","Type":"ContainerDied","Data":"3ce61b397e54c844cf6ec6758eddcd93098f682b1504e151a1716c62c84bb9cc"} Apr 20 11:42:34.705724 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:34.705689 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x94v" Apr 20 11:42:34.705906 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:34.705794 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x94v" podUID="e519aca4-e5e7-4760-91b9-8488165051df" Apr 20 11:42:34.705906 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:34.705881 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dtzrg" Apr 20 11:42:34.706035 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:34.706010 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dtzrg" podUID="5ba5cc41-47d4-412c-9ba2-40b85e919af7" Apr 20 11:42:36.708667 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:36.708634 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x94v" Apr 20 11:42:36.709213 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:36.708636 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dtzrg" Apr 20 11:42:36.709213 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:36.708788 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2x94v" podUID="e519aca4-e5e7-4760-91b9-8488165051df" Apr 20 11:42:36.709213 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:36.708824 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dtzrg" podUID="5ba5cc41-47d4-412c-9ba2-40b85e919af7" Apr 20 11:42:37.042304 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.042274 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-68.ec2.internal" event="NodeReady" Apr 20 11:42:37.042452 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.042395 2564 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 11:42:37.093185 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.093153 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-kj8zj"] Apr 20 11:42:37.117392 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.117363 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8wx4k"] Apr 20 11:42:37.117537 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.117507 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-kj8zj" Apr 20 11:42:37.120476 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.120380 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 11:42:37.120476 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.120408 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 11:42:37.120648 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.120499 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-5crpn\"" Apr 20 11:42:37.133696 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.133675 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-kj8zj"] Apr 20 11:42:37.133772 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.133707 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8wx4k"] 
Apr 20 11:42:37.133772 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.133732 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8wx4k" Apr 20 11:42:37.136452 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.136424 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-twcdr\"" Apr 20 11:42:37.136452 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.136441 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 11:42:37.136452 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.136449 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 11:42:37.136747 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.136733 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 11:42:37.237202 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.237171 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbfhc\" (UniqueName: \"kubernetes.io/projected/8ce36e9b-db64-4394-94fd-daf59b2e178f-kube-api-access-pbfhc\") pod \"dns-default-kj8zj\" (UID: \"8ce36e9b-db64-4394-94fd-daf59b2e178f\") " pod="openshift-dns/dns-default-kj8zj" Apr 20 11:42:37.237202 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.237208 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ce36e9b-db64-4394-94fd-daf59b2e178f-config-volume\") pod \"dns-default-kj8zj\" (UID: \"8ce36e9b-db64-4394-94fd-daf59b2e178f\") " pod="openshift-dns/dns-default-kj8zj" Apr 20 11:42:37.237395 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.237283 2564 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8ce36e9b-db64-4394-94fd-daf59b2e178f-tmp-dir\") pod \"dns-default-kj8zj\" (UID: \"8ce36e9b-db64-4394-94fd-daf59b2e178f\") " pod="openshift-dns/dns-default-kj8zj" Apr 20 11:42:37.237395 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.237306 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e81b43aa-165a-4aaf-bced-4a43a6902437-cert\") pod \"ingress-canary-8wx4k\" (UID: \"e81b43aa-165a-4aaf-bced-4a43a6902437\") " pod="openshift-ingress-canary/ingress-canary-8wx4k" Apr 20 11:42:37.237395 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.237342 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdckj\" (UniqueName: \"kubernetes.io/projected/e81b43aa-165a-4aaf-bced-4a43a6902437-kube-api-access-rdckj\") pod \"ingress-canary-8wx4k\" (UID: \"e81b43aa-165a-4aaf-bced-4a43a6902437\") " pod="openshift-ingress-canary/ingress-canary-8wx4k" Apr 20 11:42:37.237395 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.237379 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8ce36e9b-db64-4394-94fd-daf59b2e178f-metrics-tls\") pod \"dns-default-kj8zj\" (UID: \"8ce36e9b-db64-4394-94fd-daf59b2e178f\") " pod="openshift-dns/dns-default-kj8zj" Apr 20 11:42:37.337981 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.337919 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rdckj\" (UniqueName: \"kubernetes.io/projected/e81b43aa-165a-4aaf-bced-4a43a6902437-kube-api-access-rdckj\") pod \"ingress-canary-8wx4k\" (UID: \"e81b43aa-165a-4aaf-bced-4a43a6902437\") " pod="openshift-ingress-canary/ingress-canary-8wx4k" Apr 20 11:42:37.338168 
ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.338040 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8ce36e9b-db64-4394-94fd-daf59b2e178f-metrics-tls\") pod \"dns-default-kj8zj\" (UID: \"8ce36e9b-db64-4394-94fd-daf59b2e178f\") " pod="openshift-dns/dns-default-kj8zj" Apr 20 11:42:37.338168 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.338071 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pbfhc\" (UniqueName: \"kubernetes.io/projected/8ce36e9b-db64-4394-94fd-daf59b2e178f-kube-api-access-pbfhc\") pod \"dns-default-kj8zj\" (UID: \"8ce36e9b-db64-4394-94fd-daf59b2e178f\") " pod="openshift-dns/dns-default-kj8zj" Apr 20 11:42:37.338168 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.338090 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ce36e9b-db64-4394-94fd-daf59b2e178f-config-volume\") pod \"dns-default-kj8zj\" (UID: \"8ce36e9b-db64-4394-94fd-daf59b2e178f\") " pod="openshift-dns/dns-default-kj8zj" Apr 20 11:42:37.338168 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.338161 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8ce36e9b-db64-4394-94fd-daf59b2e178f-tmp-dir\") pod \"dns-default-kj8zj\" (UID: \"8ce36e9b-db64-4394-94fd-daf59b2e178f\") " pod="openshift-dns/dns-default-kj8zj" Apr 20 11:42:37.338359 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.338185 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e81b43aa-165a-4aaf-bced-4a43a6902437-cert\") pod \"ingress-canary-8wx4k\" (UID: \"e81b43aa-165a-4aaf-bced-4a43a6902437\") " pod="openshift-ingress-canary/ingress-canary-8wx4k" Apr 20 11:42:37.338359 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:37.338189 2564 
secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 11:42:37.338359 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:37.338249 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ce36e9b-db64-4394-94fd-daf59b2e178f-metrics-tls podName:8ce36e9b-db64-4394-94fd-daf59b2e178f nodeName:}" failed. No retries permitted until 2026-04-20 11:42:37.83822953 +0000 UTC m=+33.655491648 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8ce36e9b-db64-4394-94fd-daf59b2e178f-metrics-tls") pod "dns-default-kj8zj" (UID: "8ce36e9b-db64-4394-94fd-daf59b2e178f") : secret "dns-default-metrics-tls" not found Apr 20 11:42:37.338359 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:37.338283 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 11:42:37.338359 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:37.338343 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e81b43aa-165a-4aaf-bced-4a43a6902437-cert podName:e81b43aa-165a-4aaf-bced-4a43a6902437 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:37.838331594 +0000 UTC m=+33.655593709 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e81b43aa-165a-4aaf-bced-4a43a6902437-cert") pod "ingress-canary-8wx4k" (UID: "e81b43aa-165a-4aaf-bced-4a43a6902437") : secret "canary-serving-cert" not found Apr 20 11:42:37.338603 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.338526 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8ce36e9b-db64-4394-94fd-daf59b2e178f-tmp-dir\") pod \"dns-default-kj8zj\" (UID: \"8ce36e9b-db64-4394-94fd-daf59b2e178f\") " pod="openshift-dns/dns-default-kj8zj" Apr 20 11:42:37.338769 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.338743 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ce36e9b-db64-4394-94fd-daf59b2e178f-config-volume\") pod \"dns-default-kj8zj\" (UID: \"8ce36e9b-db64-4394-94fd-daf59b2e178f\") " pod="openshift-dns/dns-default-kj8zj" Apr 20 11:42:37.354637 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.354606 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdckj\" (UniqueName: \"kubernetes.io/projected/e81b43aa-165a-4aaf-bced-4a43a6902437-kube-api-access-rdckj\") pod \"ingress-canary-8wx4k\" (UID: \"e81b43aa-165a-4aaf-bced-4a43a6902437\") " pod="openshift-ingress-canary/ingress-canary-8wx4k" Apr 20 11:42:37.357295 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.357259 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbfhc\" (UniqueName: \"kubernetes.io/projected/8ce36e9b-db64-4394-94fd-daf59b2e178f-kube-api-access-pbfhc\") pod \"dns-default-kj8zj\" (UID: \"8ce36e9b-db64-4394-94fd-daf59b2e178f\") " pod="openshift-dns/dns-default-kj8zj" Apr 20 11:42:37.638990 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.638880 2564 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f688f4b59-tbncx"] Apr 20 11:42:37.666104 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.666066 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bc559486b-bgmwj"] Apr 20 11:42:37.666258 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.666226 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f688f4b59-tbncx" Apr 20 11:42:37.670534 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.670505 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 20 11:42:37.670853 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.670832 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 20 11:42:37.670983 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.670895 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 20 11:42:37.672013 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.671985 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 20 11:42:37.672116 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.672099 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-ss2hf\"" Apr 20 11:42:37.679858 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.679831 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f688f4b59-tbncx"] Apr 20 
11:42:37.679858 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.679870 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bc559486b-bgmwj"] Apr 20 11:42:37.680073 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.680002 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bc559486b-bgmwj" Apr 20 11:42:37.684859 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.684824 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 20 11:42:37.685147 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.685130 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 20 11:42:37.685743 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.685727 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 20 11:42:37.685874 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.685857 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 20 11:42:37.741033 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.741000 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rkdv\" (UniqueName: \"kubernetes.io/projected/1a2ffa64-a1ed-4ff8-a999-d03733bb9da4-kube-api-access-6rkdv\") pod \"managed-serviceaccount-addon-agent-f688f4b59-tbncx\" (UID: \"1a2ffa64-a1ed-4ff8-a999-d03733bb9da4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f688f4b59-tbncx" Apr 20 11:42:37.741033 ip-10-0-135-68 
kubenswrapper[2564]: I0420 11:42:37.741035 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1a2ffa64-a1ed-4ff8-a999-d03733bb9da4-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-f688f4b59-tbncx\" (UID: \"1a2ffa64-a1ed-4ff8-a999-d03733bb9da4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f688f4b59-tbncx" Apr 20 11:42:37.841991 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.841937 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/25c9df4d-8145-4f9d-9f3c-484b02ad0811-ca\") pod \"cluster-proxy-proxy-agent-bc559486b-bgmwj\" (UID: \"25c9df4d-8145-4f9d-9f3c-484b02ad0811\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bc559486b-bgmwj" Apr 20 11:42:37.842202 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.842010 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/25c9df4d-8145-4f9d-9f3c-484b02ad0811-hub\") pod \"cluster-proxy-proxy-agent-bc559486b-bgmwj\" (UID: \"25c9df4d-8145-4f9d-9f3c-484b02ad0811\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bc559486b-bgmwj" Apr 20 11:42:37.842202 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.842069 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e81b43aa-165a-4aaf-bced-4a43a6902437-cert\") pod \"ingress-canary-8wx4k\" (UID: \"e81b43aa-165a-4aaf-bced-4a43a6902437\") " pod="openshift-ingress-canary/ingress-canary-8wx4k" Apr 20 11:42:37.842202 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.842097 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: 
\"kubernetes.io/configmap/25c9df4d-8145-4f9d-9f3c-484b02ad0811-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-bc559486b-bgmwj\" (UID: \"25c9df4d-8145-4f9d-9f3c-484b02ad0811\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bc559486b-bgmwj" Apr 20 11:42:37.842202 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.842123 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb6hs\" (UniqueName: \"kubernetes.io/projected/25c9df4d-8145-4f9d-9f3c-484b02ad0811-kube-api-access-sb6hs\") pod \"cluster-proxy-proxy-agent-bc559486b-bgmwj\" (UID: \"25c9df4d-8145-4f9d-9f3c-484b02ad0811\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bc559486b-bgmwj" Apr 20 11:42:37.842202 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:37.842194 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 11:42:37.842464 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.842195 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/25c9df4d-8145-4f9d-9f3c-484b02ad0811-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-bc559486b-bgmwj\" (UID: \"25c9df4d-8145-4f9d-9f3c-484b02ad0811\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bc559486b-bgmwj" Apr 20 11:42:37.842464 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.842263 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8ce36e9b-db64-4394-94fd-daf59b2e178f-metrics-tls\") pod \"dns-default-kj8zj\" (UID: \"8ce36e9b-db64-4394-94fd-daf59b2e178f\") " pod="openshift-dns/dns-default-kj8zj" Apr 20 11:42:37.842464 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:37.842296 2564 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/e81b43aa-165a-4aaf-bced-4a43a6902437-cert podName:e81b43aa-165a-4aaf-bced-4a43a6902437 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:38.842261933 +0000 UTC m=+34.659524034 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e81b43aa-165a-4aaf-bced-4a43a6902437-cert") pod "ingress-canary-8wx4k" (UID: "e81b43aa-165a-4aaf-bced-4a43a6902437") : secret "canary-serving-cert" not found Apr 20 11:42:37.842464 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:37.842349 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 11:42:37.842464 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:37.842395 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ce36e9b-db64-4394-94fd-daf59b2e178f-metrics-tls podName:8ce36e9b-db64-4394-94fd-daf59b2e178f nodeName:}" failed. No retries permitted until 2026-04-20 11:42:38.842380255 +0000 UTC m=+34.659642368 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8ce36e9b-db64-4394-94fd-daf59b2e178f-metrics-tls") pod "dns-default-kj8zj" (UID: "8ce36e9b-db64-4394-94fd-daf59b2e178f") : secret "dns-default-metrics-tls" not found Apr 20 11:42:37.842464 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.842417 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6rkdv\" (UniqueName: \"kubernetes.io/projected/1a2ffa64-a1ed-4ff8-a999-d03733bb9da4-kube-api-access-6rkdv\") pod \"managed-serviceaccount-addon-agent-f688f4b59-tbncx\" (UID: \"1a2ffa64-a1ed-4ff8-a999-d03733bb9da4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f688f4b59-tbncx" Apr 20 11:42:37.842464 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.842447 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1a2ffa64-a1ed-4ff8-a999-d03733bb9da4-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-f688f4b59-tbncx\" (UID: \"1a2ffa64-a1ed-4ff8-a999-d03733bb9da4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f688f4b59-tbncx" Apr 20 11:42:37.842806 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.842475 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/25c9df4d-8145-4f9d-9f3c-484b02ad0811-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-bc559486b-bgmwj\" (UID: \"25c9df4d-8145-4f9d-9f3c-484b02ad0811\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bc559486b-bgmwj" Apr 20 11:42:37.845720 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.845690 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1a2ffa64-a1ed-4ff8-a999-d03733bb9da4-hub-kubeconfig\") pod 
\"managed-serviceaccount-addon-agent-f688f4b59-tbncx\" (UID: \"1a2ffa64-a1ed-4ff8-a999-d03733bb9da4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f688f4b59-tbncx" Apr 20 11:42:37.851556 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.851524 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rkdv\" (UniqueName: \"kubernetes.io/projected/1a2ffa64-a1ed-4ff8-a999-d03733bb9da4-kube-api-access-6rkdv\") pod \"managed-serviceaccount-addon-agent-f688f4b59-tbncx\" (UID: \"1a2ffa64-a1ed-4ff8-a999-d03733bb9da4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f688f4b59-tbncx" Apr 20 11:42:37.943451 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.943373 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/25c9df4d-8145-4f9d-9f3c-484b02ad0811-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-bc559486b-bgmwj\" (UID: \"25c9df4d-8145-4f9d-9f3c-484b02ad0811\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bc559486b-bgmwj" Apr 20 11:42:37.943451 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.943408 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sb6hs\" (UniqueName: \"kubernetes.io/projected/25c9df4d-8145-4f9d-9f3c-484b02ad0811-kube-api-access-sb6hs\") pod \"cluster-proxy-proxy-agent-bc559486b-bgmwj\" (UID: \"25c9df4d-8145-4f9d-9f3c-484b02ad0811\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bc559486b-bgmwj" Apr 20 11:42:37.943668 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.943470 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/25c9df4d-8145-4f9d-9f3c-484b02ad0811-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-bc559486b-bgmwj\" (UID: 
\"25c9df4d-8145-4f9d-9f3c-484b02ad0811\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bc559486b-bgmwj" Apr 20 11:42:37.943668 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.943513 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/25c9df4d-8145-4f9d-9f3c-484b02ad0811-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-bc559486b-bgmwj\" (UID: \"25c9df4d-8145-4f9d-9f3c-484b02ad0811\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bc559486b-bgmwj" Apr 20 11:42:37.943668 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.943556 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/25c9df4d-8145-4f9d-9f3c-484b02ad0811-ca\") pod \"cluster-proxy-proxy-agent-bc559486b-bgmwj\" (UID: \"25c9df4d-8145-4f9d-9f3c-484b02ad0811\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bc559486b-bgmwj" Apr 20 11:42:37.943668 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.943581 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/25c9df4d-8145-4f9d-9f3c-484b02ad0811-hub\") pod \"cluster-proxy-proxy-agent-bc559486b-bgmwj\" (UID: \"25c9df4d-8145-4f9d-9f3c-484b02ad0811\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bc559486b-bgmwj" Apr 20 11:42:37.944198 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.944178 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/25c9df4d-8145-4f9d-9f3c-484b02ad0811-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-bc559486b-bgmwj\" (UID: \"25c9df4d-8145-4f9d-9f3c-484b02ad0811\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bc559486b-bgmwj" Apr 20 11:42:37.946088 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.946067 2564 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/25c9df4d-8145-4f9d-9f3c-484b02ad0811-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-bc559486b-bgmwj\" (UID: \"25c9df4d-8145-4f9d-9f3c-484b02ad0811\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bc559486b-bgmwj" Apr 20 11:42:37.952586 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.952564 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb6hs\" (UniqueName: \"kubernetes.io/projected/25c9df4d-8145-4f9d-9f3c-484b02ad0811-kube-api-access-sb6hs\") pod \"cluster-proxy-proxy-agent-bc559486b-bgmwj\" (UID: \"25c9df4d-8145-4f9d-9f3c-484b02ad0811\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bc559486b-bgmwj" Apr 20 11:42:37.953808 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.953780 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/25c9df4d-8145-4f9d-9f3c-484b02ad0811-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-bc559486b-bgmwj\" (UID: \"25c9df4d-8145-4f9d-9f3c-484b02ad0811\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bc559486b-bgmwj" Apr 20 11:42:37.954041 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.954020 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/25c9df4d-8145-4f9d-9f3c-484b02ad0811-hub\") pod \"cluster-proxy-proxy-agent-bc559486b-bgmwj\" (UID: \"25c9df4d-8145-4f9d-9f3c-484b02ad0811\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bc559486b-bgmwj" Apr 20 11:42:37.954103 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.954076 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/25c9df4d-8145-4f9d-9f3c-484b02ad0811-ca\") pod 
\"cluster-proxy-proxy-agent-bc559486b-bgmwj\" (UID: \"25c9df4d-8145-4f9d-9f3c-484b02ad0811\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bc559486b-bgmwj" Apr 20 11:42:37.992188 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:37.992090 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f688f4b59-tbncx" Apr 20 11:42:38.001060 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:38.001035 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bc559486b-bgmwj" Apr 20 11:42:38.346658 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:38.346614 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e519aca4-e5e7-4760-91b9-8488165051df-metrics-certs\") pod \"network-metrics-daemon-2x94v\" (UID: \"e519aca4-e5e7-4760-91b9-8488165051df\") " pod="openshift-multus/network-metrics-daemon-2x94v" Apr 20 11:42:38.346845 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:38.346792 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 11:42:38.346899 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:38.346878 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e519aca4-e5e7-4760-91b9-8488165051df-metrics-certs podName:e519aca4-e5e7-4760-91b9-8488165051df nodeName:}" failed. No retries permitted until 2026-04-20 11:43:10.346857061 +0000 UTC m=+66.164119161 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e519aca4-e5e7-4760-91b9-8488165051df-metrics-certs") pod "network-metrics-daemon-2x94v" (UID: "e519aca4-e5e7-4760-91b9-8488165051df") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 11:42:38.447371 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:38.447337 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c72s5\" (UniqueName: \"kubernetes.io/projected/5ba5cc41-47d4-412c-9ba2-40b85e919af7-kube-api-access-c72s5\") pod \"network-check-target-dtzrg\" (UID: \"5ba5cc41-47d4-412c-9ba2-40b85e919af7\") " pod="openshift-network-diagnostics/network-check-target-dtzrg" Apr 20 11:42:38.447576 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:38.447518 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 11:42:38.447576 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:38.447547 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 11:42:38.447576 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:38.447562 2564 projected.go:194] Error preparing data for projected volume kube-api-access-c72s5 for pod openshift-network-diagnostics/network-check-target-dtzrg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 11:42:38.447794 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:38.447623 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ba5cc41-47d4-412c-9ba2-40b85e919af7-kube-api-access-c72s5 podName:5ba5cc41-47d4-412c-9ba2-40b85e919af7 nodeName:}" failed. 
No retries permitted until 2026-04-20 11:43:10.447609364 +0000 UTC m=+66.264871468 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-c72s5" (UniqueName: "kubernetes.io/projected/5ba5cc41-47d4-412c-9ba2-40b85e919af7-kube-api-access-c72s5") pod "network-check-target-dtzrg" (UID: "5ba5cc41-47d4-412c-9ba2-40b85e919af7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 11:42:38.704956 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:38.704865 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x94v" Apr 20 11:42:38.704956 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:38.704901 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dtzrg" Apr 20 11:42:38.708066 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:38.708044 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 11:42:38.708558 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:38.708538 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 11:42:38.709415 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:38.709397 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 11:42:38.709542 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:38.709524 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lppl4\"" Apr 20 11:42:38.709872 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:38.709857 2564 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-dmnrq\"" Apr 20 11:42:38.850635 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:38.850597 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8ce36e9b-db64-4394-94fd-daf59b2e178f-metrics-tls\") pod \"dns-default-kj8zj\" (UID: \"8ce36e9b-db64-4394-94fd-daf59b2e178f\") " pod="openshift-dns/dns-default-kj8zj" Apr 20 11:42:38.851160 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:38.850713 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e81b43aa-165a-4aaf-bced-4a43a6902437-cert\") pod \"ingress-canary-8wx4k\" (UID: \"e81b43aa-165a-4aaf-bced-4a43a6902437\") " pod="openshift-ingress-canary/ingress-canary-8wx4k" Apr 20 11:42:38.851160 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:38.850776 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 11:42:38.851160 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:38.850808 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 11:42:38.851160 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:38.850845 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ce36e9b-db64-4394-94fd-daf59b2e178f-metrics-tls podName:8ce36e9b-db64-4394-94fd-daf59b2e178f nodeName:}" failed. No retries permitted until 2026-04-20 11:42:40.850826569 +0000 UTC m=+36.668088690 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8ce36e9b-db64-4394-94fd-daf59b2e178f-metrics-tls") pod "dns-default-kj8zj" (UID: "8ce36e9b-db64-4394-94fd-daf59b2e178f") : secret "dns-default-metrics-tls" not found Apr 20 11:42:38.851160 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:38.850864 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e81b43aa-165a-4aaf-bced-4a43a6902437-cert podName:e81b43aa-165a-4aaf-bced-4a43a6902437 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:40.850854591 +0000 UTC m=+36.668116693 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e81b43aa-165a-4aaf-bced-4a43a6902437-cert") pod "ingress-canary-8wx4k" (UID: "e81b43aa-165a-4aaf-bced-4a43a6902437") : secret "canary-serving-cert" not found Apr 20 11:42:39.564042 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:39.563842 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f688f4b59-tbncx"] Apr 20 11:42:39.567718 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:39.567693 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bc559486b-bgmwj"] Apr 20 11:42:39.568357 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:39.568332 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a2ffa64_a1ed_4ff8_a999_d03733bb9da4.slice/crio-684d473d40914a496b754e556c170929545cdfe4435172ee2b04de1fb769313d WatchSource:0}: Error finding container 684d473d40914a496b754e556c170929545cdfe4435172ee2b04de1fb769313d: Status 404 returned error can't find the container with id 684d473d40914a496b754e556c170929545cdfe4435172ee2b04de1fb769313d Apr 20 11:42:39.741014 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:39.740978 2564 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25c9df4d_8145_4f9d_9f3c_484b02ad0811.slice/crio-3e7e08ac0b23e827aa58ff0670edf422ef596544dc43bd36ff7b03abffa8ae52 WatchSource:0}: Error finding container 3e7e08ac0b23e827aa58ff0670edf422ef596544dc43bd36ff7b03abffa8ae52: Status 404 returned error can't find the container with id 3e7e08ac0b23e827aa58ff0670edf422ef596544dc43bd36ff7b03abffa8ae52
Apr 20 11:42:39.836812 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:39.836776 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f688f4b59-tbncx" event={"ID":"1a2ffa64-a1ed-4ff8-a999-d03733bb9da4","Type":"ContainerStarted","Data":"684d473d40914a496b754e556c170929545cdfe4435172ee2b04de1fb769313d"}
Apr 20 11:42:39.837789 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:39.837761 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bc559486b-bgmwj" event={"ID":"25c9df4d-8145-4f9d-9f3c-484b02ad0811","Type":"ContainerStarted","Data":"3e7e08ac0b23e827aa58ff0670edf422ef596544dc43bd36ff7b03abffa8ae52"}
Apr 20 11:42:40.843859 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:40.843821 2564 generic.go:358] "Generic (PLEG): container finished" podID="17e25242-c8c6-4d80-a45e-22b47b30d8f0" containerID="82a7554cad8feed0f4e0886afbcc814fde4b85d661610a1aa8610afeb85b7c43" exitCode=0
Apr 20 11:42:40.844312 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:40.843869 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-494p8" event={"ID":"17e25242-c8c6-4d80-a45e-22b47b30d8f0","Type":"ContainerDied","Data":"82a7554cad8feed0f4e0886afbcc814fde4b85d661610a1aa8610afeb85b7c43"}
Apr 20 11:42:40.864436 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:40.864395 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e81b43aa-165a-4aaf-bced-4a43a6902437-cert\") pod \"ingress-canary-8wx4k\" (UID: \"e81b43aa-165a-4aaf-bced-4a43a6902437\") " pod="openshift-ingress-canary/ingress-canary-8wx4k"
Apr 20 11:42:40.864574 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:40.864470 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8ce36e9b-db64-4394-94fd-daf59b2e178f-metrics-tls\") pod \"dns-default-kj8zj\" (UID: \"8ce36e9b-db64-4394-94fd-daf59b2e178f\") " pod="openshift-dns/dns-default-kj8zj"
Apr 20 11:42:40.864646 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:40.864583 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 11:42:40.864704 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:40.864645 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ce36e9b-db64-4394-94fd-daf59b2e178f-metrics-tls podName:8ce36e9b-db64-4394-94fd-daf59b2e178f nodeName:}" failed. No retries permitted until 2026-04-20 11:42:44.864624505 +0000 UTC m=+40.681886606 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8ce36e9b-db64-4394-94fd-daf59b2e178f-metrics-tls") pod "dns-default-kj8zj" (UID: "8ce36e9b-db64-4394-94fd-daf59b2e178f") : secret "dns-default-metrics-tls" not found
Apr 20 11:42:40.864704 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:40.864583 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 11:42:40.864816 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:42:40.864736 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e81b43aa-165a-4aaf-bced-4a43a6902437-cert podName:e81b43aa-165a-4aaf-bced-4a43a6902437 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:44.864713835 +0000 UTC m=+40.681975941 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e81b43aa-165a-4aaf-bced-4a43a6902437-cert") pod "ingress-canary-8wx4k" (UID: "e81b43aa-165a-4aaf-bced-4a43a6902437") : secret "canary-serving-cert" not found
Apr 20 11:42:41.848832 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:41.848800 2564 generic.go:358] "Generic (PLEG): container finished" podID="17e25242-c8c6-4d80-a45e-22b47b30d8f0" containerID="f1b255c7066ef38e03a71dd7b88dfa30597d12aeaddc75be8858a0a94e091f4f" exitCode=0
Apr 20 11:42:41.849275 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:41.848876 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-494p8" event={"ID":"17e25242-c8c6-4d80-a45e-22b47b30d8f0","Type":"ContainerDied","Data":"f1b255c7066ef38e03a71dd7b88dfa30597d12aeaddc75be8858a0a94e091f4f"}
Apr 20 11:42:42.853399 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:42.853153 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f688f4b59-tbncx" event={"ID":"1a2ffa64-a1ed-4ff8-a999-d03733bb9da4","Type":"ContainerStarted","Data":"5c51aa6d4f980da89fb799994f50fa4bdff505a6c0866309f830966855847c0b"}
Apr 20 11:42:42.856689 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:42.856657 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-494p8" event={"ID":"17e25242-c8c6-4d80-a45e-22b47b30d8f0","Type":"ContainerStarted","Data":"b53044b9f0cc3a11ed11ac41f8973f0233b29397423e79c3860329bb4137bb37"}
Apr 20 11:42:42.908364 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:42.908268 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f688f4b59-tbncx" podStartSLOduration=2.9213684559999997 podStartE2EDuration="5.908251719s" podCreationTimestamp="2026-04-20 11:42:37 +0000 UTC" firstStartedPulling="2026-04-20 11:42:39.725311515 +0000 UTC m=+35.542573615" lastFinishedPulling="2026-04-20 11:42:42.712194777 +0000 UTC m=+38.529456878" observedRunningTime="2026-04-20 11:42:42.878756048 +0000 UTC m=+38.696018179" watchObservedRunningTime="2026-04-20 11:42:42.908251719 +0000 UTC m=+38.725513841"
Apr 20 11:42:42.908691 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:42.908669 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-494p8" podStartSLOduration=6.508364771 podStartE2EDuration="38.908664855s" podCreationTimestamp="2026-04-20 11:42:04 +0000 UTC" firstStartedPulling="2026-04-20 11:42:07.394827458 +0000 UTC m=+3.212089558" lastFinishedPulling="2026-04-20 11:42:39.795127539 +0000 UTC m=+35.612389642" observedRunningTime="2026-04-20 11:42:42.906901803 +0000 UTC m=+38.724163937" watchObservedRunningTime="2026-04-20 11:42:42.908664855 +0000 UTC m=+38.725927023"
Apr 20 11:42:44.898396 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:44.898355 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e81b43aa-165a-4aaf-bced-4a43a6902437-cert\") pod \"ingress-canary-8wx4k\" (UID: \"e81b43aa-165a-4aaf-bced-4a43a6902437\") " pod="openshift-ingress-canary/ingress-canary-8wx4k"
Apr 20 11:42:44.898918 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:44.898444 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8ce36e9b-db64-4394-94fd-daf59b2e178f-metrics-tls\") pod \"dns-default-kj8zj\" (UID: \"8ce36e9b-db64-4394-94fd-daf59b2e178f\") " pod="openshift-dns/dns-default-kj8zj"
Apr 20 11:42:44.902366 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:44.902335 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e81b43aa-165a-4aaf-bced-4a43a6902437-cert\") pod \"ingress-canary-8wx4k\" (UID: \"e81b43aa-165a-4aaf-bced-4a43a6902437\") " pod="openshift-ingress-canary/ingress-canary-8wx4k"
Apr 20 11:42:44.902772 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:44.902749 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8ce36e9b-db64-4394-94fd-daf59b2e178f-metrics-tls\") pod \"dns-default-kj8zj\" (UID: \"8ce36e9b-db64-4394-94fd-daf59b2e178f\") " pod="openshift-dns/dns-default-kj8zj"
Apr 20 11:42:44.927318 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:44.927285 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-kj8zj"
Apr 20 11:42:44.941577 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:44.941547 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8wx4k"
Apr 20 11:42:45.077129 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:45.077104 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-kj8zj"]
Apr 20 11:42:45.078261 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:45.078236 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ce36e9b_db64_4394_94fd_daf59b2e178f.slice/crio-f98c0a61b10f961ebbe4d5f73e985bc5241ed9def2ff068ce2e9bfb87187ebed WatchSource:0}: Error finding container f98c0a61b10f961ebbe4d5f73e985bc5241ed9def2ff068ce2e9bfb87187ebed: Status 404 returned error can't find the container with id f98c0a61b10f961ebbe4d5f73e985bc5241ed9def2ff068ce2e9bfb87187ebed
Apr 20 11:42:45.089606 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:45.089571 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8wx4k"]
Apr 20 11:42:45.093344 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:45.093314 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode81b43aa_165a_4aaf_bced_4a43a6902437.slice/crio-80f744704816a73a4b2541db9c80731cb4c09839801f8529fc64cbf8b7edc07a WatchSource:0}: Error finding container 80f744704816a73a4b2541db9c80731cb4c09839801f8529fc64cbf8b7edc07a: Status 404 returned error can't find the container with id 80f744704816a73a4b2541db9c80731cb4c09839801f8529fc64cbf8b7edc07a
Apr 20 11:42:45.864823 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:45.864772 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8wx4k" event={"ID":"e81b43aa-165a-4aaf-bced-4a43a6902437","Type":"ContainerStarted","Data":"80f744704816a73a4b2541db9c80731cb4c09839801f8529fc64cbf8b7edc07a"}
Apr 20 11:42:45.866246 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:45.866207 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kj8zj" event={"ID":"8ce36e9b-db64-4394-94fd-daf59b2e178f","Type":"ContainerStarted","Data":"f98c0a61b10f961ebbe4d5f73e985bc5241ed9def2ff068ce2e9bfb87187ebed"}
Apr 20 11:42:47.872608 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:47.872516 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kj8zj" event={"ID":"8ce36e9b-db64-4394-94fd-daf59b2e178f","Type":"ContainerStarted","Data":"ba3da9b496664d95b694e9f9b7a2671480ac2abe9f4bf3e967680b710dcd3f63"}
Apr 20 11:42:47.872608 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:47.872555 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kj8zj" event={"ID":"8ce36e9b-db64-4394-94fd-daf59b2e178f","Type":"ContainerStarted","Data":"b9f93e7383f0dd45e5a8553da8f93dfab4c3099e84859002c7c45cca393ba798"}
Apr 20 11:42:47.873118 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:47.872611 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-kj8zj"
Apr 20 11:42:47.873899 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:47.873878 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8wx4k" event={"ID":"e81b43aa-165a-4aaf-bced-4a43a6902437","Type":"ContainerStarted","Data":"dec383e2351df6d58414d360a8a823885c04b51fb0823b04514524023deed63b"}
Apr 20 11:42:47.875085 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:47.875068 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bc559486b-bgmwj" event={"ID":"25c9df4d-8145-4f9d-9f3c-484b02ad0811","Type":"ContainerStarted","Data":"469a38dcddc1766f4b9b2f3fedd6e7911d9f81bee1074600914ff159c10d02c0"}
Apr 20 11:42:47.891575 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:47.891525 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-kj8zj" podStartSLOduration=8.732526757 podStartE2EDuration="10.891512631s" podCreationTimestamp="2026-04-20 11:42:37 +0000 UTC" firstStartedPulling="2026-04-20 11:42:45.080322078 +0000 UTC m=+40.897584191" lastFinishedPulling="2026-04-20 11:42:47.239307966 +0000 UTC m=+43.056570065" observedRunningTime="2026-04-20 11:42:47.891017187 +0000 UTC m=+43.708279320" watchObservedRunningTime="2026-04-20 11:42:47.891512631 +0000 UTC m=+43.708774803"
Apr 20 11:42:49.883707 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:49.883605 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bc559486b-bgmwj" event={"ID":"25c9df4d-8145-4f9d-9f3c-484b02ad0811","Type":"ContainerStarted","Data":"dc0fc13e4e8dfb6d655bbabf9aff005eea6562bf578576970f9b666d77cc0089"}
Apr 20 11:42:49.883707 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:49.883647 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bc559486b-bgmwj" event={"ID":"25c9df4d-8145-4f9d-9f3c-484b02ad0811","Type":"ContainerStarted","Data":"f26dc3d085d9a81f7e97b91669e335a2d776dd11cb03291d3827ca1fa77d2c54"}
Apr 20 11:42:49.907878 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:49.907823 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-8wx4k" podStartSLOduration=10.76719093 podStartE2EDuration="12.907808749s" podCreationTimestamp="2026-04-20 11:42:37 +0000 UTC" firstStartedPulling="2026-04-20 11:42:45.095162513 +0000 UTC m=+40.912424614" lastFinishedPulling="2026-04-20 11:42:47.235780319 +0000 UTC m=+43.053042433" observedRunningTime="2026-04-20 11:42:47.908466166 +0000 UTC m=+43.725728303" watchObservedRunningTime="2026-04-20 11:42:49.907808749 +0000 UTC m=+45.725070940"
Apr 20 11:42:49.908051 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:49.907976 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bc559486b-bgmwj" podStartSLOduration=3.131945775 podStartE2EDuration="12.907953604s" podCreationTimestamp="2026-04-20 11:42:37 +0000 UTC" firstStartedPulling="2026-04-20 11:42:39.773353043 +0000 UTC m=+35.590615143" lastFinishedPulling="2026-04-20 11:42:49.549360865 +0000 UTC m=+45.366622972" observedRunningTime="2026-04-20 11:42:49.9074368 +0000 UTC m=+45.724698921" watchObservedRunningTime="2026-04-20 11:42:49.907953604 +0000 UTC m=+45.725215795"
Apr 20 11:42:57.882275 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:57.882243 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-kj8zj"
Apr 20 11:42:58.004791 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.004732 2564 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bc559486b-bgmwj" podUID="25c9df4d-8145-4f9d-9f3c-484b02ad0811" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 20 11:42:58.012473 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.012442 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7fb579659c-9k8zq"]
Apr 20 11:42:58.067667 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.067622 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7fb579659c-9k8zq"]
Apr 20 11:42:58.067853 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.067771 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7fb579659c-9k8zq"
Apr 20 11:42:58.071588 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.071564 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 20 11:42:58.072484 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.072467 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 20 11:42:58.072641 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.072627 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 20 11:42:58.073977 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.073936 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-jqbrx\""
Apr 20 11:42:58.078902 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.078873 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 20 11:42:58.192283 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.192187 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/efdd0cc7-cf9f-46a8-b101-da570651a2da-registry-certificates\") pod \"image-registry-7fb579659c-9k8zq\" (UID: \"efdd0cc7-cf9f-46a8-b101-da570651a2da\") " pod="openshift-image-registry/image-registry-7fb579659c-9k8zq"
Apr 20 11:42:58.192283 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.192232 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/efdd0cc7-cf9f-46a8-b101-da570651a2da-trusted-ca\") pod \"image-registry-7fb579659c-9k8zq\" (UID: \"efdd0cc7-cf9f-46a8-b101-da570651a2da\") " pod="openshift-image-registry/image-registry-7fb579659c-9k8zq"
Apr 20 11:42:58.192283 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.192258 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/efdd0cc7-cf9f-46a8-b101-da570651a2da-bound-sa-token\") pod \"image-registry-7fb579659c-9k8zq\" (UID: \"efdd0cc7-cf9f-46a8-b101-da570651a2da\") " pod="openshift-image-registry/image-registry-7fb579659c-9k8zq"
Apr 20 11:42:58.192283 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.192275 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f55dr\" (UniqueName: \"kubernetes.io/projected/efdd0cc7-cf9f-46a8-b101-da570651a2da-kube-api-access-f55dr\") pod \"image-registry-7fb579659c-9k8zq\" (UID: \"efdd0cc7-cf9f-46a8-b101-da570651a2da\") " pod="openshift-image-registry/image-registry-7fb579659c-9k8zq"
Apr 20 11:42:58.192543 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.192296 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/efdd0cc7-cf9f-46a8-b101-da570651a2da-image-registry-private-configuration\") pod \"image-registry-7fb579659c-9k8zq\" (UID: \"efdd0cc7-cf9f-46a8-b101-da570651a2da\") " pod="openshift-image-registry/image-registry-7fb579659c-9k8zq"
Apr 20 11:42:58.192543 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.192317 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/efdd0cc7-cf9f-46a8-b101-da570651a2da-installation-pull-secrets\") pod \"image-registry-7fb579659c-9k8zq\" (UID: \"efdd0cc7-cf9f-46a8-b101-da570651a2da\") " pod="openshift-image-registry/image-registry-7fb579659c-9k8zq"
Apr 20 11:42:58.192543 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.192386 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/efdd0cc7-cf9f-46a8-b101-da570651a2da-registry-tls\") pod \"image-registry-7fb579659c-9k8zq\" (UID: \"efdd0cc7-cf9f-46a8-b101-da570651a2da\") " pod="openshift-image-registry/image-registry-7fb579659c-9k8zq"
Apr 20 11:42:58.192543 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.192421 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/efdd0cc7-cf9f-46a8-b101-da570651a2da-ca-trust-extracted\") pod \"image-registry-7fb579659c-9k8zq\" (UID: \"efdd0cc7-cf9f-46a8-b101-da570651a2da\") " pod="openshift-image-registry/image-registry-7fb579659c-9k8zq"
Apr 20 11:42:58.292848 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.292807 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/efdd0cc7-cf9f-46a8-b101-da570651a2da-ca-trust-extracted\") pod \"image-registry-7fb579659c-9k8zq\" (UID: \"efdd0cc7-cf9f-46a8-b101-da570651a2da\") " pod="openshift-image-registry/image-registry-7fb579659c-9k8zq"
Apr 20 11:42:58.293057 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.292882 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/efdd0cc7-cf9f-46a8-b101-da570651a2da-registry-certificates\") pod \"image-registry-7fb579659c-9k8zq\" (UID: \"efdd0cc7-cf9f-46a8-b101-da570651a2da\") " pod="openshift-image-registry/image-registry-7fb579659c-9k8zq"
Apr 20 11:42:58.293057 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.292899 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/efdd0cc7-cf9f-46a8-b101-da570651a2da-trusted-ca\") pod \"image-registry-7fb579659c-9k8zq\" (UID: \"efdd0cc7-cf9f-46a8-b101-da570651a2da\") " pod="openshift-image-registry/image-registry-7fb579659c-9k8zq"
Apr 20 11:42:58.293057 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.292915 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/efdd0cc7-cf9f-46a8-b101-da570651a2da-bound-sa-token\") pod \"image-registry-7fb579659c-9k8zq\" (UID: \"efdd0cc7-cf9f-46a8-b101-da570651a2da\") " pod="openshift-image-registry/image-registry-7fb579659c-9k8zq"
Apr 20 11:42:58.293057 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.292933 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f55dr\" (UniqueName: \"kubernetes.io/projected/efdd0cc7-cf9f-46a8-b101-da570651a2da-kube-api-access-f55dr\") pod \"image-registry-7fb579659c-9k8zq\" (UID: \"efdd0cc7-cf9f-46a8-b101-da570651a2da\") " pod="openshift-image-registry/image-registry-7fb579659c-9k8zq"
Apr 20 11:42:58.293057 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.292951 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/efdd0cc7-cf9f-46a8-b101-da570651a2da-image-registry-private-configuration\") pod \"image-registry-7fb579659c-9k8zq\" (UID: \"efdd0cc7-cf9f-46a8-b101-da570651a2da\") " pod="openshift-image-registry/image-registry-7fb579659c-9k8zq"
Apr 20 11:42:58.293316 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.293123 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/efdd0cc7-cf9f-46a8-b101-da570651a2da-installation-pull-secrets\") pod \"image-registry-7fb579659c-9k8zq\" (UID: \"efdd0cc7-cf9f-46a8-b101-da570651a2da\") " pod="openshift-image-registry/image-registry-7fb579659c-9k8zq"
Apr 20 11:42:58.293316 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.293211 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/efdd0cc7-cf9f-46a8-b101-da570651a2da-registry-tls\") pod \"image-registry-7fb579659c-9k8zq\" (UID: \"efdd0cc7-cf9f-46a8-b101-da570651a2da\") " pod="openshift-image-registry/image-registry-7fb579659c-9k8zq"
Apr 20 11:42:58.293316 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.293274 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/efdd0cc7-cf9f-46a8-b101-da570651a2da-ca-trust-extracted\") pod \"image-registry-7fb579659c-9k8zq\" (UID: \"efdd0cc7-cf9f-46a8-b101-da570651a2da\") " pod="openshift-image-registry/image-registry-7fb579659c-9k8zq"
Apr 20 11:42:58.294202 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.294161 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/efdd0cc7-cf9f-46a8-b101-da570651a2da-trusted-ca\") pod \"image-registry-7fb579659c-9k8zq\" (UID: \"efdd0cc7-cf9f-46a8-b101-da570651a2da\") " pod="openshift-image-registry/image-registry-7fb579659c-9k8zq"
Apr 20 11:42:58.294325 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.294265 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/efdd0cc7-cf9f-46a8-b101-da570651a2da-registry-certificates\") pod \"image-registry-7fb579659c-9k8zq\" (UID: \"efdd0cc7-cf9f-46a8-b101-da570651a2da\") " pod="openshift-image-registry/image-registry-7fb579659c-9k8zq"
Apr 20 11:42:58.295472 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.295450 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/efdd0cc7-cf9f-46a8-b101-da570651a2da-image-registry-private-configuration\") pod \"image-registry-7fb579659c-9k8zq\" (UID: \"efdd0cc7-cf9f-46a8-b101-da570651a2da\") " pod="openshift-image-registry/image-registry-7fb579659c-9k8zq"
Apr 20 11:42:58.295587 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.295570 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/efdd0cc7-cf9f-46a8-b101-da570651a2da-registry-tls\") pod \"image-registry-7fb579659c-9k8zq\" (UID: \"efdd0cc7-cf9f-46a8-b101-da570651a2da\") " pod="openshift-image-registry/image-registry-7fb579659c-9k8zq"
Apr 20 11:42:58.295768 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.295744 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/efdd0cc7-cf9f-46a8-b101-da570651a2da-installation-pull-secrets\") pod \"image-registry-7fb579659c-9k8zq\" (UID: \"efdd0cc7-cf9f-46a8-b101-da570651a2da\") " pod="openshift-image-registry/image-registry-7fb579659c-9k8zq"
Apr 20 11:42:58.303448 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.303427 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/efdd0cc7-cf9f-46a8-b101-da570651a2da-bound-sa-token\") pod \"image-registry-7fb579659c-9k8zq\" (UID: \"efdd0cc7-cf9f-46a8-b101-da570651a2da\") " pod="openshift-image-registry/image-registry-7fb579659c-9k8zq"
Apr 20 11:42:58.304131 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.304107 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f55dr\" (UniqueName: \"kubernetes.io/projected/efdd0cc7-cf9f-46a8-b101-da570651a2da-kube-api-access-f55dr\") pod \"image-registry-7fb579659c-9k8zq\" (UID: \"efdd0cc7-cf9f-46a8-b101-da570651a2da\") " pod="openshift-image-registry/image-registry-7fb579659c-9k8zq"
Apr 20 11:42:58.378688 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.378645 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7fb579659c-9k8zq"
Apr 20 11:42:58.500901 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.500869 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7fb579659c-9k8zq"]
Apr 20 11:42:58.504116 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:42:58.504082 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefdd0cc7_cf9f_46a8_b101_da570651a2da.slice/crio-11258c5c549094d57d88bf05f24732f15344b98d3db005b48e9b7bbd957de05e WatchSource:0}: Error finding container 11258c5c549094d57d88bf05f24732f15344b98d3db005b48e9b7bbd957de05e: Status 404 returned error can't find the container with id 11258c5c549094d57d88bf05f24732f15344b98d3db005b48e9b7bbd957de05e
Apr 20 11:42:58.907262 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.907227 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7fb579659c-9k8zq" event={"ID":"efdd0cc7-cf9f-46a8-b101-da570651a2da","Type":"ContainerStarted","Data":"2c494ff4dbf63009e96691056b15296f756d109151ccf993018a42894dc75be7"}
Apr 20 11:42:58.907262 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.907265 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7fb579659c-9k8zq" event={"ID":"efdd0cc7-cf9f-46a8-b101-da570651a2da","Type":"ContainerStarted","Data":"11258c5c549094d57d88bf05f24732f15344b98d3db005b48e9b7bbd957de05e"}
Apr 20 11:42:58.907716 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.907372 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7fb579659c-9k8zq"
Apr 20 11:42:58.930894 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:42:58.930838 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7fb579659c-9k8zq" podStartSLOduration=1.9308230690000001 podStartE2EDuration="1.930823069s" podCreationTimestamp="2026-04-20 11:42:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 11:42:58.930049709 +0000 UTC m=+54.747311846" watchObservedRunningTime="2026-04-20 11:42:58.930823069 +0000 UTC m=+54.748085190"
Apr 20 11:43:02.835407 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:02.835380 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-h966s"
Apr 20 11:43:07.863814 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:07.863766 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-kj8zj_8ce36e9b-db64-4394-94fd-daf59b2e178f/dns/0.log"
Apr 20 11:43:08.002404 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:08.002360 2564 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bc559486b-bgmwj" podUID="25c9df4d-8145-4f9d-9f3c-484b02ad0811" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 20 11:43:08.045251 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:08.045218 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-kj8zj_8ce36e9b-db64-4394-94fd-daf59b2e178f/kube-rbac-proxy/0.log"
Apr 20 11:43:08.644871 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:08.644841 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-229kk_216d056e-aca9-4614-b69d-9c50c5631c89/dns-node-resolver/0.log"
Apr 20 11:43:09.445518 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:09.445490 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-7fb579659c-9k8zq_efdd0cc7-cf9f-46a8-b101-da570651a2da/registry/0.log"
Apr 20 11:43:10.046411 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:10.046386 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-rr7qt_62b2a4c2-1ab3-4cf8-8d8d-10675c84e9d4/node-ca/0.log"
Apr 20 11:43:10.375407 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:10.375327 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e519aca4-e5e7-4760-91b9-8488165051df-metrics-certs\") pod \"network-metrics-daemon-2x94v\" (UID: \"e519aca4-e5e7-4760-91b9-8488165051df\") " pod="openshift-multus/network-metrics-daemon-2x94v"
Apr 20 11:43:10.378920 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:10.378900 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 20 11:43:10.388542 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:10.388510 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e519aca4-e5e7-4760-91b9-8488165051df-metrics-certs\") pod \"network-metrics-daemon-2x94v\" (UID: \"e519aca4-e5e7-4760-91b9-8488165051df\") " pod="openshift-multus/network-metrics-daemon-2x94v"
Apr 20 11:43:10.475748 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:10.475710 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c72s5\" (UniqueName: \"kubernetes.io/projected/5ba5cc41-47d4-412c-9ba2-40b85e919af7-kube-api-access-c72s5\") pod \"network-check-target-dtzrg\" (UID: \"5ba5cc41-47d4-412c-9ba2-40b85e919af7\") " pod="openshift-network-diagnostics/network-check-target-dtzrg"
Apr 20 11:43:10.478846 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:10.478826 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 11:43:10.489462 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:10.489443 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 11:43:10.500062 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:10.500036 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c72s5\" (UniqueName: \"kubernetes.io/projected/5ba5cc41-47d4-412c-9ba2-40b85e919af7-kube-api-access-c72s5\") pod \"network-check-target-dtzrg\" (UID: \"5ba5cc41-47d4-412c-9ba2-40b85e919af7\") " pod="openshift-network-diagnostics/network-check-target-dtzrg"
Apr 20 11:43:10.534021 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:10.533997 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lppl4\""
Apr 20 11:43:10.538822 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:10.538806 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-dmnrq\""
Apr 20 11:43:10.540993 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:10.540978 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2x94v"
Apr 20 11:43:10.546665 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:10.546647 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dtzrg"
Apr 20 11:43:10.645494 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:10.645232 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-8wx4k_e81b43aa-165a-4aaf-bced-4a43a6902437/serve-healthcheck-canary/0.log"
Apr 20 11:43:10.666988 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:10.666946 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2x94v"]
Apr 20 11:43:10.670038 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:43:10.670013 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode519aca4_e5e7_4760_91b9_8488165051df.slice/crio-79e77f97f6be57640db2f5f01a92d6171a9cbc70d7cc47cbcab08bd7d06bb78d WatchSource:0}: Error finding container 79e77f97f6be57640db2f5f01a92d6171a9cbc70d7cc47cbcab08bd7d06bb78d: Status 404 returned error can't find the container with id 79e77f97f6be57640db2f5f01a92d6171a9cbc70d7cc47cbcab08bd7d06bb78d
Apr 20 11:43:10.681831 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:10.681807 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-dtzrg"]
Apr 20 11:43:10.684318 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:43:10.684295 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ba5cc41_47d4_412c_9ba2_40b85e919af7.slice/crio-85516ef205c7bd11d692b081c2751c1771c0fc15b7ea0837cb4110400464dafd WatchSource:0}: Error finding container 85516ef205c7bd11d692b081c2751c1771c0fc15b7ea0837cb4110400464dafd: Status 404 returned error can't find the container with id 85516ef205c7bd11d692b081c2751c1771c0fc15b7ea0837cb4110400464dafd
Apr 20 11:43:10.936838 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:10.936750 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-dtzrg" event={"ID":"5ba5cc41-47d4-412c-9ba2-40b85e919af7","Type":"ContainerStarted","Data":"85516ef205c7bd11d692b081c2751c1771c0fc15b7ea0837cb4110400464dafd"}
Apr 20 11:43:10.937719 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:10.937697 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2x94v" event={"ID":"e519aca4-e5e7-4760-91b9-8488165051df","Type":"ContainerStarted","Data":"79e77f97f6be57640db2f5f01a92d6171a9cbc70d7cc47cbcab08bd7d06bb78d"}
Apr 20 11:43:11.941809 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:11.941780 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2x94v" event={"ID":"e519aca4-e5e7-4760-91b9-8488165051df","Type":"ContainerStarted","Data":"82d9525f05d9605e02379fc3575ca44bb13c153a49d69733ff06327b01424f8d"}
Apr 20 11:43:11.942219 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:11.941819 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2x94v" event={"ID":"e519aca4-e5e7-4760-91b9-8488165051df","Type":"ContainerStarted","Data":"903be98c19e9c6ba0e4fea79d69680d6be5f1b624c82161b3b4ebd01102fcc7d"}
Apr 20 11:43:11.962125 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:11.961939 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-2x94v" podStartSLOduration=66.98104121 podStartE2EDuration="1m7.961918767s" podCreationTimestamp="2026-04-20 11:42:04 +0000 UTC" firstStartedPulling="2026-04-20 11:43:10.672059497 +0000 UTC m=+66.489321601" lastFinishedPulling="2026-04-20 11:43:11.652937052 +0000 UTC m=+67.470199158" observedRunningTime="2026-04-20 11:43:11.961265886 +0000 UTC m=+67.778528009" watchObservedRunningTime="2026-04-20 11:43:11.961918767 +0000 UTC m=+67.779180889"
Apr 20 11:43:13.075980 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:13.075924 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-t2dhh"]
Apr 20 11:43:13.080743 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:13.080718 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-t2dhh"
Apr 20 11:43:13.084362 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:13.084192 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 20 11:43:13.084362 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:13.084286 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 20 11:43:13.085523 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:13.085500 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 20 11:43:13.085523 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:13.085516 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 20 11:43:13.085706 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:13.085509 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-rffgp\""
Apr 20 11:43:13.091675 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:13.091648 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-t2dhh"]
Apr 20 11:43:13.093697 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:13.093666 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/35d551f1-9c87-447e-a857-7977aa061f91-data-volume\") pod \"insights-runtime-extractor-t2dhh\" (UID: \"35d551f1-9c87-447e-a857-7977aa061f91\") " pod="openshift-insights/insights-runtime-extractor-t2dhh"
Apr 20 11:43:13.093804 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:13.093714 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/35d551f1-9c87-447e-a857-7977aa061f91-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-t2dhh\" (UID: \"35d551f1-9c87-447e-a857-7977aa061f91\") " pod="openshift-insights/insights-runtime-extractor-t2dhh" Apr 20 11:43:13.093804 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:13.093744 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/35d551f1-9c87-447e-a857-7977aa061f91-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-t2dhh\" (UID: \"35d551f1-9c87-447e-a857-7977aa061f91\") " pod="openshift-insights/insights-runtime-extractor-t2dhh" Apr 20 11:43:13.093926 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:13.093799 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnwq5\" (UniqueName: \"kubernetes.io/projected/35d551f1-9c87-447e-a857-7977aa061f91-kube-api-access-qnwq5\") pod \"insights-runtime-extractor-t2dhh\" (UID: \"35d551f1-9c87-447e-a857-7977aa061f91\") " pod="openshift-insights/insights-runtime-extractor-t2dhh" Apr 20 11:43:13.093926 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:13.093880 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/35d551f1-9c87-447e-a857-7977aa061f91-crio-socket\") pod \"insights-runtime-extractor-t2dhh\" (UID: \"35d551f1-9c87-447e-a857-7977aa061f91\") " pod="openshift-insights/insights-runtime-extractor-t2dhh" Apr 20 11:43:13.194518 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:13.194484 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" 
(UniqueName: \"kubernetes.io/host-path/35d551f1-9c87-447e-a857-7977aa061f91-crio-socket\") pod \"insights-runtime-extractor-t2dhh\" (UID: \"35d551f1-9c87-447e-a857-7977aa061f91\") " pod="openshift-insights/insights-runtime-extractor-t2dhh" Apr 20 11:43:13.194687 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:13.194530 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/35d551f1-9c87-447e-a857-7977aa061f91-data-volume\") pod \"insights-runtime-extractor-t2dhh\" (UID: \"35d551f1-9c87-447e-a857-7977aa061f91\") " pod="openshift-insights/insights-runtime-extractor-t2dhh" Apr 20 11:43:13.194687 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:13.194548 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/35d551f1-9c87-447e-a857-7977aa061f91-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-t2dhh\" (UID: \"35d551f1-9c87-447e-a857-7977aa061f91\") " pod="openshift-insights/insights-runtime-extractor-t2dhh" Apr 20 11:43:13.194687 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:13.194564 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/35d551f1-9c87-447e-a857-7977aa061f91-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-t2dhh\" (UID: \"35d551f1-9c87-447e-a857-7977aa061f91\") " pod="openshift-insights/insights-runtime-extractor-t2dhh" Apr 20 11:43:13.194687 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:13.194583 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qnwq5\" (UniqueName: \"kubernetes.io/projected/35d551f1-9c87-447e-a857-7977aa061f91-kube-api-access-qnwq5\") pod \"insights-runtime-extractor-t2dhh\" (UID: \"35d551f1-9c87-447e-a857-7977aa061f91\") " pod="openshift-insights/insights-runtime-extractor-t2dhh" Apr 20 
11:43:13.194687 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:13.194617 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/35d551f1-9c87-447e-a857-7977aa061f91-crio-socket\") pod \"insights-runtime-extractor-t2dhh\" (UID: \"35d551f1-9c87-447e-a857-7977aa061f91\") " pod="openshift-insights/insights-runtime-extractor-t2dhh" Apr 20 11:43:13.194950 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:13.194908 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/35d551f1-9c87-447e-a857-7977aa061f91-data-volume\") pod \"insights-runtime-extractor-t2dhh\" (UID: \"35d551f1-9c87-447e-a857-7977aa061f91\") " pod="openshift-insights/insights-runtime-extractor-t2dhh" Apr 20 11:43:13.195249 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:13.195218 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/35d551f1-9c87-447e-a857-7977aa061f91-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-t2dhh\" (UID: \"35d551f1-9c87-447e-a857-7977aa061f91\") " pod="openshift-insights/insights-runtime-extractor-t2dhh" Apr 20 11:43:13.197295 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:13.197260 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/35d551f1-9c87-447e-a857-7977aa061f91-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-t2dhh\" (UID: \"35d551f1-9c87-447e-a857-7977aa061f91\") " pod="openshift-insights/insights-runtime-extractor-t2dhh" Apr 20 11:43:13.205980 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:13.205926 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnwq5\" (UniqueName: \"kubernetes.io/projected/35d551f1-9c87-447e-a857-7977aa061f91-kube-api-access-qnwq5\") pod 
\"insights-runtime-extractor-t2dhh\" (UID: \"35d551f1-9c87-447e-a857-7977aa061f91\") " pod="openshift-insights/insights-runtime-extractor-t2dhh" Apr 20 11:43:13.393413 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:13.393252 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-t2dhh" Apr 20 11:43:13.534658 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:13.534619 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-t2dhh"] Apr 20 11:43:13.539158 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:43:13.539127 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35d551f1_9c87_447e_a857_7977aa061f91.slice/crio-36f0c78e85f201afba4dc88bf56c6f160c1146e65570807afd9283dc1a993b44 WatchSource:0}: Error finding container 36f0c78e85f201afba4dc88bf56c6f160c1146e65570807afd9283dc1a993b44: Status 404 returned error can't find the container with id 36f0c78e85f201afba4dc88bf56c6f160c1146e65570807afd9283dc1a993b44 Apr 20 11:43:13.949310 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:13.949270 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-dtzrg" event={"ID":"5ba5cc41-47d4-412c-9ba2-40b85e919af7","Type":"ContainerStarted","Data":"f98472f042406bd83e180a80c3451965205116f6543a906d66a08b9d17735531"} Apr 20 11:43:13.949513 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:13.949342 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-dtzrg" Apr 20 11:43:13.950562 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:13.950541 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-t2dhh" 
event={"ID":"35d551f1-9c87-447e-a857-7977aa061f91","Type":"ContainerStarted","Data":"918fb604bbde937d629a7c7d044385a27fbb89790c1de601f8b96f4bde583984"} Apr 20 11:43:13.950669 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:13.950566 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-t2dhh" event={"ID":"35d551f1-9c87-447e-a857-7977aa061f91","Type":"ContainerStarted","Data":"36f0c78e85f201afba4dc88bf56c6f160c1146e65570807afd9283dc1a993b44"} Apr 20 11:43:13.970511 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:13.970467 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-dtzrg" podStartSLOduration=67.190250712 podStartE2EDuration="1m9.970453476s" podCreationTimestamp="2026-04-20 11:42:04 +0000 UTC" firstStartedPulling="2026-04-20 11:43:10.686594451 +0000 UTC m=+66.503856551" lastFinishedPulling="2026-04-20 11:43:13.46679721 +0000 UTC m=+69.284059315" observedRunningTime="2026-04-20 11:43:13.969489778 +0000 UTC m=+69.786751903" watchObservedRunningTime="2026-04-20 11:43:13.970453476 +0000 UTC m=+69.787715598" Apr 20 11:43:14.555974 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.555931 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-c4lt9"] Apr 20 11:43:14.559955 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.559931 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-mljck"] Apr 20 11:43:14.560099 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.560081 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-c4lt9" Apr 20 11:43:14.562739 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.562716 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 20 11:43:14.562863 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.562747 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 11:43:14.563029 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.563014 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 20 11:43:14.563270 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.563244 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-mljck" Apr 20 11:43:14.564350 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.564329 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 11:43:14.564350 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.564342 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 20 11:43:14.564472 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.564373 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-v4znf\"" Apr 20 11:43:14.564472 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.564391 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 11:43:14.565481 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.565462 2564 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 11:43:14.565570 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.565549 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-5j4wd\"" Apr 20 11:43:14.565790 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.565775 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 11:43:14.565992 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.565952 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 11:43:14.567287 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.567265 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-c4lt9"] Apr 20 11:43:14.602911 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.602882 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/60d22540-31bd-4c88-9d46-0cd1220c36e1-node-exporter-wtmp\") pod \"node-exporter-mljck\" (UID: \"60d22540-31bd-4c88-9d46-0cd1220c36e1\") " pod="openshift-monitoring/node-exporter-mljck" Apr 20 11:43:14.602911 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.602913 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/8b076a1e-03de-41b4-91d8-1a988ae01c04-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-c4lt9\" (UID: \"8b076a1e-03de-41b4-91d8-1a988ae01c04\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-c4lt9" Apr 20 11:43:14.603097 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.602930 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/60d22540-31bd-4c88-9d46-0cd1220c36e1-node-exporter-textfile\") pod \"node-exporter-mljck\" (UID: \"60d22540-31bd-4c88-9d46-0cd1220c36e1\") " pod="openshift-monitoring/node-exporter-mljck" Apr 20 11:43:14.603097 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.602948 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/8b076a1e-03de-41b4-91d8-1a988ae01c04-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-c4lt9\" (UID: \"8b076a1e-03de-41b4-91d8-1a988ae01c04\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-c4lt9" Apr 20 11:43:14.603097 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.603008 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/60d22540-31bd-4c88-9d46-0cd1220c36e1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-mljck\" (UID: \"60d22540-31bd-4c88-9d46-0cd1220c36e1\") " pod="openshift-monitoring/node-exporter-mljck" Apr 20 11:43:14.603097 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.603053 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/60d22540-31bd-4c88-9d46-0cd1220c36e1-node-exporter-tls\") pod \"node-exporter-mljck\" (UID: \"60d22540-31bd-4c88-9d46-0cd1220c36e1\") " pod="openshift-monitoring/node-exporter-mljck" Apr 20 11:43:14.603097 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.603073 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/60d22540-31bd-4c88-9d46-0cd1220c36e1-metrics-client-ca\") pod \"node-exporter-mljck\" (UID: \"60d22540-31bd-4c88-9d46-0cd1220c36e1\") " pod="openshift-monitoring/node-exporter-mljck" Apr 20 11:43:14.603097 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.603090 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/60d22540-31bd-4c88-9d46-0cd1220c36e1-node-exporter-accelerators-collector-config\") pod \"node-exporter-mljck\" (UID: \"60d22540-31bd-4c88-9d46-0cd1220c36e1\") " pod="openshift-monitoring/node-exporter-mljck" Apr 20 11:43:14.603282 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.603116 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhktm\" (UniqueName: \"kubernetes.io/projected/8b076a1e-03de-41b4-91d8-1a988ae01c04-kube-api-access-qhktm\") pod \"kube-state-metrics-69db897b98-c4lt9\" (UID: \"8b076a1e-03de-41b4-91d8-1a988ae01c04\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-c4lt9" Apr 20 11:43:14.603282 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.603147 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8b076a1e-03de-41b4-91d8-1a988ae01c04-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-c4lt9\" (UID: \"8b076a1e-03de-41b4-91d8-1a988ae01c04\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-c4lt9" Apr 20 11:43:14.603282 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.603163 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/60d22540-31bd-4c88-9d46-0cd1220c36e1-root\") pod \"node-exporter-mljck\" (UID: \"60d22540-31bd-4c88-9d46-0cd1220c36e1\") " 
pod="openshift-monitoring/node-exporter-mljck" Apr 20 11:43:14.603282 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.603190 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8b076a1e-03de-41b4-91d8-1a988ae01c04-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-c4lt9\" (UID: \"8b076a1e-03de-41b4-91d8-1a988ae01c04\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-c4lt9" Apr 20 11:43:14.603282 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.603208 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqdv8\" (UniqueName: \"kubernetes.io/projected/60d22540-31bd-4c88-9d46-0cd1220c36e1-kube-api-access-jqdv8\") pod \"node-exporter-mljck\" (UID: \"60d22540-31bd-4c88-9d46-0cd1220c36e1\") " pod="openshift-monitoring/node-exporter-mljck" Apr 20 11:43:14.603282 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.603231 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8b076a1e-03de-41b4-91d8-1a988ae01c04-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-c4lt9\" (UID: \"8b076a1e-03de-41b4-91d8-1a988ae01c04\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-c4lt9" Apr 20 11:43:14.603282 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.603245 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/60d22540-31bd-4c88-9d46-0cd1220c36e1-sys\") pod \"node-exporter-mljck\" (UID: \"60d22540-31bd-4c88-9d46-0cd1220c36e1\") " pod="openshift-monitoring/node-exporter-mljck" Apr 20 11:43:14.703762 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.703734 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8b076a1e-03de-41b4-91d8-1a988ae01c04-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-c4lt9\" (UID: \"8b076a1e-03de-41b4-91d8-1a988ae01c04\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-c4lt9" Apr 20 11:43:14.703762 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.703764 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/60d22540-31bd-4c88-9d46-0cd1220c36e1-sys\") pod \"node-exporter-mljck\" (UID: \"60d22540-31bd-4c88-9d46-0cd1220c36e1\") " pod="openshift-monitoring/node-exporter-mljck" Apr 20 11:43:14.704032 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.703784 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/60d22540-31bd-4c88-9d46-0cd1220c36e1-node-exporter-wtmp\") pod \"node-exporter-mljck\" (UID: \"60d22540-31bd-4c88-9d46-0cd1220c36e1\") " pod="openshift-monitoring/node-exporter-mljck" Apr 20 11:43:14.704032 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.703805 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/8b076a1e-03de-41b4-91d8-1a988ae01c04-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-c4lt9\" (UID: \"8b076a1e-03de-41b4-91d8-1a988ae01c04\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-c4lt9" Apr 20 11:43:14.704032 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.703820 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/60d22540-31bd-4c88-9d46-0cd1220c36e1-node-exporter-textfile\") pod \"node-exporter-mljck\" (UID: \"60d22540-31bd-4c88-9d46-0cd1220c36e1\") " pod="openshift-monitoring/node-exporter-mljck" Apr 20 11:43:14.704032 
ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.703838 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/8b076a1e-03de-41b4-91d8-1a988ae01c04-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-c4lt9\" (UID: \"8b076a1e-03de-41b4-91d8-1a988ae01c04\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-c4lt9" Apr 20 11:43:14.704032 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.703856 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/60d22540-31bd-4c88-9d46-0cd1220c36e1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-mljck\" (UID: \"60d22540-31bd-4c88-9d46-0cd1220c36e1\") " pod="openshift-monitoring/node-exporter-mljck" Apr 20 11:43:14.704032 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.703861 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/60d22540-31bd-4c88-9d46-0cd1220c36e1-sys\") pod \"node-exporter-mljck\" (UID: \"60d22540-31bd-4c88-9d46-0cd1220c36e1\") " pod="openshift-monitoring/node-exporter-mljck" Apr 20 11:43:14.704032 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.703882 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/60d22540-31bd-4c88-9d46-0cd1220c36e1-node-exporter-tls\") pod \"node-exporter-mljck\" (UID: \"60d22540-31bd-4c88-9d46-0cd1220c36e1\") " pod="openshift-monitoring/node-exporter-mljck" Apr 20 11:43:14.704032 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.703921 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/60d22540-31bd-4c88-9d46-0cd1220c36e1-metrics-client-ca\") pod 
\"node-exporter-mljck\" (UID: \"60d22540-31bd-4c88-9d46-0cd1220c36e1\") " pod="openshift-monitoring/node-exporter-mljck" Apr 20 11:43:14.704032 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.703946 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/60d22540-31bd-4c88-9d46-0cd1220c36e1-node-exporter-accelerators-collector-config\") pod \"node-exporter-mljck\" (UID: \"60d22540-31bd-4c88-9d46-0cd1220c36e1\") " pod="openshift-monitoring/node-exporter-mljck" Apr 20 11:43:14.704032 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:43:14.703951 2564 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 20 11:43:14.704032 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.703946 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/60d22540-31bd-4c88-9d46-0cd1220c36e1-node-exporter-wtmp\") pod \"node-exporter-mljck\" (UID: \"60d22540-31bd-4c88-9d46-0cd1220c36e1\") " pod="openshift-monitoring/node-exporter-mljck" Apr 20 11:43:14.704032 ip-10-0-135-68 kubenswrapper[2564]: E0420 11:43:14.704022 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60d22540-31bd-4c88-9d46-0cd1220c36e1-node-exporter-tls podName:60d22540-31bd-4c88-9d46-0cd1220c36e1 nodeName:}" failed. No retries permitted until 2026-04-20 11:43:15.204005817 +0000 UTC m=+71.021267916 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/60d22540-31bd-4c88-9d46-0cd1220c36e1-node-exporter-tls") pod "node-exporter-mljck" (UID: "60d22540-31bd-4c88-9d46-0cd1220c36e1") : secret "node-exporter-tls" not found Apr 20 11:43:14.704554 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.704041 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhktm\" (UniqueName: \"kubernetes.io/projected/8b076a1e-03de-41b4-91d8-1a988ae01c04-kube-api-access-qhktm\") pod \"kube-state-metrics-69db897b98-c4lt9\" (UID: \"8b076a1e-03de-41b4-91d8-1a988ae01c04\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-c4lt9" Apr 20 11:43:14.704554 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.704076 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8b076a1e-03de-41b4-91d8-1a988ae01c04-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-c4lt9\" (UID: \"8b076a1e-03de-41b4-91d8-1a988ae01c04\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-c4lt9" Apr 20 11:43:14.704554 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.704106 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/60d22540-31bd-4c88-9d46-0cd1220c36e1-root\") pod \"node-exporter-mljck\" (UID: \"60d22540-31bd-4c88-9d46-0cd1220c36e1\") " pod="openshift-monitoring/node-exporter-mljck" Apr 20 11:43:14.704554 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.704146 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8b076a1e-03de-41b4-91d8-1a988ae01c04-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-c4lt9\" (UID: \"8b076a1e-03de-41b4-91d8-1a988ae01c04\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-c4lt9" Apr 20 11:43:14.704554 
ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.704165 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jqdv8\" (UniqueName: \"kubernetes.io/projected/60d22540-31bd-4c88-9d46-0cd1220c36e1-kube-api-access-jqdv8\") pod \"node-exporter-mljck\" (UID: \"60d22540-31bd-4c88-9d46-0cd1220c36e1\") " pod="openshift-monitoring/node-exporter-mljck" Apr 20 11:43:14.704554 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.704252 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/8b076a1e-03de-41b4-91d8-1a988ae01c04-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-c4lt9\" (UID: \"8b076a1e-03de-41b4-91d8-1a988ae01c04\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-c4lt9" Apr 20 11:43:14.704554 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.704383 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/60d22540-31bd-4c88-9d46-0cd1220c36e1-root\") pod \"node-exporter-mljck\" (UID: \"60d22540-31bd-4c88-9d46-0cd1220c36e1\") " pod="openshift-monitoring/node-exporter-mljck" Apr 20 11:43:14.704875 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.704647 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/8b076a1e-03de-41b4-91d8-1a988ae01c04-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-c4lt9\" (UID: \"8b076a1e-03de-41b4-91d8-1a988ae01c04\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-c4lt9" Apr 20 11:43:14.704875 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.704839 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/60d22540-31bd-4c88-9d46-0cd1220c36e1-node-exporter-textfile\") pod 
\"node-exporter-mljck\" (UID: \"60d22540-31bd-4c88-9d46-0cd1220c36e1\") " pod="openshift-monitoring/node-exporter-mljck" Apr 20 11:43:14.705106 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.705084 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8b076a1e-03de-41b4-91d8-1a988ae01c04-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-c4lt9\" (UID: \"8b076a1e-03de-41b4-91d8-1a988ae01c04\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-c4lt9" Apr 20 11:43:14.705269 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.705213 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/60d22540-31bd-4c88-9d46-0cd1220c36e1-node-exporter-accelerators-collector-config\") pod \"node-exporter-mljck\" (UID: \"60d22540-31bd-4c88-9d46-0cd1220c36e1\") " pod="openshift-monitoring/node-exporter-mljck" Apr 20 11:43:14.705428 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.705393 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/60d22540-31bd-4c88-9d46-0cd1220c36e1-metrics-client-ca\") pod \"node-exporter-mljck\" (UID: \"60d22540-31bd-4c88-9d46-0cd1220c36e1\") " pod="openshift-monitoring/node-exporter-mljck" Apr 20 11:43:14.706647 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.706622 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8b076a1e-03de-41b4-91d8-1a988ae01c04-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-c4lt9\" (UID: \"8b076a1e-03de-41b4-91d8-1a988ae01c04\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-c4lt9" Apr 20 11:43:14.707030 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.707011 2564 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/60d22540-31bd-4c88-9d46-0cd1220c36e1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-mljck\" (UID: \"60d22540-31bd-4c88-9d46-0cd1220c36e1\") " pod="openshift-monitoring/node-exporter-mljck" Apr 20 11:43:14.707377 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.707361 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8b076a1e-03de-41b4-91d8-1a988ae01c04-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-c4lt9\" (UID: \"8b076a1e-03de-41b4-91d8-1a988ae01c04\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-c4lt9" Apr 20 11:43:14.714061 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.714035 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhktm\" (UniqueName: \"kubernetes.io/projected/8b076a1e-03de-41b4-91d8-1a988ae01c04-kube-api-access-qhktm\") pod \"kube-state-metrics-69db897b98-c4lt9\" (UID: \"8b076a1e-03de-41b4-91d8-1a988ae01c04\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-c4lt9" Apr 20 11:43:14.714555 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.714534 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqdv8\" (UniqueName: \"kubernetes.io/projected/60d22540-31bd-4c88-9d46-0cd1220c36e1-kube-api-access-jqdv8\") pod \"node-exporter-mljck\" (UID: \"60d22540-31bd-4c88-9d46-0cd1220c36e1\") " pod="openshift-monitoring/node-exporter-mljck" Apr 20 11:43:14.871438 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.871359 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-c4lt9" Apr 20 11:43:14.957172 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:14.957120 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-t2dhh" event={"ID":"35d551f1-9c87-447e-a857-7977aa061f91","Type":"ContainerStarted","Data":"5da8cf55ad7a0eac52eccaffb087dee759ec124aaf2fc64f1573c3decc2b78e9"} Apr 20 11:43:15.013105 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:15.013071 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-c4lt9"] Apr 20 11:43:15.016235 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:43:15.016203 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b076a1e_03de_41b4_91d8_1a988ae01c04.slice/crio-f2dcd7ecb0396b711ac4e99a87602f1cb36355a6c074aa0bb9116dc0f154d722 WatchSource:0}: Error finding container f2dcd7ecb0396b711ac4e99a87602f1cb36355a6c074aa0bb9116dc0f154d722: Status 404 returned error can't find the container with id f2dcd7ecb0396b711ac4e99a87602f1cb36355a6c074aa0bb9116dc0f154d722 Apr 20 11:43:15.208634 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:15.208555 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/60d22540-31bd-4c88-9d46-0cd1220c36e1-node-exporter-tls\") pod \"node-exporter-mljck\" (UID: \"60d22540-31bd-4c88-9d46-0cd1220c36e1\") " pod="openshift-monitoring/node-exporter-mljck" Apr 20 11:43:15.211225 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:15.211198 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/60d22540-31bd-4c88-9d46-0cd1220c36e1-node-exporter-tls\") pod \"node-exporter-mljck\" (UID: \"60d22540-31bd-4c88-9d46-0cd1220c36e1\") " pod="openshift-monitoring/node-exporter-mljck" Apr 20 
11:43:15.476760 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:15.476723 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-mljck" Apr 20 11:43:15.687572 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:43:15.687530 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60d22540_31bd_4c88_9d46_0cd1220c36e1.slice/crio-680002330d6b803bbfe91bb1438d8e2edbb8ffcc3fd991713031eca86274a41c WatchSource:0}: Error finding container 680002330d6b803bbfe91bb1438d8e2edbb8ffcc3fd991713031eca86274a41c: Status 404 returned error can't find the container with id 680002330d6b803bbfe91bb1438d8e2edbb8ffcc3fd991713031eca86274a41c Apr 20 11:43:15.962677 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:15.962265 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-t2dhh" event={"ID":"35d551f1-9c87-447e-a857-7977aa061f91","Type":"ContainerStarted","Data":"f5e0f3c42dc41bc58d0539d52adc10def77fd25b293feba0c1dfe4c86ee7718d"} Apr 20 11:43:15.963447 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:15.963406 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mljck" event={"ID":"60d22540-31bd-4c88-9d46-0cd1220c36e1","Type":"ContainerStarted","Data":"680002330d6b803bbfe91bb1438d8e2edbb8ffcc3fd991713031eca86274a41c"} Apr 20 11:43:15.964503 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:15.964476 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-c4lt9" event={"ID":"8b076a1e-03de-41b4-91d8-1a988ae01c04","Type":"ContainerStarted","Data":"f2dcd7ecb0396b711ac4e99a87602f1cb36355a6c074aa0bb9116dc0f154d722"} Apr 20 11:43:15.983603 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:15.983490 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-t2dhh" 
podStartSLOduration=0.83790696 podStartE2EDuration="2.983471197s" podCreationTimestamp="2026-04-20 11:43:13 +0000 UTC" firstStartedPulling="2026-04-20 11:43:13.584618483 +0000 UTC m=+69.401880596" lastFinishedPulling="2026-04-20 11:43:15.73018273 +0000 UTC m=+71.547444833" observedRunningTime="2026-04-20 11:43:15.982498952 +0000 UTC m=+71.799761075" watchObservedRunningTime="2026-04-20 11:43:15.983471197 +0000 UTC m=+71.800733324" Apr 20 11:43:16.968861 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:16.968778 2564 generic.go:358] "Generic (PLEG): container finished" podID="60d22540-31bd-4c88-9d46-0cd1220c36e1" containerID="adfc58bbb0474820ee6166541e1e5ac91becb7b474f9c359c28b528e87b8aae9" exitCode=0 Apr 20 11:43:16.969303 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:16.968861 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mljck" event={"ID":"60d22540-31bd-4c88-9d46-0cd1220c36e1","Type":"ContainerDied","Data":"adfc58bbb0474820ee6166541e1e5ac91becb7b474f9c359c28b528e87b8aae9"} Apr 20 11:43:16.971283 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:16.971216 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-c4lt9" event={"ID":"8b076a1e-03de-41b4-91d8-1a988ae01c04","Type":"ContainerStarted","Data":"879c2540585aaf94aa6bd3438ca6dc9608465bb8ee5db2838cd48b6deab796c0"} Apr 20 11:43:16.971283 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:16.971255 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-c4lt9" event={"ID":"8b076a1e-03de-41b4-91d8-1a988ae01c04","Type":"ContainerStarted","Data":"c40133b2f2ce30da5e66c61aafc68e4947a1a60cb2ef017f6e1299f9594317f1"} Apr 20 11:43:16.971283 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:16.971279 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-c4lt9" 
event={"ID":"8b076a1e-03de-41b4-91d8-1a988ae01c04","Type":"ContainerStarted","Data":"ab4a26e7b28d5ec23bcf7cf5782b0b24d1fdfa5965de241156a6e8c28199ca04"} Apr 20 11:43:17.022581 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:17.022530 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-c4lt9" podStartSLOduration=1.339400702 podStartE2EDuration="3.022514282s" podCreationTimestamp="2026-04-20 11:43:14 +0000 UTC" firstStartedPulling="2026-04-20 11:43:15.018443104 +0000 UTC m=+70.835705205" lastFinishedPulling="2026-04-20 11:43:16.701556651 +0000 UTC m=+72.518818785" observedRunningTime="2026-04-20 11:43:17.020864424 +0000 UTC m=+72.838126547" watchObservedRunningTime="2026-04-20 11:43:17.022514282 +0000 UTC m=+72.839776405" Apr 20 11:43:17.978473 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:17.978429 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mljck" event={"ID":"60d22540-31bd-4c88-9d46-0cd1220c36e1","Type":"ContainerStarted","Data":"f52f922a884413fc891b38ef5250a7412eeac0458842a17bdfe2182303cf141c"} Apr 20 11:43:17.978916 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:17.978481 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mljck" event={"ID":"60d22540-31bd-4c88-9d46-0cd1220c36e1","Type":"ContainerStarted","Data":"a6e42a3b93d31d95e6248f623f4b4274d5922d991d995fd9534ce237788d744b"} Apr 20 11:43:18.002784 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:18.002735 2564 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bc559486b-bgmwj" podUID="25c9df4d-8145-4f9d-9f3c-484b02ad0811" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 11:43:18.002938 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:18.002812 2564 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bc559486b-bgmwj" Apr 20 11:43:18.003394 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:18.003365 2564 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"dc0fc13e4e8dfb6d655bbabf9aff005eea6562bf578576970f9b666d77cc0089"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bc559486b-bgmwj" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 20 11:43:18.003452 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:18.003431 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bc559486b-bgmwj" podUID="25c9df4d-8145-4f9d-9f3c-484b02ad0811" containerName="service-proxy" containerID="cri-o://dc0fc13e4e8dfb6d655bbabf9aff005eea6562bf578576970f9b666d77cc0089" gracePeriod=30 Apr 20 11:43:18.006688 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:18.006644 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-mljck" podStartSLOduration=2.992486054 podStartE2EDuration="4.006629492s" podCreationTimestamp="2026-04-20 11:43:14 +0000 UTC" firstStartedPulling="2026-04-20 11:43:15.689670848 +0000 UTC m=+71.506932951" lastFinishedPulling="2026-04-20 11:43:16.703814278 +0000 UTC m=+72.521076389" observedRunningTime="2026-04-20 11:43:18.004372047 +0000 UTC m=+73.821634169" watchObservedRunningTime="2026-04-20 11:43:18.006629492 +0000 UTC m=+73.823891616" Apr 20 11:43:18.983816 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:18.983765 2564 generic.go:358] "Generic (PLEG): container finished" podID="25c9df4d-8145-4f9d-9f3c-484b02ad0811" containerID="dc0fc13e4e8dfb6d655bbabf9aff005eea6562bf578576970f9b666d77cc0089" exitCode=2 Apr 20 11:43:18.984707 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:18.984674 2564 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bc559486b-bgmwj" event={"ID":"25c9df4d-8145-4f9d-9f3c-484b02ad0811","Type":"ContainerDied","Data":"dc0fc13e4e8dfb6d655bbabf9aff005eea6562bf578576970f9b666d77cc0089"} Apr 20 11:43:18.984783 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:18.984722 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-bc559486b-bgmwj" event={"ID":"25c9df4d-8145-4f9d-9f3c-484b02ad0811","Type":"ContainerStarted","Data":"51f708452c42d935ae74878d53f077de715eb29ca23e794b8cd11c96c154964a"} Apr 20 11:43:19.914280 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:19.914254 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7fb579659c-9k8zq" Apr 20 11:43:44.959518 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:43:44.959487 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-dtzrg" Apr 20 11:45:08.612231 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:45:08.612159 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-vkkgb"] Apr 20 11:45:08.615174 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:45:08.615155 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-vkkgb" Apr 20 11:45:08.617722 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:45:08.617702 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 11:45:08.621629 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:45:08.621481 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-vkkgb"] Apr 20 11:45:08.714061 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:45:08.714035 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2f97751d-5fdc-4cdb-ac82-df1b5951d2fd-kubelet-config\") pod \"global-pull-secret-syncer-vkkgb\" (UID: \"2f97751d-5fdc-4cdb-ac82-df1b5951d2fd\") " pod="kube-system/global-pull-secret-syncer-vkkgb" Apr 20 11:45:08.714237 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:45:08.714083 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2f97751d-5fdc-4cdb-ac82-df1b5951d2fd-dbus\") pod \"global-pull-secret-syncer-vkkgb\" (UID: \"2f97751d-5fdc-4cdb-ac82-df1b5951d2fd\") " pod="kube-system/global-pull-secret-syncer-vkkgb" Apr 20 11:45:08.714237 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:45:08.714155 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2f97751d-5fdc-4cdb-ac82-df1b5951d2fd-original-pull-secret\") pod \"global-pull-secret-syncer-vkkgb\" (UID: \"2f97751d-5fdc-4cdb-ac82-df1b5951d2fd\") " pod="kube-system/global-pull-secret-syncer-vkkgb" Apr 20 11:45:08.814472 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:45:08.814440 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/2f97751d-5fdc-4cdb-ac82-df1b5951d2fd-dbus\") pod \"global-pull-secret-syncer-vkkgb\" (UID: \"2f97751d-5fdc-4cdb-ac82-df1b5951d2fd\") " pod="kube-system/global-pull-secret-syncer-vkkgb" Apr 20 11:45:08.814579 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:45:08.814484 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2f97751d-5fdc-4cdb-ac82-df1b5951d2fd-original-pull-secret\") pod \"global-pull-secret-syncer-vkkgb\" (UID: \"2f97751d-5fdc-4cdb-ac82-df1b5951d2fd\") " pod="kube-system/global-pull-secret-syncer-vkkgb" Apr 20 11:45:08.814579 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:45:08.814509 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2f97751d-5fdc-4cdb-ac82-df1b5951d2fd-kubelet-config\") pod \"global-pull-secret-syncer-vkkgb\" (UID: \"2f97751d-5fdc-4cdb-ac82-df1b5951d2fd\") " pod="kube-system/global-pull-secret-syncer-vkkgb" Apr 20 11:45:08.814579 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:45:08.814574 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2f97751d-5fdc-4cdb-ac82-df1b5951d2fd-kubelet-config\") pod \"global-pull-secret-syncer-vkkgb\" (UID: \"2f97751d-5fdc-4cdb-ac82-df1b5951d2fd\") " pod="kube-system/global-pull-secret-syncer-vkkgb" Apr 20 11:45:08.814712 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:45:08.814657 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2f97751d-5fdc-4cdb-ac82-df1b5951d2fd-dbus\") pod \"global-pull-secret-syncer-vkkgb\" (UID: \"2f97751d-5fdc-4cdb-ac82-df1b5951d2fd\") " pod="kube-system/global-pull-secret-syncer-vkkgb" Apr 20 11:45:08.816777 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:45:08.816752 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2f97751d-5fdc-4cdb-ac82-df1b5951d2fd-original-pull-secret\") pod \"global-pull-secret-syncer-vkkgb\" (UID: \"2f97751d-5fdc-4cdb-ac82-df1b5951d2fd\") " pod="kube-system/global-pull-secret-syncer-vkkgb" Apr 20 11:45:08.924242 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:45:08.924162 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vkkgb" Apr 20 11:45:09.056604 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:45:09.056573 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-vkkgb"] Apr 20 11:45:09.059953 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:45:09.059924 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f97751d_5fdc_4cdb_ac82_df1b5951d2fd.slice/crio-2ed53a9c5860922d4185193ee2da66a7f4f1ccf48965d2bb371efa3ffb896aa6 WatchSource:0}: Error finding container 2ed53a9c5860922d4185193ee2da66a7f4f1ccf48965d2bb371efa3ffb896aa6: Status 404 returned error can't find the container with id 2ed53a9c5860922d4185193ee2da66a7f4f1ccf48965d2bb371efa3ffb896aa6 Apr 20 11:45:09.260904 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:45:09.260867 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-vkkgb" event={"ID":"2f97751d-5fdc-4cdb-ac82-df1b5951d2fd","Type":"ContainerStarted","Data":"2ed53a9c5860922d4185193ee2da66a7f4f1ccf48965d2bb371efa3ffb896aa6"} Apr 20 11:45:13.272774 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:45:13.272731 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-vkkgb" event={"ID":"2f97751d-5fdc-4cdb-ac82-df1b5951d2fd","Type":"ContainerStarted","Data":"915ba5d2027f1f42baba9a13ce45647b4cf703009bdde464f0a89cc68f4d02f9"} Apr 20 11:45:13.292032 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:45:13.291989 2564 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-vkkgb" podStartSLOduration=1.6412056210000001 podStartE2EDuration="5.291950146s" podCreationTimestamp="2026-04-20 11:45:08 +0000 UTC" firstStartedPulling="2026-04-20 11:45:09.061420951 +0000 UTC m=+184.878683050" lastFinishedPulling="2026-04-20 11:45:12.712165472 +0000 UTC m=+188.529427575" observedRunningTime="2026-04-20 11:45:13.290637435 +0000 UTC m=+189.107899558" watchObservedRunningTime="2026-04-20 11:45:13.291950146 +0000 UTC m=+189.109212267" Apr 20 11:46:24.278694 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:24.278617 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-vkkgb_2f97751d-5fdc-4cdb-ac82-df1b5951d2fd/global-pull-secret-syncer/0.log" Apr 20 11:46:24.420502 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:24.420474 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-vs4h8_7c5f2fae-93fd-4644-bbf1-784c37431e5c/konnectivity-agent/0.log" Apr 20 11:46:24.458604 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:24.458551 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-68.ec2.internal_308db96245908b799112e2b3fae81b6e/haproxy/0.log" Apr 20 11:46:27.961424 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:27.961387 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-c4lt9_8b076a1e-03de-41b4-91d8-1a988ae01c04/kube-state-metrics/0.log" Apr 20 11:46:27.983256 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:27.983227 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-c4lt9_8b076a1e-03de-41b4-91d8-1a988ae01c04/kube-rbac-proxy-main/0.log" Apr 20 11:46:28.008893 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:28.008865 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-c4lt9_8b076a1e-03de-41b4-91d8-1a988ae01c04/kube-rbac-proxy-self/0.log" Apr 20 11:46:28.201740 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:28.201704 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-mljck_60d22540-31bd-4c88-9d46-0cd1220c36e1/node-exporter/0.log" Apr 20 11:46:28.220753 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:28.220683 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-mljck_60d22540-31bd-4c88-9d46-0cd1220c36e1/kube-rbac-proxy/0.log" Apr 20 11:46:28.246071 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:28.246043 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-mljck_60d22540-31bd-4c88-9d46-0cd1220c36e1/init-textfile/0.log" Apr 20 11:46:30.597451 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:30.597418 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9p767/perf-node-gather-daemonset-h7qwb"] Apr 20 11:46:30.600590 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:30.600566 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9p767/perf-node-gather-daemonset-h7qwb" Apr 20 11:46:30.603359 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:30.603333 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-9p767\"/\"openshift-service-ca.crt\"" Apr 20 11:46:30.604478 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:30.604454 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-9p767\"/\"default-dockercfg-dcczl\"" Apr 20 11:46:30.604478 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:30.604468 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-9p767\"/\"kube-root-ca.crt\"" Apr 20 11:46:30.610786 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:30.610761 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9p767/perf-node-gather-daemonset-h7qwb"] Apr 20 11:46:30.698453 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:30.698417 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1be992a1-cbe6-4025-940b-5ac13a18f69d-sys\") pod \"perf-node-gather-daemonset-h7qwb\" (UID: \"1be992a1-cbe6-4025-940b-5ac13a18f69d\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-h7qwb" Apr 20 11:46:30.698453 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:30.698455 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1be992a1-cbe6-4025-940b-5ac13a18f69d-lib-modules\") pod \"perf-node-gather-daemonset-h7qwb\" (UID: \"1be992a1-cbe6-4025-940b-5ac13a18f69d\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-h7qwb" Apr 20 11:46:30.698679 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:30.698478 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1be992a1-cbe6-4025-940b-5ac13a18f69d-podres\") pod \"perf-node-gather-daemonset-h7qwb\" (UID: \"1be992a1-cbe6-4025-940b-5ac13a18f69d\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-h7qwb" Apr 20 11:46:30.698679 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:30.698497 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khpfn\" (UniqueName: \"kubernetes.io/projected/1be992a1-cbe6-4025-940b-5ac13a18f69d-kube-api-access-khpfn\") pod \"perf-node-gather-daemonset-h7qwb\" (UID: \"1be992a1-cbe6-4025-940b-5ac13a18f69d\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-h7qwb" Apr 20 11:46:30.698679 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:30.698533 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1be992a1-cbe6-4025-940b-5ac13a18f69d-proc\") pod \"perf-node-gather-daemonset-h7qwb\" (UID: \"1be992a1-cbe6-4025-940b-5ac13a18f69d\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-h7qwb" Apr 20 11:46:30.799109 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:30.799070 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1be992a1-cbe6-4025-940b-5ac13a18f69d-podres\") pod \"perf-node-gather-daemonset-h7qwb\" (UID: \"1be992a1-cbe6-4025-940b-5ac13a18f69d\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-h7qwb" Apr 20 11:46:30.799109 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:30.799112 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khpfn\" (UniqueName: \"kubernetes.io/projected/1be992a1-cbe6-4025-940b-5ac13a18f69d-kube-api-access-khpfn\") pod \"perf-node-gather-daemonset-h7qwb\" (UID: \"1be992a1-cbe6-4025-940b-5ac13a18f69d\") " 
pod="openshift-must-gather-9p767/perf-node-gather-daemonset-h7qwb" Apr 20 11:46:30.799327 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:30.799200 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1be992a1-cbe6-4025-940b-5ac13a18f69d-proc\") pod \"perf-node-gather-daemonset-h7qwb\" (UID: \"1be992a1-cbe6-4025-940b-5ac13a18f69d\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-h7qwb" Apr 20 11:46:30.799327 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:30.799262 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1be992a1-cbe6-4025-940b-5ac13a18f69d-podres\") pod \"perf-node-gather-daemonset-h7qwb\" (UID: \"1be992a1-cbe6-4025-940b-5ac13a18f69d\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-h7qwb" Apr 20 11:46:30.799327 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:30.799293 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1be992a1-cbe6-4025-940b-5ac13a18f69d-sys\") pod \"perf-node-gather-daemonset-h7qwb\" (UID: \"1be992a1-cbe6-4025-940b-5ac13a18f69d\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-h7qwb" Apr 20 11:46:30.799327 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:30.799316 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1be992a1-cbe6-4025-940b-5ac13a18f69d-lib-modules\") pod \"perf-node-gather-daemonset-h7qwb\" (UID: \"1be992a1-cbe6-4025-940b-5ac13a18f69d\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-h7qwb" Apr 20 11:46:30.799457 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:30.799353 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1be992a1-cbe6-4025-940b-5ac13a18f69d-proc\") pod 
\"perf-node-gather-daemonset-h7qwb\" (UID: \"1be992a1-cbe6-4025-940b-5ac13a18f69d\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-h7qwb" Apr 20 11:46:30.799457 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:30.799396 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1be992a1-cbe6-4025-940b-5ac13a18f69d-sys\") pod \"perf-node-gather-daemonset-h7qwb\" (UID: \"1be992a1-cbe6-4025-940b-5ac13a18f69d\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-h7qwb" Apr 20 11:46:30.799457 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:30.799443 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1be992a1-cbe6-4025-940b-5ac13a18f69d-lib-modules\") pod \"perf-node-gather-daemonset-h7qwb\" (UID: \"1be992a1-cbe6-4025-940b-5ac13a18f69d\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-h7qwb" Apr 20 11:46:30.809135 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:30.809097 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-khpfn\" (UniqueName: \"kubernetes.io/projected/1be992a1-cbe6-4025-940b-5ac13a18f69d-kube-api-access-khpfn\") pod \"perf-node-gather-daemonset-h7qwb\" (UID: \"1be992a1-cbe6-4025-940b-5ac13a18f69d\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-h7qwb" Apr 20 11:46:30.911713 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:30.911604 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9p767/perf-node-gather-daemonset-h7qwb" Apr 20 11:46:31.033686 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:31.033653 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9p767/perf-node-gather-daemonset-h7qwb"] Apr 20 11:46:31.036642 ip-10-0-135-68 kubenswrapper[2564]: W0420 11:46:31.036604 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1be992a1_cbe6_4025_940b_5ac13a18f69d.slice/crio-4859948f311090d8118f10d4f36405e2fab24905b5410987851daab7e430a77d WatchSource:0}: Error finding container 4859948f311090d8118f10d4f36405e2fab24905b5410987851daab7e430a77d: Status 404 returned error can't find the container with id 4859948f311090d8118f10d4f36405e2fab24905b5410987851daab7e430a77d Apr 20 11:46:31.464765 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:31.464730 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9p767/perf-node-gather-daemonset-h7qwb" event={"ID":"1be992a1-cbe6-4025-940b-5ac13a18f69d","Type":"ContainerStarted","Data":"18e9bfc0e1f7b636095f7d54b34fdf84fb57682f0d979552a21f854974475038"} Apr 20 11:46:31.464765 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:31.464767 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9p767/perf-node-gather-daemonset-h7qwb" event={"ID":"1be992a1-cbe6-4025-940b-5ac13a18f69d","Type":"ContainerStarted","Data":"4859948f311090d8118f10d4f36405e2fab24905b5410987851daab7e430a77d"} Apr 20 11:46:31.464986 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:31.464875 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-9p767/perf-node-gather-daemonset-h7qwb" Apr 20 11:46:31.483304 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:31.483258 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9p767/perf-node-gather-daemonset-h7qwb" 
podStartSLOduration=1.48321977 podStartE2EDuration="1.48321977s" podCreationTimestamp="2026-04-20 11:46:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 11:46:31.481805442 +0000 UTC m=+267.299067565" watchObservedRunningTime="2026-04-20 11:46:31.48321977 +0000 UTC m=+267.300481892" Apr 20 11:46:31.754993 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:31.754925 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-kj8zj_8ce36e9b-db64-4394-94fd-daf59b2e178f/dns/0.log" Apr 20 11:46:31.779290 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:31.779257 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-kj8zj_8ce36e9b-db64-4394-94fd-daf59b2e178f/kube-rbac-proxy/0.log" Apr 20 11:46:31.866263 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:31.866232 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-229kk_216d056e-aca9-4614-b69d-9c50c5631c89/dns-node-resolver/0.log" Apr 20 11:46:32.319723 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:32.319678 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-7fb579659c-9k8zq_efdd0cc7-cf9f-46a8-b101-da570651a2da/registry/0.log" Apr 20 11:46:32.416217 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:32.416186 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-rr7qt_62b2a4c2-1ab3-4cf8-8d8d-10675c84e9d4/node-ca/0.log" Apr 20 11:46:33.429568 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:33.429529 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-8wx4k_e81b43aa-165a-4aaf-bced-4a43a6902437/serve-healthcheck-canary/0.log" Apr 20 11:46:33.863091 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:33.863063 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-t2dhh_35d551f1-9c87-447e-a857-7977aa061f91/kube-rbac-proxy/0.log" Apr 20 11:46:33.882472 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:33.882429 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-t2dhh_35d551f1-9c87-447e-a857-7977aa061f91/exporter/0.log" Apr 20 11:46:33.901405 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:33.901371 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-t2dhh_35d551f1-9c87-447e-a857-7977aa061f91/extractor/0.log" Apr 20 11:46:37.477068 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:37.477041 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-9p767/perf-node-gather-daemonset-h7qwb" Apr 20 11:46:38.732584 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:38.732555 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-494p8_17e25242-c8c6-4d80-a45e-22b47b30d8f0/kube-multus-additional-cni-plugins/0.log" Apr 20 11:46:38.750099 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:38.750081 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-494p8_17e25242-c8c6-4d80-a45e-22b47b30d8f0/egress-router-binary-copy/0.log" Apr 20 11:46:38.766166 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:38.766146 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-494p8_17e25242-c8c6-4d80-a45e-22b47b30d8f0/cni-plugins/0.log" Apr 20 11:46:38.782108 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:38.782088 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-494p8_17e25242-c8c6-4d80-a45e-22b47b30d8f0/bond-cni-plugin/0.log" Apr 20 11:46:38.800209 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:38.800190 2564 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-494p8_17e25242-c8c6-4d80-a45e-22b47b30d8f0/routeoverride-cni/0.log" Apr 20 11:46:38.818360 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:38.818342 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-494p8_17e25242-c8c6-4d80-a45e-22b47b30d8f0/whereabouts-cni-bincopy/0.log" Apr 20 11:46:38.836396 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:38.836375 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-494p8_17e25242-c8c6-4d80-a45e-22b47b30d8f0/whereabouts-cni/0.log" Apr 20 11:46:39.021349 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:39.021270 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-clqlc_a7b767da-cf50-4bbd-ac5b-c35078f5cc35/kube-multus/0.log" Apr 20 11:46:39.065480 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:39.065444 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2x94v_e519aca4-e5e7-4760-91b9-8488165051df/network-metrics-daemon/0.log" Apr 20 11:46:39.083134 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:39.083099 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2x94v_e519aca4-e5e7-4760-91b9-8488165051df/kube-rbac-proxy/0.log" Apr 20 11:46:40.268580 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:40.268549 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h966s_fb6400e5-c705-43eb-867b-42fa8fc8a750/ovn-controller/0.log" Apr 20 11:46:40.283310 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:40.283279 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h966s_fb6400e5-c705-43eb-867b-42fa8fc8a750/ovn-acl-logging/0.log" Apr 20 11:46:40.285865 ip-10-0-135-68 kubenswrapper[2564]: I0420 
11:46:40.285838 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h966s_fb6400e5-c705-43eb-867b-42fa8fc8a750/ovn-acl-logging/1.log" Apr 20 11:46:40.300566 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:40.300540 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h966s_fb6400e5-c705-43eb-867b-42fa8fc8a750/kube-rbac-proxy-node/0.log" Apr 20 11:46:40.320864 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:40.320836 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h966s_fb6400e5-c705-43eb-867b-42fa8fc8a750/kube-rbac-proxy-ovn-metrics/0.log" Apr 20 11:46:40.337825 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:40.337743 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h966s_fb6400e5-c705-43eb-867b-42fa8fc8a750/northd/0.log" Apr 20 11:46:40.353553 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:40.353530 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h966s_fb6400e5-c705-43eb-867b-42fa8fc8a750/nbdb/0.log" Apr 20 11:46:40.370627 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:40.370602 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h966s_fb6400e5-c705-43eb-867b-42fa8fc8a750/sbdb/0.log" Apr 20 11:46:40.515193 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:40.515155 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h966s_fb6400e5-c705-43eb-867b-42fa8fc8a750/ovnkube-controller/0.log" Apr 20 11:46:41.831821 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:41.831694 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-dtzrg_5ba5cc41-47d4-412c-9ba2-40b85e919af7/network-check-target-container/0.log" Apr 20 11:46:42.613793 ip-10-0-135-68 
kubenswrapper[2564]: I0420 11:46:42.613763 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-kh22r_a66e511e-e6c2-4ef0-b401-588d90851a5e/iptables-alerter/0.log" Apr 20 11:46:43.239313 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:43.239044 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-pjsl6_4ca5c7b9-b35a-4b83-a47b-3463b5058f23/tuned/0.log" Apr 20 11:46:46.208571 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:46.208542 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-flxvk_b0190d50-a6ca-42f6-b113-2acc79ed7c3e/csi-driver/0.log" Apr 20 11:46:46.226500 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:46.226466 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-flxvk_b0190d50-a6ca-42f6-b113-2acc79ed7c3e/csi-node-driver-registrar/0.log" Apr 20 11:46:46.241439 ip-10-0-135-68 kubenswrapper[2564]: I0420 11:46:46.241413 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-flxvk_b0190d50-a6ca-42f6-b113-2acc79ed7c3e/csi-liveness-probe/0.log"