Apr 23 14:54:35.014653 ip-10-0-143-199 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 23 14:54:35.014664 ip-10-0-143-199 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 23 14:54:35.014672 ip-10-0-143-199 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 23 14:54:35.015075 ip-10-0-143-199 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 23 14:54:45.016398 ip-10-0-143-199 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 23 14:54:45.016412 ip-10-0-143-199 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 7d176209c4924afb86c1a2ce5f7de2a3 --
Apr 23 14:56:52.081402 ip-10-0-143-199 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 14:56:52.564592 ip-10-0-143-199 kubenswrapper[2569]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 14:56:52.564592 ip-10-0-143-199 kubenswrapper[2569]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 14:56:52.564592 ip-10-0-143-199 kubenswrapper[2569]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 14:56:52.564592 ip-10-0-143-199 kubenswrapper[2569]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 14:56:52.564592 ip-10-0-143-199 kubenswrapper[2569]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 14:56:52.566259 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.566175 2569 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 14:56:52.570826 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570811 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 14:56:52.570864 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570827 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 14:56:52.570864 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570831 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 14:56:52.570864 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570834 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 14:56:52.570864 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570837 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 14:56:52.570864 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570840 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 14:56:52.570864 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570850 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 14:56:52.570864 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570853 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 14:56:52.570864 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570855 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 14:56:52.570864 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570858 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 14:56:52.570864 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570865 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 14:56:52.570864 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570868 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 14:56:52.571147 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570872 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 14:56:52.571147 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570874 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 14:56:52.571147 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570877 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 14:56:52.571147 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570880 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 14:56:52.571147 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570882 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 14:56:52.571147 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570885 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 14:56:52.571147 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570887 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 14:56:52.571147 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570890 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 14:56:52.571147 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570892 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 14:56:52.571147 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570895 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 14:56:52.571147 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570897 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 14:56:52.571147 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570900 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 14:56:52.571147 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570902 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 14:56:52.571147 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570905 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 14:56:52.571147 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570908 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 14:56:52.571147 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570912 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 14:56:52.571147 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570915 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 14:56:52.571147 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570918 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 14:56:52.571147 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570921 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 14:56:52.571586 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570924 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 14:56:52.571586 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570926 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 14:56:52.571586 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570929 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 14:56:52.571586 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570931 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 14:56:52.571586 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570934 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 14:56:52.571586 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570936 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 14:56:52.571586 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570938 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 14:56:52.571586 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570941 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 14:56:52.571586 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570943 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 14:56:52.571586 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570946 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 14:56:52.571586 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570948 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 14:56:52.571586 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570951 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 14:56:52.571586 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570953 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 14:56:52.571586 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570955 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 14:56:52.571586 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570959 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 14:56:52.571586 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570961 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 14:56:52.571586 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570964 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 14:56:52.571586 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570966 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 14:56:52.571586 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570969 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 14:56:52.571586 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570971 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 14:56:52.572145 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570974 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 14:56:52.572145 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570976 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 14:56:52.572145 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570985 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 14:56:52.572145 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570988 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 14:56:52.572145 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570991 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 14:56:52.572145 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570994 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 14:56:52.572145 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.570998 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 14:56:52.572145 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571000 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 14:56:52.572145 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571003 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 14:56:52.572145 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571005 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 14:56:52.572145 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571008 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 14:56:52.572145 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571011 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 14:56:52.572145 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571013 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 14:56:52.572145 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571016 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 14:56:52.572145 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571018 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 14:56:52.572145 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571027 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 14:56:52.572145 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571030 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 14:56:52.572145 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571032 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 14:56:52.572145 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571034 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 14:56:52.572594 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571037 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 14:56:52.572594 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571039 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 14:56:52.572594 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571042 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 14:56:52.572594 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571044 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 14:56:52.572594 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571046 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 14:56:52.572594 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571049 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 14:56:52.572594 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571053 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 14:56:52.572594 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571057 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 14:56:52.572594 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571060 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 14:56:52.572594 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571063 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 14:56:52.572594 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571065 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 14:56:52.572594 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571068 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 14:56:52.572594 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571070 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 14:56:52.572594 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571073 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 14:56:52.572594 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571075 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 14:56:52.572594 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571078 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 14:56:52.572594 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571505 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 14:56:52.572594 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571512 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 14:56:52.572594 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571515 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 14:56:52.572594 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571518 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 14:56:52.573066 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571521 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 14:56:52.573066 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571523 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 14:56:52.573066 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571526 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 14:56:52.573066 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571529 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 14:56:52.573066 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571532 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 14:56:52.573066 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571535 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 14:56:52.573066 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571537 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 14:56:52.573066 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571539 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 14:56:52.573066 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571543 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 14:56:52.573066 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571545 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 14:56:52.573066 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571548 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 14:56:52.573066 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571550 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 14:56:52.573066 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571553 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 14:56:52.573066 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571555 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 14:56:52.573066 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571558 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 14:56:52.573066 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571560 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 14:56:52.573066 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571562 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 14:56:52.573066 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571564 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 14:56:52.573066 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571567 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 14:56:52.573522 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571571 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 14:56:52.573522 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571573 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 14:56:52.573522 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571576 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 14:56:52.573522 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571579 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 14:56:52.573522 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571581 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 14:56:52.573522 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571584 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 14:56:52.573522 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571586 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 14:56:52.573522 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571589 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 14:56:52.573522 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571592 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 14:56:52.573522 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571595 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 14:56:52.573522 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571597 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 14:56:52.573522 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571600 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 14:56:52.573522 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571602 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 14:56:52.573522 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571604 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 14:56:52.573522 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571608 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 14:56:52.573522 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571610 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 14:56:52.573522 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571612 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 14:56:52.573522 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571615 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 14:56:52.573522 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571617 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 14:56:52.573522 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571619 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 14:56:52.574008 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571622 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 14:56:52.574008 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571624 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 14:56:52.574008 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571630 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 14:56:52.574008 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571632 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 14:56:52.574008 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571636 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 14:56:52.574008 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571640 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 14:56:52.574008 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571642 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 14:56:52.574008 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571645 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 14:56:52.574008 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571648 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 14:56:52.574008 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571651 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 14:56:52.574008 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571653 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 14:56:52.574008 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571656 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 14:56:52.574008 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571659 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 14:56:52.574008 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571662 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 14:56:52.574008 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571664 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 14:56:52.574008 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571667 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 14:56:52.574008 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571670 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 14:56:52.574008 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571674 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 14:56:52.574008 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571676 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 14:56:52.574521 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571679 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 14:56:52.574521 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571681 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 14:56:52.574521 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571683 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 14:56:52.574521 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571687 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 14:56:52.574521 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571689 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 14:56:52.574521 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571691 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 14:56:52.574521 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571694 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 14:56:52.574521 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571696 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 14:56:52.574521 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571699 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 14:56:52.574521 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571701 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 14:56:52.574521 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571703 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 14:56:52.574521 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571706 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 14:56:52.574521 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571708 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 14:56:52.574521 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571711 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 14:56:52.574521 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571713 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 14:56:52.574521 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571715 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 14:56:52.574521 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571720 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 14:56:52.574521 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571723 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 14:56:52.574521 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571725 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 14:56:52.575078 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571728 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 14:56:52.575078 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571731 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 14:56:52.575078 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571733 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 14:56:52.575078 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571735 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 14:56:52.575078 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.571738 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 14:56:52.575078 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571804 2569 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 14:56:52.575078 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571815 2569 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 14:56:52.575078 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571822 2569 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 14:56:52.575078 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571828 2569 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 14:56:52.575078 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571835 2569 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 14:56:52.575078 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571840 2569 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 14:56:52.575078 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571846 2569 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 14:56:52.575078 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571851 2569 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 14:56:52.575078 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571855 2569 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 14:56:52.575078 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571858 2569 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 14:56:52.575078 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571861 2569 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 14:56:52.575078 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571864 2569 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 14:56:52.575078 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571867 2569 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 14:56:52.575078 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571870 2569 flags.go:64] FLAG: --cgroup-root=""
Apr 23 14:56:52.575078 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571873 2569 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 14:56:52.575078 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571876 2569 flags.go:64] FLAG: --client-ca-file=""
Apr 23 14:56:52.575078 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571878 2569 flags.go:64] FLAG: --cloud-config=""
Apr 23 14:56:52.575078 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571881 2569 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 14:56:52.575078 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571884 2569 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 14:56:52.575078 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571889 2569 flags.go:64] FLAG: --cluster-domain=""
Apr 23 14:56:52.575670 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571892 2569 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 14:56:52.575670 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571895 2569 flags.go:64] FLAG: --config-dir=""
Apr 23 14:56:52.575670 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571897 2569 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 14:56:52.575670 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571901 2569 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 14:56:52.575670 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571904 2569 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 14:56:52.575670 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571911 2569 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 14:56:52.575670 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571914 2569 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 14:56:52.575670 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571918 2569 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 14:56:52.575670 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571921 2569 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 14:56:52.575670 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571924 2569 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 14:56:52.575670 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571928 2569 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 14:56:52.575670 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571931 2569 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 14:56:52.575670 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571934 2569 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 14:56:52.575670 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571938 2569 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 14:56:52.575670 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571941 2569 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 14:56:52.575670 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571944 2569 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 14:56:52.575670 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571947 2569 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 14:56:52.575670 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571950 2569 flags.go:64] FLAG: --enable-server="true"
Apr 23 14:56:52.575670 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571953 2569 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 14:56:52.575670 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571959 2569 flags.go:64] FLAG: --event-burst="100"
Apr 23 14:56:52.575670 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571962 2569 flags.go:64] FLAG: --event-qps="50"
Apr 23 14:56:52.575670 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571965 2569 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 14:56:52.575670 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571968 2569 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 14:56:52.575670 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571971 2569 flags.go:64] FLAG: --eviction-hard=""
Apr 23 14:56:52.575670 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571975 2569 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 14:56:52.576258 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571977 2569 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 14:56:52.576258 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571981 2569 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 14:56:52.576258 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571984 2569 flags.go:64] FLAG: --eviction-soft=""
Apr 23 14:56:52.576258 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571987 2569 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 14:56:52.576258 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571990 2569 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 14:56:52.576258 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571992 2569 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 14:56:52.576258 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571995 2569 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 14:56:52.576258 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.571998 2569 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 14:56:52.576258 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572001 2569 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 14:56:52.576258 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572003 2569 flags.go:64] FLAG: --feature-gates=""
Apr 23 14:56:52.576258 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572009 2569 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 23 14:56:52.576258 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572013 2569 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 23 14:56:52.576258 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572015 2569 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 23 14:56:52.576258 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572025 2569 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 23 14:56:52.576258 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572029 2569 flags.go:64] FLAG: --healthz-port="10248" Apr 23 14:56:52.576258 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572033 2569 flags.go:64] FLAG: --help="false" Apr 23 14:56:52.576258 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572036 2569 flags.go:64] FLAG: --hostname-override="ip-10-0-143-199.ec2.internal" Apr 23 14:56:52.576258 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572039 2569 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 14:56:52.576258 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572042 2569 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 14:56:52.576258 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572044 2569 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 14:56:52.576258 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572048 2569 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 14:56:52.576258 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572051 2569 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 14:56:52.576258 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572054 2569 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 14:56:52.576258 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572057 2569 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 14:56:52.576822 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572059 2569 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 14:56:52.576822 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572062 
2569 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 14:56:52.576822 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572065 2569 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 14:56:52.576822 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572068 2569 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 14:56:52.576822 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572071 2569 flags.go:64] FLAG: --kube-reserved="" Apr 23 14:56:52.576822 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572074 2569 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 14:56:52.576822 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572077 2569 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 14:56:52.576822 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572091 2569 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 14:56:52.576822 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572095 2569 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 14:56:52.576822 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572111 2569 flags.go:64] FLAG: --lock-file="" Apr 23 14:56:52.576822 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572114 2569 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 14:56:52.576822 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572117 2569 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 14:56:52.576822 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572120 2569 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 14:56:52.576822 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572125 2569 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 14:56:52.576822 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572128 2569 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 14:56:52.576822 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572131 2569 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 14:56:52.576822 
ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572134 2569 flags.go:64] FLAG: --logging-format="text" Apr 23 14:56:52.576822 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572137 2569 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 14:56:52.576822 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572140 2569 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 14:56:52.576822 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572143 2569 flags.go:64] FLAG: --manifest-url="" Apr 23 14:56:52.576822 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572146 2569 flags.go:64] FLAG: --manifest-url-header="" Apr 23 14:56:52.576822 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572150 2569 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 14:56:52.576822 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572160 2569 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 14:56:52.576822 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572164 2569 flags.go:64] FLAG: --max-pods="110" Apr 23 14:56:52.576822 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572167 2569 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 14:56:52.577425 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572170 2569 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 14:56:52.577425 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572173 2569 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 14:56:52.577425 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572176 2569 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 14:56:52.577425 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572179 2569 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 14:56:52.577425 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572182 2569 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 14:56:52.577425 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572185 2569 
flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 14:56:52.577425 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572192 2569 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 14:56:52.577425 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572195 2569 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 14:56:52.577425 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572198 2569 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 14:56:52.577425 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572201 2569 flags.go:64] FLAG: --pod-cidr="" Apr 23 14:56:52.577425 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572204 2569 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 14:56:52.577425 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572210 2569 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 14:56:52.577425 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572212 2569 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 14:56:52.577425 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572215 2569 flags.go:64] FLAG: --pods-per-core="0" Apr 23 14:56:52.577425 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572218 2569 flags.go:64] FLAG: --port="10250" Apr 23 14:56:52.577425 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572222 2569 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 14:56:52.577425 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572224 2569 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-08c9e378f44b27359" Apr 23 14:56:52.577425 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572227 2569 flags.go:64] FLAG: --qos-reserved="" Apr 23 14:56:52.577425 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572230 2569 flags.go:64] FLAG: --read-only-port="10255" Apr 23 14:56:52.577425 ip-10-0-143-199 kubenswrapper[2569]: 
I0423 14:56:52.572233 2569 flags.go:64] FLAG: --register-node="true" Apr 23 14:56:52.577425 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572242 2569 flags.go:64] FLAG: --register-schedulable="true" Apr 23 14:56:52.577425 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572245 2569 flags.go:64] FLAG: --register-with-taints="" Apr 23 14:56:52.577425 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572248 2569 flags.go:64] FLAG: --registry-burst="10" Apr 23 14:56:52.577425 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572251 2569 flags.go:64] FLAG: --registry-qps="5" Apr 23 14:56:52.577988 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572254 2569 flags.go:64] FLAG: --reserved-cpus="" Apr 23 14:56:52.577988 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572257 2569 flags.go:64] FLAG: --reserved-memory="" Apr 23 14:56:52.577988 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572260 2569 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 14:56:52.577988 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572263 2569 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 14:56:52.577988 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572266 2569 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 14:56:52.577988 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572269 2569 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 14:56:52.577988 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572275 2569 flags.go:64] FLAG: --runonce="false" Apr 23 14:56:52.577988 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572279 2569 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 14:56:52.577988 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572282 2569 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 14:56:52.577988 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572285 2569 flags.go:64] FLAG: --seccomp-default="false" Apr 23 14:56:52.577988 ip-10-0-143-199 
kubenswrapper[2569]: I0423 14:56:52.572288 2569 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 14:56:52.577988 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572290 2569 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 14:56:52.577988 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572293 2569 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 14:56:52.577988 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572297 2569 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 14:56:52.577988 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572300 2569 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 14:56:52.577988 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572302 2569 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 14:56:52.577988 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572305 2569 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 14:56:52.577988 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572308 2569 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 14:56:52.577988 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572311 2569 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 14:56:52.577988 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572314 2569 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 14:56:52.577988 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572317 2569 flags.go:64] FLAG: --system-cgroups="" Apr 23 14:56:52.577988 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572320 2569 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 14:56:52.577988 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572325 2569 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 14:56:52.577988 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572328 2569 flags.go:64] FLAG: --tls-cert-file="" Apr 23 14:56:52.577988 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572330 2569 
flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 14:56:52.578602 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572337 2569 flags.go:64] FLAG: --tls-min-version="" Apr 23 14:56:52.578602 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572340 2569 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 14:56:52.578602 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572343 2569 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 14:56:52.578602 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572345 2569 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 14:56:52.578602 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572348 2569 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 14:56:52.578602 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572351 2569 flags.go:64] FLAG: --v="2" Apr 23 14:56:52.578602 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572355 2569 flags.go:64] FLAG: --version="false" Apr 23 14:56:52.578602 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572359 2569 flags.go:64] FLAG: --vmodule="" Apr 23 14:56:52.578602 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572363 2569 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 14:56:52.578602 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.572366 2569 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 14:56:52.578602 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572478 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 14:56:52.578602 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572482 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 14:56:52.578602 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572486 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 14:56:52.578602 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572492 2569 feature_gate.go:328] unrecognized 
feature gate: VolumeGroupSnapshot Apr 23 14:56:52.578602 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572496 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 14:56:52.578602 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572499 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 14:56:52.578602 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572502 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 14:56:52.578602 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572505 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 14:56:52.578602 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572508 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 14:56:52.578602 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572511 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 14:56:52.578602 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572514 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 14:56:52.578602 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572516 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 14:56:52.579171 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572519 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 14:56:52.579171 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572521 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 14:56:52.579171 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572524 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 14:56:52.579171 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572528 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 23 14:56:52.579171 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572531 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 14:56:52.579171 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572534 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 14:56:52.579171 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572537 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 14:56:52.579171 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572540 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 14:56:52.579171 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572542 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 14:56:52.579171 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572545 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 14:56:52.579171 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572547 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 14:56:52.579171 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572550 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 14:56:52.579171 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572552 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 14:56:52.579171 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572557 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 14:56:52.579171 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572560 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 14:56:52.579171 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572564 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 14:56:52.579171 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572567 2569 feature_gate.go:328] 
unrecognized feature gate: Example Apr 23 14:56:52.579171 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572569 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 14:56:52.579171 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572572 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 14:56:52.579171 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572574 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 14:56:52.579659 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572577 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 14:56:52.579659 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572579 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 14:56:52.579659 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572582 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 14:56:52.579659 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572585 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 14:56:52.579659 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572588 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 14:56:52.579659 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572596 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 14:56:52.579659 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572598 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 14:56:52.579659 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572601 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 14:56:52.579659 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572604 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 14:56:52.579659 ip-10-0-143-199 kubenswrapper[2569]: W0423 
14:56:52.572607 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 14:56:52.579659 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572609 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 14:56:52.579659 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572612 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 14:56:52.579659 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572615 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 14:56:52.579659 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572618 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 14:56:52.579659 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572621 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 14:56:52.579659 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572623 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 14:56:52.579659 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572626 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 14:56:52.579659 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572628 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 14:56:52.579659 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572631 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 14:56:52.580168 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572633 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 14:56:52.580168 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572636 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 14:56:52.580168 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572638 2569 feature_gate.go:328] unrecognized feature gate: 
ExternalOIDC Apr 23 14:56:52.580168 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572641 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 14:56:52.580168 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572643 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 14:56:52.580168 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572645 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 14:56:52.580168 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572648 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 14:56:52.580168 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572650 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 14:56:52.580168 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572654 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 14:56:52.580168 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572656 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 14:56:52.580168 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572659 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 14:56:52.580168 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572661 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 14:56:52.580168 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572664 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 14:56:52.580168 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572666 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 14:56:52.580168 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572669 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 14:56:52.580168 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572671 2569 feature_gate.go:328] 
unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 14:56:52.580168 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572675 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 14:56:52.580168 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572678 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 14:56:52.580168 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572680 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 14:56:52.580168 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572688 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 14:56:52.580639 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572691 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 14:56:52.580639 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572693 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 14:56:52.580639 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572696 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 14:56:52.580639 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572698 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 14:56:52.580639 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572701 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 14:56:52.580639 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572703 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 14:56:52.580639 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572705 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 14:56:52.580639 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572708 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 
23 14:56:52.580639 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572710 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 14:56:52.580639 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572713 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 14:56:52.580639 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572715 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 14:56:52.580639 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572718 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 14:56:52.580639 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572720 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 14:56:52.580639 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572722 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 14:56:52.580639 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.572725 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 14:56:52.580994 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.573785 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 14:56:52.580994 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.580299 2569 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 23 14:56:52.580994 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.580401 2569 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 23 14:56:52.580994 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580451 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 14:56:52.580994 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580456 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 14:56:52.580994 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580460 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 14:56:52.580994 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580463 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 14:56:52.580994 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580466 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 14:56:52.580994 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580468 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 14:56:52.580994 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580471 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 14:56:52.580994 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580474 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 14:56:52.580994 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580476 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 14:56:52.580994 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580479 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 14:56:52.580994 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580481 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 14:56:52.580994 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580484 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 14:56:52.581382 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580488 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 14:56:52.581382 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580493 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 14:56:52.581382 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580497 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 14:56:52.581382 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580499 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 14:56:52.581382 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580502 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 14:56:52.581382 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580505 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 14:56:52.581382 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580507 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 14:56:52.581382 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580510 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 14:56:52.581382 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580512 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 14:56:52.581382 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580515 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 14:56:52.581382 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580517 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 14:56:52.581382 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580520 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 14:56:52.581382 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580523 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 14:56:52.581382 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580526 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 14:56:52.581382 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580528 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 14:56:52.581382 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580530 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 14:56:52.581382 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580533 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 14:56:52.581382 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580535 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 14:56:52.581382 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580538 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 14:56:52.581382 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580541 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 14:56:52.581859 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580543 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 14:56:52.581859 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580546 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 14:56:52.581859 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580548 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 14:56:52.581859 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580551 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 14:56:52.581859 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580554 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 14:56:52.581859 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580557 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 14:56:52.581859 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580560 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 14:56:52.581859 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580563 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 14:56:52.581859 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580566 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 14:56:52.581859 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580568 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 14:56:52.581859 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580570 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 14:56:52.581859 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580573 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 14:56:52.581859 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580575 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 14:56:52.581859 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580578 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 14:56:52.581859 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580581 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 14:56:52.581859 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580583 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 14:56:52.581859 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580586 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 14:56:52.581859 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580588 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 14:56:52.581859 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580591 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 14:56:52.581859 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580593 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 14:56:52.582355 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580596 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 14:56:52.582355 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580598 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 14:56:52.582355 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580600 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 14:56:52.582355 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580603 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 14:56:52.582355 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580605 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 14:56:52.582355 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580608 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 14:56:52.582355 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580610 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 14:56:52.582355 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580613 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 14:56:52.582355 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580615 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 14:56:52.582355 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580618 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 14:56:52.582355 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580621 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 14:56:52.582355 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580623 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 14:56:52.582355 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580626 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 14:56:52.582355 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580628 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 14:56:52.582355 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580631 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 14:56:52.582355 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580633 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 14:56:52.582355 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580635 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 14:56:52.582355 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580638 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 14:56:52.582355 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580640 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 14:56:52.582355 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580643 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 14:56:52.582834 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580646 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 14:56:52.582834 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580648 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 14:56:52.582834 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580651 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 14:56:52.582834 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580653 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 14:56:52.582834 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580656 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 14:56:52.582834 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580659 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 14:56:52.582834 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580661 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 14:56:52.582834 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580664 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 14:56:52.582834 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580666 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 14:56:52.582834 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580668 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 14:56:52.582834 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580671 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 14:56:52.582834 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580674 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 14:56:52.582834 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580676 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 14:56:52.582834 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580679 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 14:56:52.582834 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.580683 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 14:56:52.583348 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580775 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 14:56:52.583348 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580779 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 14:56:52.583348 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580782 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 14:56:52.583348 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580785 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 14:56:52.583348 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580787 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 14:56:52.583348 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580790 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 14:56:52.583348 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580792 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 14:56:52.583348 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580795 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 14:56:52.583348 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580798 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 14:56:52.583348 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580800 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 14:56:52.583348 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580803 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 14:56:52.583348 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580806 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 14:56:52.583348 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580808 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 14:56:52.583348 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580811 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 14:56:52.583348 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580813 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 14:56:52.583348 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580816 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 14:56:52.583348 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580819 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 14:56:52.583348 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580821 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 14:56:52.583348 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580823 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 14:56:52.583348 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580826 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 14:56:52.583832 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580828 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 14:56:52.583832 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580831 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 14:56:52.583832 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580833 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 14:56:52.583832 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580835 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 14:56:52.583832 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580838 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 14:56:52.583832 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580842 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 14:56:52.583832 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580844 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 14:56:52.583832 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580847 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 14:56:52.583832 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580849 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 14:56:52.583832 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580853 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 14:56:52.583832 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580856 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 14:56:52.583832 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580859 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 14:56:52.583832 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580862 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 14:56:52.583832 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580864 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 14:56:52.583832 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580867 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 14:56:52.583832 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580869 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 14:56:52.583832 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580871 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 14:56:52.583832 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580874 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 14:56:52.583832 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580876 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 14:56:52.583832 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580878 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 14:56:52.584329 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580881 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 14:56:52.584329 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580884 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 14:56:52.584329 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580887 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 14:56:52.584329 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580889 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 14:56:52.584329 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580892 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 14:56:52.584329 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580894 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 14:56:52.584329 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580896 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 14:56:52.584329 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580899 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 14:56:52.584329 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580902 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 14:56:52.584329 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580904 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 14:56:52.584329 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580906 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 14:56:52.584329 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580910 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 14:56:52.584329 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580913 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 14:56:52.584329 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580916 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 14:56:52.584329 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580919 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 14:56:52.584329 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580921 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 14:56:52.584329 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580923 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 14:56:52.584329 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580925 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 14:56:52.584329 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580928 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 14:56:52.584767 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580931 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 14:56:52.584767 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580933 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 14:56:52.584767 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580936 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 14:56:52.584767 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580938 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 14:56:52.584767 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580941 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 14:56:52.584767 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580943 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 14:56:52.584767 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580945 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 14:56:52.584767 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580948 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 14:56:52.584767 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580950 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 14:56:52.584767 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580953 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 14:56:52.584767 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580955 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 14:56:52.584767 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580957 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 14:56:52.584767 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580960 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 14:56:52.584767 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580962 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 14:56:52.584767 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580965 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 14:56:52.584767 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580967 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 14:56:52.584767 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580970 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 14:56:52.584767 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580972 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 14:56:52.584767 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580974 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 14:56:52.584767 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580977 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 14:56:52.585261 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580979 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 14:56:52.585261 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580982 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 14:56:52.585261 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580984 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 14:56:52.585261 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580986 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 14:56:52.585261 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580989 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 14:56:52.585261 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580991 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 14:56:52.585261 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:52.580993 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 14:56:52.585261 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.580998 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 14:56:52.585261 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.581749 2569 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 23 14:56:52.585261 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.585243 2569 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 23 14:56:52.586173 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.586162 2569 server.go:1019] "Starting client certificate rotation"
Apr 23 14:56:52.586290 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.586270 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 14:56:52.586350 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.586328 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 14:56:52.614791 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.614763 2569 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 14:56:52.619461 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.619436 2569 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 14:56:52.633370 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.633347 2569 log.go:25] "Validated CRI v1 runtime API"
Apr 23 14:56:52.638764 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.638746 2569 log.go:25] "Validated CRI v1 image API"
Apr 23 14:56:52.639894 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.639879 2569 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 23 14:56:52.645144 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.645122 2569 fs.go:135] Filesystem UUIDs: map[5389c113-0656-4884-8e4a-8f8920358308:/dev/nvme0n1p4 6feaee74-ec44-4c71-bc09-83910331c3d2:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 23 14:56:52.645227 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.645143 2569 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 23 14:56:52.650734 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.650716 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 14:56:52.651230 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.651114 2569 manager.go:217] Machine: {Timestamp:2026-04-23 14:56:52.648781635 +0000 UTC m=+0.442134791 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3199668 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2775b2b8384651c943a66a2e84ea33 SystemUUID:ec2775b2-b838-4651-c943-a66a2e84ea33 BootID:7d176209-c492-4afb-86c1-a2ce5f7de2a3 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:c1:b9:d2:07:e3 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:c1:b9:d2:07:e3 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:52:a5:c3:23:03:93 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 14:56:52.651230 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.651225 2569 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 23 14:56:52.651378 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.651344 2569 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 14:56:52.652301 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.652273 2569 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 14:56:52.652456 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.652303 2569 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-143-199.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 23 14:56:52.652527 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.652470 2569 topology_manager.go:138] "Creating topology manager with none policy"
Apr 23 14:56:52.652527 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.652483 2569 container_manager_linux.go:306] "Creating device plugin manager"
Apr 23 14:56:52.652527 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.652502 2569 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 14:56:52.652527 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.652521 2569 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 14:56:52.653822 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.653808 2569 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 14:56:52.653949 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.653938 2569 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 23 14:56:52.657162 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.657094 2569 kubelet.go:491] "Attempting to sync node with API server"
Apr 23 14:56:52.657228 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.657170 2569 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 23 14:56:52.657228 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.657188 2569 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 23 14:56:52.657228 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.657205 2569 kubelet.go:397] "Adding apiserver pod source"
Apr 23 14:56:52.657228 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.657219 2569 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 23 14:56:52.658164 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.658152 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 14:56:52.658234 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.658176 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 14:56:52.661639 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.661621 2569 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 23 14:56:52.666307 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.666283 2569 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 23 14:56:52.667653 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.667639 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 23 14:56:52.667653 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.667658 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 23 14:56:52.667764 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.667666 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 23 14:56:52.667764 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.667697 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 23 14:56:52.667764 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.667714 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 23 14:56:52.667764 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.667722 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 23 14:56:52.667764 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.667729 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 23 14:56:52.667764 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.667736 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 23 14:56:52.667764 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.667745 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 23 14:56:52.667764 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.667754 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 23 14:56:52.667969 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.667772 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 23 14:56:52.667969 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.667786 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 23 14:56:52.668679 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.668667 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 23 14:56:52.668720 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.668689 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 23 14:56:52.671544 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:52.671516 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-143-199.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 23 14:56:52.671544 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.671540 2569 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-143-199.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 14:56:52.671655 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:52.671564 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 23 14:56:52.672211 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.672197 2569 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 23 14:56:52.672263 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.672236 2569 server.go:1295] "Started kubelet"
Apr 23 14:56:52.672365 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.672316 2569 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 23 14:56:52.672425 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.672363 2569 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 23 14:56:52.672425 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.672420 2569 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 23 14:56:52.673150 ip-10-0-143-199 systemd[1]: Started Kubernetes Kubelet.
Apr 23 14:56:52.673275 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.673197 2569 server.go:317] "Adding debug handlers to kubelet server"
Apr 23 14:56:52.675926 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.675910 2569 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 23 14:56:52.678840 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:52.677702 2569 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-199.ec2.internal.18a90442bdc02aa0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-199.ec2.internal,UID:ip-10-0-143-199.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-143-199.ec2.internal,},FirstTimestamp:2026-04-23 14:56:52.672211616 +0000 UTC m=+0.465564776,LastTimestamp:2026-04-23 14:56:52.672211616 +0000 UTC m=+0.465564776,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-199.ec2.internal,}"
Apr 23 14:56:52.682256 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.682237 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 23 14:56:52.682789 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.682773 2569 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 23 14:56:52.683473 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.683445 2569 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 23 14:56:52.683473 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.683447 2569 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 23 14:56:52.683473 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.683475 2569 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 23 14:56:52.683677 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:52.683537 2569 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 23 14:56:52.683677 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.683605 2569 reconstruct.go:97] "Volume reconstruction finished"
Apr 23 14:56:52.683677 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.683612 2569 reconciler.go:26] "Reconciler: start to sync state"
Apr 23 14:56:52.683677 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:52.683651 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-199.ec2.internal\" not found"
Apr 23 14:56:52.684186 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.684172 2569 factory.go:55] Registering systemd factory
Apr 23 14:56:52.684269 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.684219 2569 factory.go:223] Registration of the systemd container factory successfully
Apr 23 14:56:52.684400 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.684389 2569 factory.go:153] Registering CRI-O factory
Apr 23 14:56:52.684400 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.684400 2569 factory.go:223] Registration of the crio container factory successfully
Apr 23 14:56:52.684501 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.684443 2569 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 23 14:56:52.684501 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.684461 2569 factory.go:103] Registering Raw factory
Apr 23 14:56:52.684501 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.684471 2569 manager.go:1196] Started watching for new ooms in manager
Apr 23 14:56:52.684886 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.684873 2569 manager.go:319] Starting recovery of all containers
Apr 23 14:56:52.690552 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.690527 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 23 14:56:52.693647 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.693629 2569 manager.go:324] Recovery completed
Apr 23 14:56:52.694489 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.694465 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-nwjh7"
Apr 23 14:56:52.695492 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:52.695379 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 23 14:56:52.697646 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:52.697617 2569 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-143-199.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 23 14:56:52.699884 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.699870 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 14:56:52.702281 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.702265 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-199.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 14:56:52.702346 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.702300 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-199.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 14:56:52.702346 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.702316 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-199.ec2.internal" event="NodeHasSufficientPID"
Apr 23 14:56:52.702516 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.702496 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-nwjh7"
Apr 23 14:56:52.702883 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.702867 2569 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 23 14:56:52.702883 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.702884 2569 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 23 14:56:52.702991 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.702902 2569 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 14:56:52.705180 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.705166 2569 policy_none.go:49] "None policy: Start"
Apr 23 14:56:52.705241 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.705186 2569 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 23 14:56:52.705241 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.705200 2569 state_mem.go:35] "Initializing new in-memory state store"
Apr 23 14:56:52.750825 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.750809 2569 manager.go:341] "Starting Device Plugin manager"
Apr 23 14:56:52.772078 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:52.750864 2569 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 23 14:56:52.772078 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.750874 2569 server.go:85] "Starting device plugin registration server"
Apr 23 14:56:52.772078 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.751139 2569 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 23 14:56:52.772078 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.751153 2569 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 23 14:56:52.772078 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.751254 2569 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 23 14:56:52.772078 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.751325 2569 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 23 14:56:52.772078 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.751332 2569 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 23 14:56:52.772078 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:52.752090 2569 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 23 14:56:52.772078 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:52.752143 2569 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-143-199.ec2.internal\" not found"
Apr 23 14:56:52.772078 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.762249 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 23 14:56:52.772078 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.762276 2569 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 23 14:56:52.772078 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.762292 2569 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 23 14:56:52.772078 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.762298 2569 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 23 14:56:52.772078 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:52.762330 2569 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 23 14:56:52.772078 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.766607 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 14:56:52.852052 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.851979 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 14:56:52.853198 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.853182 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-199.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 14:56:52.853262 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.853213 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-199.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 14:56:52.853262 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.853225 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-199.ec2.internal" event="NodeHasSufficientPID"
Apr 23 14:56:52.853262 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.853248 2569 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-143-199.ec2.internal"
Apr 23 14:56:52.863147 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.863125 2569 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-199.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-143-199.ec2.internal"]
Apr 23 14:56:52.863222 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.863207 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 14:56:52.863918 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.863898 2569 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-143-199.ec2.internal"
Apr 23 14:56:52.864027 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:52.863923 2569 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-143-199.ec2.internal\": node \"ip-10-0-143-199.ec2.internal\" not found"
Apr 23 14:56:52.864118 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.864091 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-199.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 14:56:52.864184 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.864133 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-199.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 14:56:52.864184 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.864147 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-199.ec2.internal" event="NodeHasSufficientPID"
Apr 23 14:56:52.866263 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.866249 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 14:56:52.866413 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.866398 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-199.ec2.internal"
Apr 23 14:56:52.866459 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.866429 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 14:56:52.867140 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.867126 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-199.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 14:56:52.867192 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.867132 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-199.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 14:56:52.867192 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.867169 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-199.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 14:56:52.867192 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.867179 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-199.ec2.internal" event="NodeHasSufficientPID"
Apr 23 14:56:52.867281 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.867153 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-199.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 14:56:52.867281 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.867219 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-199.ec2.internal" event="NodeHasSufficientPID"
Apr 23 14:56:52.869188 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.869175 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-199.ec2.internal"
Apr 23 14:56:52.869246 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.869200 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 14:56:52.869959 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.869945 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-199.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 14:56:52.870006 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.869974 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-199.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 14:56:52.870006 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.869985 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-199.ec2.internal" event="NodeHasSufficientPID"
Apr 23 14:56:52.884906 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:52.884890 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-199.ec2.internal\" not found"
Apr 23 14:56:52.894758 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:52.894745 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-199.ec2.internal\" not found" node="ip-10-0-143-199.ec2.internal"
Apr 23 14:56:52.898999 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:52.898984 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-199.ec2.internal\" not found" node="ip-10-0-143-199.ec2.internal"
Apr 23 14:56:52.985007 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:52.984980 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-199.ec2.internal\" not found"
Apr 23 14:56:52.985203 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.985048 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1dd5991561291eb60cf7712bf981919b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-199.ec2.internal\" (UID: \"1dd5991561291eb60cf7712bf981919b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-199.ec2.internal"
Apr 23 14:56:52.985203 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.985072 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1dd5991561291eb60cf7712bf981919b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-199.ec2.internal\" (UID: \"1dd5991561291eb60cf7712bf981919b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-199.ec2.internal"
Apr 23 14:56:52.985203 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:52.985089 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bff84cbb0dcf4f847d21e0c35b203751-config\") pod \"kube-apiserver-proxy-ip-10-0-143-199.ec2.internal\" (UID: \"bff84cbb0dcf4f847d21e0c35b203751\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-199.ec2.internal"
Apr 23 14:56:53.085847 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:53.085820 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-199.ec2.internal\" not found"
Apr 23 14:56:53.085924 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:53.085889 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1dd5991561291eb60cf7712bf981919b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-199.ec2.internal\" (UID: \"1dd5991561291eb60cf7712bf981919b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-199.ec2.internal"
Apr 23 14:56:53.085924 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:53.085915 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bff84cbb0dcf4f847d21e0c35b203751-config\") pod \"kube-apiserver-proxy-ip-10-0-143-199.ec2.internal\" (UID: \"bff84cbb0dcf4f847d21e0c35b203751\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-199.ec2.internal"
Apr 23 14:56:53.086012 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:53.085932 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1dd5991561291eb60cf7712bf981919b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-199.ec2.internal\" (UID: \"1dd5991561291eb60cf7712bf981919b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-199.ec2.internal"
Apr 23 14:56:53.086012 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:53.085987 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1dd5991561291eb60cf7712bf981919b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-199.ec2.internal\" (UID: \"1dd5991561291eb60cf7712bf981919b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-199.ec2.internal"
Apr 23 14:56:53.086092 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:53.086012 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bff84cbb0dcf4f847d21e0c35b203751-config\") pod \"kube-apiserver-proxy-ip-10-0-143-199.ec2.internal\" (UID: \"bff84cbb0dcf4f847d21e0c35b203751\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-199.ec2.internal"
Apr 23 14:56:53.086092 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:53.085990 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1dd5991561291eb60cf7712bf981919b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-199.ec2.internal\" (UID: \"1dd5991561291eb60cf7712bf981919b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-199.ec2.internal"
Apr 23 14:56:53.186554 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:53.186479 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-199.ec2.internal\" not found"
Apr 23 14:56:53.197662 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:53.197640 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-199.ec2.internal"
Apr 23 14:56:53.201083 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:53.201063 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-199.ec2.internal"
Apr 23 14:56:53.286685 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:53.286651 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-199.ec2.internal\" not found"
Apr 23 14:56:53.387024 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:53.386991 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-199.ec2.internal\" not found"
Apr 23 14:56:53.487418 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:53.487343 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-199.ec2.internal\" not found"
Apr 23 14:56:53.586862 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:53.586835 2569 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 14:56:53.587522 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:53.587003 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 14:56:53.587919 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:53.587892 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-199.ec2.internal\" not found"
Apr 23 14:56:53.648272 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:53.648250 2569 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 14:56:53.682878 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:53.682853 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 23 14:56:53.688094 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:53.688073 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-199.ec2.internal\" not found"
Apr 23 14:56:53.703831 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:53.703794 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 14:51:52 +0000 UTC" deadline="2027-10-17 15:39:44.450287161 +0000 UTC"
Apr 23 14:56:53.703831 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:53.703826 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13008h42m50.746463935s"
Apr 23 14:56:53.714479 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:53.714461 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 14:56:53.717749 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:53.717726 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbff84cbb0dcf4f847d21e0c35b203751.slice/crio-eebae69c5b14e4cee14f30143a46f12c6fbb9a3b169ecf8d640825a12270abff WatchSource:0}: Error finding container eebae69c5b14e4cee14f30143a46f12c6fbb9a3b169ecf8d640825a12270abff: Status 404 returned error can't find the container with id eebae69c5b14e4cee14f30143a46f12c6fbb9a3b169ecf8d640825a12270abff
Apr 23 14:56:53.718251 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:53.718233 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1dd5991561291eb60cf7712bf981919b.slice/crio-1940e2db81df1fdcdc6f0a496d80be2cd3cf839c161bc76f619b3ed5a4f3f6f5 WatchSource:0}: Error finding container 1940e2db81df1fdcdc6f0a496d80be2cd3cf839c161bc76f619b3ed5a4f3f6f5: Status 404 returned error can't find the container with id 1940e2db81df1fdcdc6f0a496d80be2cd3cf839c161bc76f619b3ed5a4f3f6f5
Apr 23 14:56:53.722636 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:53.722622 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 14:56:53.734461 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:53.734441 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-hxqbh"
Apr 23 14:56:53.743981 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:53.743936 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-hxqbh"
Apr 23 14:56:53.765594 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:53.765553 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-199.ec2.internal" event={"ID":"1dd5991561291eb60cf7712bf981919b","Type":"ContainerStarted","Data":"1940e2db81df1fdcdc6f0a496d80be2cd3cf839c161bc76f619b3ed5a4f3f6f5"}
Apr 23 14:56:53.766482 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:53.766464 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-199.ec2.internal" event={"ID":"bff84cbb0dcf4f847d21e0c35b203751","Type":"ContainerStarted","Data":"eebae69c5b14e4cee14f30143a46f12c6fbb9a3b169ecf8d640825a12270abff"}
Apr 23 14:56:53.788811 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:53.788790 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-199.ec2.internal\" not found"
Apr 23 14:56:53.889189 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:53.889163 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-199.ec2.internal\" not found"
Apr 23 14:56:53.989596 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:53.989549 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-199.ec2.internal\" not found"
Apr 23 14:56:53.996698 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:53.996627 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 14:56:54.090337 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:54.090298 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-199.ec2.internal\" not found"
Apr 23 14:56:54.190922 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:54.190886 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-199.ec2.internal\" not found"
Apr 23 14:56:54.210068 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.210036 2569 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 14:56:54.283451 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.283370 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-199.ec2.internal"
Apr 23
14:56:54.298930 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.298894 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 14:56:54.299837 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.299816 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-199.ec2.internal" Apr 23 14:56:54.309824 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.309806 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 14:56:54.657866 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.657789 2569 apiserver.go:52] "Watching apiserver" Apr 23 14:56:54.667015 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.666984 2569 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 23 14:56:54.667418 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.667395 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-x9gsg","openshift-network-operator/iptables-alerter-84b9q","openshift-ovn-kubernetes/ovnkube-node-wgqdj","kube-system/konnectivity-agent-h6s52","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wnzd","openshift-cluster-node-tuning-operator/tuned-7bjrs","openshift-image-registry/node-ca-bvjj4","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-199.ec2.internal","openshift-multus/multus-additional-cni-plugins-jnxqk","openshift-multus/multus-s4q5x","openshift-network-diagnostics/network-check-target-6w94x","kube-system/kube-apiserver-proxy-ip-10-0-143-199.ec2.internal"] Apr 23 14:56:54.672474 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.672448 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-h6s52" Apr 23 14:56:54.674675 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.674650 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7bjrs" Apr 23 14:56:54.675426 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.675408 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 23 14:56:54.675645 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.675630 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 23 14:56:54.675814 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.675796 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-4dt8n\"" Apr 23 14:56:54.676788 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.676770 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x9gsg" Apr 23 14:56:54.676884 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.676842 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-84b9q" Apr 23 14:56:54.676884 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:54.676850 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x9gsg" podUID="36b18c47-1676-4fda-b4e6-a7a9acee20a9" Apr 23 14:56:54.678233 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.678215 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-pwgwd\"" Apr 23 14:56:54.678418 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.678401 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 23 14:56:54.679182 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.678957 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 23 14:56:54.679447 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.679283 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 23 14:56:54.679447 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.679396 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 23 14:56:54.679789 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.679772 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 23 14:56:54.679867 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.679812 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-26p67\"" Apr 23 14:56:54.681372 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.681352 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.681469 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.681451 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wnzd" Apr 23 14:56:54.683700 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.683682 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-bvjj4" Apr 23 14:56:54.686396 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.686368 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jnxqk" Apr 23 14:56:54.689162 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.689138 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 23 14:56:54.689316 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.689296 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 23 14:56:54.689514 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.689497 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-zvrrj\"" Apr 23 14:56:54.689584 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.689545 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 23 14:56:54.689834 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.689814 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 23 14:56:54.689898 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.689853 2569 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-6f2v7\"" Apr 23 14:56:54.689977 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.689824 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 23 14:56:54.690078 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.689998 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 23 14:56:54.690159 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.689315 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 23 14:56:54.690159 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.690135 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-4qssr\"" Apr 23 14:56:54.690629 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.690400 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 23 14:56:54.690629 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.690488 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 23 14:56:54.690629 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.690616 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 23 14:56:54.690849 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.690686 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 23 14:56:54.690849 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.690745 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 23 14:56:54.690849 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.690803 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 23 14:56:54.690849 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.690816 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 23 14:56:54.691044 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.690933 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 23 14:56:54.691044 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.690995 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-pv5wb\"" Apr 23 14:56:54.691044 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.691001 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 23 14:56:54.691257 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.691170 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 23 14:56:54.691564 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.691473 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.693439 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.693417 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5ec98e0e-ddcd-4e32-9712-870091ed1acb-sys\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs" Apr 23 14:56:54.693525 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.693453 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-ovnkube-script-lib\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.693525 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.693484 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-run-openvswitch\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.693525 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.693511 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.693659 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.693594 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-env-overrides\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.693659 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.693621 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/25a96c44-032a-41ac-8ed7-c051e3666b8c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jnxqk\" (UID: \"25a96c44-032a-41ac-8ed7-c051e3666b8c\") " pod="openshift-multus/multus-additional-cni-plugins-jnxqk" Apr 23 14:56:54.693749 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.693656 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvt52\" (UniqueName: \"kubernetes.io/projected/25a96c44-032a-41ac-8ed7-c051e3666b8c-kube-api-access-zvt52\") pod \"multus-additional-cni-plugins-jnxqk\" (UID: \"25a96c44-032a-41ac-8ed7-c051e3666b8c\") " pod="openshift-multus/multus-additional-cni-plugins-jnxqk" Apr 23 14:56:54.693749 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.693685 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm4br\" (UniqueName: \"kubernetes.io/projected/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-kube-api-access-lm4br\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.693749 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.693710 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-var-lib-openvswitch\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" 
Apr 23 14:56:54.693749 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.693733 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-etc-openvswitch\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.693996 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.693758 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh5gv\" (UniqueName: \"kubernetes.io/projected/2d6354ca-7ee1-46d6-b36a-0a9af17e4cc8-kube-api-access-mh5gv\") pod \"node-ca-bvjj4\" (UID: \"2d6354ca-7ee1-46d6-b36a-0a9af17e4cc8\") " pod="openshift-image-registry/node-ca-bvjj4" Apr 23 14:56:54.693996 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.693781 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5ec98e0e-ddcd-4e32-9712-870091ed1acb-etc-systemd\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs" Apr 23 14:56:54.693996 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.693802 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36b18c47-1676-4fda-b4e6-a7a9acee20a9-metrics-certs\") pod \"network-metrics-daemon-x9gsg\" (UID: \"36b18c47-1676-4fda-b4e6-a7a9acee20a9\") " pod="openshift-multus/network-metrics-daemon-x9gsg" Apr 23 14:56:54.693996 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.693825 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/838ed924-f431-4482-a37d-5836e1570c45-konnectivity-ca\") pod 
\"konnectivity-agent-h6s52\" (UID: \"838ed924-f431-4482-a37d-5836e1570c45\") " pod="kube-system/konnectivity-agent-h6s52" Apr 23 14:56:54.693996 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.693846 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-host-cni-netd\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.693996 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.693868 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ec98e0e-ddcd-4e32-9712-870091ed1acb-host\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs" Apr 23 14:56:54.693996 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.693922 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5ec98e0e-ddcd-4e32-9712-870091ed1acb-lib-modules\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs" Apr 23 14:56:54.693996 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.693955 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5ec98e0e-ddcd-4e32-9712-870091ed1acb-tmp\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs" Apr 23 14:56:54.693996 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.693984 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9wnzd\" (UID: \"2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wnzd" Apr 23 14:56:54.694374 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.694008 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-systemd-units\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.694374 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.694029 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-run-systemd\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.694374 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.694052 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-log-socket\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.694374 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.694077 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-host-run-ovn-kubernetes\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.694374 ip-10-0-143-199 kubenswrapper[2569]: I0423 
14:56:54.694114 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-host-cni-bin\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.694374 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.694138 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/25a96c44-032a-41ac-8ed7-c051e3666b8c-cni-binary-copy\") pod \"multus-additional-cni-plugins-jnxqk\" (UID: \"25a96c44-032a-41ac-8ed7-c051e3666b8c\") " pod="openshift-multus/multus-additional-cni-plugins-jnxqk" Apr 23 14:56:54.694374 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.694159 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-host-slash\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.694374 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.694215 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/5ec98e0e-ddcd-4e32-9712-870091ed1acb-etc-tuned\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs" Apr 23 14:56:54.694374 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.694243 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f-socket-dir\") pod \"aws-ebs-csi-driver-node-9wnzd\" (UID: 
\"2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wnzd" Apr 23 14:56:54.694374 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.694268 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8849fada-dac8-4194-8755-b18e59197a97-host-slash\") pod \"iptables-alerter-84b9q\" (UID: \"8849fada-dac8-4194-8755-b18e59197a97\") " pod="openshift-network-operator/iptables-alerter-84b9q" Apr 23 14:56:54.694374 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.694295 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/25a96c44-032a-41ac-8ed7-c051e3666b8c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jnxqk\" (UID: \"25a96c44-032a-41ac-8ed7-c051e3666b8c\") " pod="openshift-multus/multus-additional-cni-plugins-jnxqk" Apr 23 14:56:54.694374 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.694318 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f-device-dir\") pod \"aws-ebs-csi-driver-node-9wnzd\" (UID: \"2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wnzd" Apr 23 14:56:54.694374 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.694345 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-host-kubelet\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.694374 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.694351 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6w94x" Apr 23 14:56:54.694374 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.694368 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-run-ovn\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.694932 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.694389 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/25a96c44-032a-41ac-8ed7-c051e3666b8c-os-release\") pod \"multus-additional-cni-plugins-jnxqk\" (UID: \"25a96c44-032a-41ac-8ed7-c051e3666b8c\") " pod="openshift-multus/multus-additional-cni-plugins-jnxqk" Apr 23 14:56:54.694932 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:54.694407 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6w94x" podUID="56883bac-9d1b-41b2-a97d-66c4b6485777"
Apr 23 14:56:54.694932 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.694429 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/25a96c44-032a-41ac-8ed7-c051e3666b8c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jnxqk\" (UID: \"25a96c44-032a-41ac-8ed7-c051e3666b8c\") " pod="openshift-multus/multus-additional-cni-plugins-jnxqk"
Apr 23 14:56:54.694932 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.694460 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5ec98e0e-ddcd-4e32-9712-870091ed1acb-etc-modprobe-d\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs"
Apr 23 14:56:54.694932 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.694485 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/838ed924-f431-4482-a37d-5836e1570c45-agent-certs\") pod \"konnectivity-agent-h6s52\" (UID: \"838ed924-f431-4482-a37d-5836e1570c45\") " pod="kube-system/konnectivity-agent-h6s52"
Apr 23 14:56:54.694932 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.694532 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2d6354ca-7ee1-46d6-b36a-0a9af17e4cc8-serviceca\") pod \"node-ca-bvjj4\" (UID: \"2d6354ca-7ee1-46d6-b36a-0a9af17e4cc8\") " pod="openshift-image-registry/node-ca-bvjj4"
Apr 23 14:56:54.694932 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.694556 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f-etc-selinux\") pod \"aws-ebs-csi-driver-node-9wnzd\" (UID: \"2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wnzd"
Apr 23 14:56:54.694932 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.694580 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-node-log\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj"
Apr 23 14:56:54.694932 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.694604 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5ec98e0e-ddcd-4e32-9712-870091ed1acb-var-lib-kubelet\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs"
Apr 23 14:56:54.694932 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.694626 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc7hh\" (UniqueName: \"kubernetes.io/projected/5ec98e0e-ddcd-4e32-9712-870091ed1acb-kube-api-access-qc7hh\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs"
Apr 23 14:56:54.694932 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.694673 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-host-run-netns\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj"
Apr 23 14:56:54.694932 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.694696 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-ovnkube-config\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj"
Apr 23 14:56:54.694932 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.694739 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/25a96c44-032a-41ac-8ed7-c051e3666b8c-cnibin\") pod \"multus-additional-cni-plugins-jnxqk\" (UID: \"25a96c44-032a-41ac-8ed7-c051e3666b8c\") " pod="openshift-multus/multus-additional-cni-plugins-jnxqk"
Apr 23 14:56:54.694932 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.694743 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 23 14:56:54.694932 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.694761 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5ec98e0e-ddcd-4e32-9712-870091ed1acb-etc-sysconfig\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs"
Apr 23 14:56:54.694932 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.694783 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/5ec98e0e-ddcd-4e32-9712-870091ed1acb-etc-sysctl-conf\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs"
Apr 23 14:56:54.694932 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.694817 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f-sys-fs\") pod \"aws-ebs-csi-driver-node-9wnzd\" (UID: \"2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wnzd"
Apr 23 14:56:54.695584 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.694887 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l27sc\" (UniqueName: \"kubernetes.io/projected/8849fada-dac8-4194-8755-b18e59197a97-kube-api-access-l27sc\") pod \"iptables-alerter-84b9q\" (UID: \"8849fada-dac8-4194-8755-b18e59197a97\") " pod="openshift-network-operator/iptables-alerter-84b9q"
Apr 23 14:56:54.695584 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.694895 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-ndglh\""
Apr 23 14:56:54.695584 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.694912 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/5ec98e0e-ddcd-4e32-9712-870091ed1acb-etc-sysctl-d\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs"
Apr 23 14:56:54.695584 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.694948 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f-registration-dir\") pod \"aws-ebs-csi-driver-node-9wnzd\" (UID: \"2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wnzd"
Apr 23 14:56:54.695584 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.694972 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8849fada-dac8-4194-8755-b18e59197a97-iptables-alerter-script\") pod \"iptables-alerter-84b9q\" (UID: \"8849fada-dac8-4194-8755-b18e59197a97\") " pod="openshift-network-operator/iptables-alerter-84b9q"
Apr 23 14:56:54.695584 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.695005 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2d6354ca-7ee1-46d6-b36a-0a9af17e4cc8-host\") pod \"node-ca-bvjj4\" (UID: \"2d6354ca-7ee1-46d6-b36a-0a9af17e4cc8\") " pod="openshift-image-registry/node-ca-bvjj4"
Apr 23 14:56:54.695584 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.695026 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5ec98e0e-ddcd-4e32-9712-870091ed1acb-run\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs"
Apr 23 14:56:54.695584 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.695049 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnxxw\" (UniqueName: \"kubernetes.io/projected/2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f-kube-api-access-bnxxw\") pod \"aws-ebs-csi-driver-node-9wnzd\" (UID: \"2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wnzd"
Apr 23 14:56:54.695584 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.695095 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ec98e0e-ddcd-4e32-9712-870091ed1acb-etc-kubernetes\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs"
Apr 23 14:56:54.695584 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.695136 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-ovn-node-metrics-cert\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj"
Apr 23 14:56:54.695584 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.695162 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/25a96c44-032a-41ac-8ed7-c051e3666b8c-system-cni-dir\") pod \"multus-additional-cni-plugins-jnxqk\" (UID: \"25a96c44-032a-41ac-8ed7-c051e3666b8c\") " pod="openshift-multus/multus-additional-cni-plugins-jnxqk"
Apr 23 14:56:54.695584 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.695177 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjm26\" (UniqueName: \"kubernetes.io/projected/36b18c47-1676-4fda-b4e6-a7a9acee20a9-kube-api-access-qjm26\") pod \"network-metrics-daemon-x9gsg\" (UID: \"36b18c47-1676-4fda-b4e6-a7a9acee20a9\") " pod="openshift-multus/network-metrics-daemon-x9gsg"
Apr 23 14:56:54.744727 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.744684 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 14:51:53 +0000 UTC" deadline="2027-12-02 20:18:09.022572212 +0000 UTC"
Apr 23 14:56:54.744727 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.744717 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14117h21m14.277858949s"
Apr 23 14:56:54.784715 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.784691 2569 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 23 14:56:54.795988 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.795962 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/25a96c44-032a-41ac-8ed7-c051e3666b8c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jnxqk\" (UID: \"25a96c44-032a-41ac-8ed7-c051e3666b8c\") " pod="openshift-multus/multus-additional-cni-plugins-jnxqk"
Apr 23 14:56:54.796152 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.795994 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f-device-dir\") pod \"aws-ebs-csi-driver-node-9wnzd\" (UID: \"2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wnzd"
Apr 23 14:56:54.796152 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.796017 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-multus-socket-dir-parent\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x"
Apr 23 14:56:54.796152 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.796036 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-host-var-lib-cni-multus\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x"
Apr 23 14:56:54.796152 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.796075 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-host-kubelet\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj"
Apr 23 14:56:54.796152 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.796114 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-run-ovn\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj"
Apr 23 14:56:54.796152 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.796117 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f-device-dir\") pod \"aws-ebs-csi-driver-node-9wnzd\" (UID: \"2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wnzd"
Apr 23 14:56:54.796412 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.796175 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-run-ovn\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj"
Apr 23 14:56:54.796412 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.796220 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-host-kubelet\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj"
Apr 23 14:56:54.796412 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.796246 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/25a96c44-032a-41ac-8ed7-c051e3666b8c-os-release\") pod \"multus-additional-cni-plugins-jnxqk\" (UID: \"25a96c44-032a-41ac-8ed7-c051e3666b8c\") " pod="openshift-multus/multus-additional-cni-plugins-jnxqk"
Apr 23 14:56:54.796412 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.796263 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/25a96c44-032a-41ac-8ed7-c051e3666b8c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jnxqk\" (UID: \"25a96c44-032a-41ac-8ed7-c051e3666b8c\") " pod="openshift-multus/multus-additional-cni-plugins-jnxqk"
Apr 23 14:56:54.796412 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.796279 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5ec98e0e-ddcd-4e32-9712-870091ed1acb-etc-modprobe-d\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs"
Apr 23 14:56:54.796412 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.796321 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/25a96c44-032a-41ac-8ed7-c051e3666b8c-os-release\") pod \"multus-additional-cni-plugins-jnxqk\" (UID: \"25a96c44-032a-41ac-8ed7-c051e3666b8c\") " pod="openshift-multus/multus-additional-cni-plugins-jnxqk"
Apr 23 14:56:54.796412 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.796326 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-system-cni-dir\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x"
Apr 23 14:56:54.796412 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.796401 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5ec98e0e-ddcd-4e32-9712-870091ed1acb-etc-modprobe-d\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs"
Apr 23 14:56:54.796725 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.796426 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b6751e3c-d6ad-40a0-9acc-87ab97b33923-cni-binary-copy\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x"
Apr 23 14:56:54.796725 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.796456 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/838ed924-f431-4482-a37d-5836e1570c45-agent-certs\") pod \"konnectivity-agent-h6s52\" (UID: \"838ed924-f431-4482-a37d-5836e1570c45\") " pod="kube-system/konnectivity-agent-h6s52"
Apr 23 14:56:54.796725 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.796520 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2d6354ca-7ee1-46d6-b36a-0a9af17e4cc8-serviceca\") pod \"node-ca-bvjj4\" (UID: \"2d6354ca-7ee1-46d6-b36a-0a9af17e4cc8\") " pod="openshift-image-registry/node-ca-bvjj4"
Apr 23 14:56:54.796725 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.796546 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f-etc-selinux\") pod \"aws-ebs-csi-driver-node-9wnzd\" (UID: \"2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wnzd"
Apr 23 14:56:54.796725 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.796564 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-host-var-lib-kubelet\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x"
Apr 23 14:56:54.796725 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.796603 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-etc-kubernetes\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x"
Apr 23 14:56:54.796725 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.796626 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-node-log\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj"
Apr 23 14:56:54.796725 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.796652 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5ec98e0e-ddcd-4e32-9712-870091ed1acb-var-lib-kubelet\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs"
Apr 23 14:56:54.796725 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.796674 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qc7hh\" (UniqueName: \"kubernetes.io/projected/5ec98e0e-ddcd-4e32-9712-870091ed1acb-kube-api-access-qc7hh\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs"
Apr 23 14:56:54.796725 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.796696 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-host-run-netns\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj"
Apr 23 14:56:54.797174 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.796740 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-ovnkube-config\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj"
Apr 23 14:56:54.797174 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.796779 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/25a96c44-032a-41ac-8ed7-c051e3666b8c-cnibin\") pod \"multus-additional-cni-plugins-jnxqk\" (UID: \"25a96c44-032a-41ac-8ed7-c051e3666b8c\") " pod="openshift-multus/multus-additional-cni-plugins-jnxqk"
Apr 23 14:56:54.797174 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.796805 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5ec98e0e-ddcd-4e32-9712-870091ed1acb-etc-sysconfig\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs"
Apr 23 14:56:54.797174 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.796828 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/5ec98e0e-ddcd-4e32-9712-870091ed1acb-etc-sysctl-conf\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs"
Apr 23 14:56:54.797174 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.796852 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f-sys-fs\") pod \"aws-ebs-csi-driver-node-9wnzd\" (UID: \"2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wnzd"
Apr 23 14:56:54.797174 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.796877 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-host-run-multus-certs\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x"
Apr 23 14:56:54.797174 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.796905 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l27sc\" (UniqueName: \"kubernetes.io/projected/8849fada-dac8-4194-8755-b18e59197a97-kube-api-access-l27sc\") pod \"iptables-alerter-84b9q\" (UID: \"8849fada-dac8-4194-8755-b18e59197a97\") " pod="openshift-network-operator/iptables-alerter-84b9q"
Apr 23 14:56:54.797174 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.796903 2569 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 23 14:56:54.797174 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.796930 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/5ec98e0e-ddcd-4e32-9712-870091ed1acb-etc-sysctl-d\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs"
Apr 23 14:56:54.797174 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.796953 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f-registration-dir\") pod \"aws-ebs-csi-driver-node-9wnzd\" (UID: \"2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wnzd"
Apr 23 14:56:54.797174 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.796979 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-host-var-lib-cni-bin\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x"
Apr 23 14:56:54.797174 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.797000 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2d6354ca-7ee1-46d6-b36a-0a9af17e4cc8-serviceca\") pod \"node-ca-bvjj4\" (UID: \"2d6354ca-7ee1-46d6-b36a-0a9af17e4cc8\") " pod="openshift-image-registry/node-ca-bvjj4"
Apr 23 14:56:54.797174 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.797023 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8849fada-dac8-4194-8755-b18e59197a97-iptables-alerter-script\") pod \"iptables-alerter-84b9q\" (UID: \"8849fada-dac8-4194-8755-b18e59197a97\") " pod="openshift-network-operator/iptables-alerter-84b9q"
Apr 23 14:56:54.797174 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.797050 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2d6354ca-7ee1-46d6-b36a-0a9af17e4cc8-host\") pod \"node-ca-bvjj4\" (UID: \"2d6354ca-7ee1-46d6-b36a-0a9af17e4cc8\") " pod="openshift-image-registry/node-ca-bvjj4"
Apr 23 14:56:54.797174 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.797073 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5ec98e0e-ddcd-4e32-9712-870091ed1acb-run\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs"
Apr 23 14:56:54.797174 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.797112 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bnxxw\" (UniqueName: \"kubernetes.io/projected/2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f-kube-api-access-bnxxw\") pod \"aws-ebs-csi-driver-node-9wnzd\" (UID: \"2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wnzd"
Apr 23 14:56:54.797174 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.797119 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/25a96c44-032a-41ac-8ed7-c051e3666b8c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jnxqk\" (UID: \"25a96c44-032a-41ac-8ed7-c051e3666b8c\") " pod="openshift-multus/multus-additional-cni-plugins-jnxqk"
Apr 23 14:56:54.797865 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.797139 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-host-run-netns\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x"
Apr 23 14:56:54.797865 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.797157 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f-sys-fs\") pod \"aws-ebs-csi-driver-node-9wnzd\" (UID: \"2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wnzd"
Apr 23 14:56:54.797865 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.797172 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-hostroot\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x"
Apr 23 14:56:54.797865 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.797198 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-multus-conf-dir\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x"
Apr 23 14:56:54.797865 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.797256 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ec98e0e-ddcd-4e32-9712-870091ed1acb-etc-kubernetes\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs"
Apr 23 14:56:54.797865 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.797284 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-ovn-node-metrics-cert\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj"
Apr 23 14:56:54.797865 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.797309 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/25a96c44-032a-41ac-8ed7-c051e3666b8c-system-cni-dir\") pod \"multus-additional-cni-plugins-jnxqk\" (UID: \"25a96c44-032a-41ac-8ed7-c051e3666b8c\") " pod="openshift-multus/multus-additional-cni-plugins-jnxqk"
Apr 23 14:56:54.797865 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.797334 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjm26\" (UniqueName: \"kubernetes.io/projected/36b18c47-1676-4fda-b4e6-a7a9acee20a9-kube-api-access-qjm26\") pod \"network-metrics-daemon-x9gsg\" (UID: \"36b18c47-1676-4fda-b4e6-a7a9acee20a9\") " pod="openshift-multus/network-metrics-daemon-x9gsg"
Apr 23 14:56:54.797865 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.797363 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/25a96c44-032a-41ac-8ed7-c051e3666b8c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jnxqk\" (UID: \"25a96c44-032a-41ac-8ed7-c051e3666b8c\") " pod="openshift-multus/multus-additional-cni-plugins-jnxqk"
Apr 23 14:56:54.797865 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.797365 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcbhb\" (UniqueName: \"kubernetes.io/projected/b6751e3c-d6ad-40a0-9acc-87ab97b33923-kube-api-access-tcbhb\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x"
Apr 23 14:56:54.797865 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.797414 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5ec98e0e-ddcd-4e32-9712-870091ed1acb-sys\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs"
Apr 23 14:56:54.797865 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.797366 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5ec98e0e-ddcd-4e32-9712-870091ed1acb-var-lib-kubelet\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs"
Apr 23 14:56:54.797865 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.797475 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/25a96c44-032a-41ac-8ed7-c051e3666b8c-system-cni-dir\") pod \"multus-additional-cni-plugins-jnxqk\" (UID: \"25a96c44-032a-41ac-8ed7-c051e3666b8c\") " pod="openshift-multus/multus-additional-cni-plugins-jnxqk"
Apr 23 14:56:54.797865 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.797506 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5ec98e0e-ddcd-4e32-9712-870091ed1acb-etc-sysconfig\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs"
Apr 23 14:56:54.797865 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.797720 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/25a96c44-032a-41ac-8ed7-c051e3666b8c-cnibin\") pod \"multus-additional-cni-plugins-jnxqk\" (UID: \"25a96c44-032a-41ac-8ed7-c051e3666b8c\") " pod="openshift-multus/multus-additional-cni-plugins-jnxqk"
Apr 23 14:56:54.797865 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.797617 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/5ec98e0e-ddcd-4e32-9712-870091ed1acb-etc-sysctl-conf\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs"
Apr 23 14:56:54.797865 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.797632 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8849fada-dac8-4194-8755-b18e59197a97-iptables-alerter-script\") pod \"iptables-alerter-84b9q\" (UID: \"8849fada-dac8-4194-8755-b18e59197a97\") " pod="openshift-network-operator/iptables-alerter-84b9q"
Apr 23 14:56:54.798692 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.797653 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-host-run-netns\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj"
Apr 23 14:56:54.798692 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.797686 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/5ec98e0e-ddcd-4e32-9712-870091ed1acb-etc-sysctl-d\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs"
Apr 23 14:56:54.798692 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.797689 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f-registration-dir\") pod \"aws-ebs-csi-driver-node-9wnzd\" (UID: \"2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wnzd"
Apr 23 14:56:54.798692 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.797595 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-node-log\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj"
Apr 23 14:56:54.798692 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.797772 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2d6354ca-7ee1-46d6-b36a-0a9af17e4cc8-host\") pod \"node-ca-bvjj4\" (UID: \"2d6354ca-7ee1-46d6-b36a-0a9af17e4cc8\") " pod="openshift-image-registry/node-ca-bvjj4"
Apr 23 14:56:54.798692 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.797796 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ec98e0e-ddcd-4e32-9712-870091ed1acb-etc-kubernetes\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs"
Apr 23 14:56:54.798692 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.797812 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-ovnkube-script-lib\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj"
Apr 23 14:56:54.798692 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.797849 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-run-openvswitch\") pod \"ovnkube-node-wgqdj\" (UID:
\"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.798692 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.797885 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-run-openvswitch\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.798692 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.797891 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.798692 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.797932 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-env-overrides\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.798692 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.797945 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5ec98e0e-ddcd-4e32-9712-870091ed1acb-sys\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs" Apr 23 14:56:54.798692 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.797969 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/25a96c44-032a-41ac-8ed7-c051e3666b8c-tuning-conf-dir\") 
pod \"multus-additional-cni-plugins-jnxqk\" (UID: \"25a96c44-032a-41ac-8ed7-c051e3666b8c\") " pod="openshift-multus/multus-additional-cni-plugins-jnxqk" Apr 23 14:56:54.798692 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.798092 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/25a96c44-032a-41ac-8ed7-c051e3666b8c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jnxqk\" (UID: \"25a96c44-032a-41ac-8ed7-c051e3666b8c\") " pod="openshift-multus/multus-additional-cni-plugins-jnxqk" Apr 23 14:56:54.798692 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.798154 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.798692 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.798192 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvt52\" (UniqueName: \"kubernetes.io/projected/25a96c44-032a-41ac-8ed7-c051e3666b8c-kube-api-access-zvt52\") pod \"multus-additional-cni-plugins-jnxqk\" (UID: \"25a96c44-032a-41ac-8ed7-c051e3666b8c\") " pod="openshift-multus/multus-additional-cni-plugins-jnxqk" Apr 23 14:56:54.798692 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.798208 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-ovnkube-config\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.799624 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.798233 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-multus-cni-dir\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.799624 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.798348 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lm4br\" (UniqueName: \"kubernetes.io/projected/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-kube-api-access-lm4br\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.799624 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.798371 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-env-overrides\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.799624 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.798409 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-os-release\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.799624 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.798448 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-host-run-k8s-cni-cncf-io\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.799624 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.798450 
2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-ovnkube-script-lib\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.799624 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.798479 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b6751e3c-d6ad-40a0-9acc-87ab97b33923-multus-daemon-config\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.799624 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.798474 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5ec98e0e-ddcd-4e32-9712-870091ed1acb-run\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs" Apr 23 14:56:54.799624 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.798524 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f-etc-selinux\") pod \"aws-ebs-csi-driver-node-9wnzd\" (UID: \"2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wnzd" Apr 23 14:56:54.799624 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.798618 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-var-lib-openvswitch\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.799624 ip-10-0-143-199 kubenswrapper[2569]: 
I0423 14:56:54.798649 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-etc-openvswitch\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.799624 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.798674 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mh5gv\" (UniqueName: \"kubernetes.io/projected/2d6354ca-7ee1-46d6-b36a-0a9af17e4cc8-kube-api-access-mh5gv\") pod \"node-ca-bvjj4\" (UID: \"2d6354ca-7ee1-46d6-b36a-0a9af17e4cc8\") " pod="openshift-image-registry/node-ca-bvjj4" Apr 23 14:56:54.799624 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.798727 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5ec98e0e-ddcd-4e32-9712-870091ed1acb-etc-systemd\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs" Apr 23 14:56:54.799624 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.798734 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-var-lib-openvswitch\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.799624 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.798750 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-etc-openvswitch\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.799624 ip-10-0-143-199 
kubenswrapper[2569]: I0423 14:56:54.798752 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36b18c47-1676-4fda-b4e6-a7a9acee20a9-metrics-certs\") pod \"network-metrics-daemon-x9gsg\" (UID: \"36b18c47-1676-4fda-b4e6-a7a9acee20a9\") " pod="openshift-multus/network-metrics-daemon-x9gsg" Apr 23 14:56:54.799624 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.798795 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5ec98e0e-ddcd-4e32-9712-870091ed1acb-etc-systemd\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs" Apr 23 14:56:54.800392 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.798801 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-cnibin\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.800392 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.798833 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/838ed924-f431-4482-a37d-5836e1570c45-konnectivity-ca\") pod \"konnectivity-agent-h6s52\" (UID: \"838ed924-f431-4482-a37d-5836e1570c45\") " pod="kube-system/konnectivity-agent-h6s52" Apr 23 14:56:54.800392 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.798859 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-host-cni-netd\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.800392 ip-10-0-143-199 
kubenswrapper[2569]: I0423 14:56:54.798882 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ec98e0e-ddcd-4e32-9712-870091ed1acb-host\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs" Apr 23 14:56:54.800392 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.798926 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-host-cni-netd\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.800392 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.798933 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ec98e0e-ddcd-4e32-9712-870091ed1acb-host\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs" Apr 23 14:56:54.800392 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.798963 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5ec98e0e-ddcd-4e32-9712-870091ed1acb-lib-modules\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs" Apr 23 14:56:54.800392 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.799012 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5ec98e0e-ddcd-4e32-9712-870091ed1acb-tmp\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs" Apr 23 14:56:54.800392 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.799039 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9wnzd\" (UID: \"2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wnzd" Apr 23 14:56:54.800392 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.799063 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-systemd-units\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.800392 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.799088 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-run-systemd\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.800392 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.799096 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5ec98e0e-ddcd-4e32-9712-870091ed1acb-lib-modules\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs" Apr 23 14:56:54.800392 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.799130 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-log-socket\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.800392 ip-10-0-143-199 kubenswrapper[2569]: I0423 
14:56:54.799156 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-host-run-ovn-kubernetes\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.800392 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.799166 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-run-systemd\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.800392 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.799179 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-host-cni-bin\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.800392 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.799208 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/25a96c44-032a-41ac-8ed7-c051e3666b8c-cni-binary-copy\") pod \"multus-additional-cni-plugins-jnxqk\" (UID: \"25a96c44-032a-41ac-8ed7-c051e3666b8c\") " pod="openshift-multus/multus-additional-cni-plugins-jnxqk" Apr 23 14:56:54.801203 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.799241 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt5kh\" (UniqueName: \"kubernetes.io/projected/56883bac-9d1b-41b2-a97d-66c4b6485777-kube-api-access-lt5kh\") pod \"network-check-target-6w94x\" (UID: \"56883bac-9d1b-41b2-a97d-66c4b6485777\") " 
pod="openshift-network-diagnostics/network-check-target-6w94x" Apr 23 14:56:54.801203 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.799253 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-log-socket\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.801203 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.799210 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-systemd-units\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.801203 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.799268 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-host-slash\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.801203 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.799305 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-host-cni-bin\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.801203 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.799318 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/5ec98e0e-ddcd-4e32-9712-870091ed1acb-etc-tuned\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " 
pod="openshift-cluster-node-tuning-operator/tuned-7bjrs" Apr 23 14:56:54.801203 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:54.799328 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 14:56:54.801203 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.799342 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f-socket-dir\") pod \"aws-ebs-csi-driver-node-9wnzd\" (UID: \"2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wnzd" Apr 23 14:56:54.801203 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.799370 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8849fada-dac8-4194-8755-b18e59197a97-host-slash\") pod \"iptables-alerter-84b9q\" (UID: \"8849fada-dac8-4194-8755-b18e59197a97\") " pod="openshift-network-operator/iptables-alerter-84b9q" Apr 23 14:56:54.801203 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:54.799402 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36b18c47-1676-4fda-b4e6-a7a9acee20a9-metrics-certs podName:36b18c47-1676-4fda-b4e6-a7a9acee20a9 nodeName:}" failed. No retries permitted until 2026-04-23 14:56:55.29937922 +0000 UTC m=+3.092732384 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/36b18c47-1676-4fda-b4e6-a7a9acee20a9-metrics-certs") pod "network-metrics-daemon-x9gsg" (UID: "36b18c47-1676-4fda-b4e6-a7a9acee20a9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 14:56:54.801203 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.799447 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/838ed924-f431-4482-a37d-5836e1570c45-konnectivity-ca\") pod \"konnectivity-agent-h6s52\" (UID: \"838ed924-f431-4482-a37d-5836e1570c45\") " pod="kube-system/konnectivity-agent-h6s52" Apr 23 14:56:54.801203 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.799451 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8849fada-dac8-4194-8755-b18e59197a97-host-slash\") pod \"iptables-alerter-84b9q\" (UID: \"8849fada-dac8-4194-8755-b18e59197a97\") " pod="openshift-network-operator/iptables-alerter-84b9q" Apr 23 14:56:54.801203 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.799479 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-host-run-ovn-kubernetes\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.801203 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.799531 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-host-slash\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.801203 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.799593 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f-socket-dir\") pod \"aws-ebs-csi-driver-node-9wnzd\" (UID: \"2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wnzd" Apr 23 14:56:54.801203 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.799578 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9wnzd\" (UID: \"2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wnzd" Apr 23 14:56:54.801203 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.799766 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/25a96c44-032a-41ac-8ed7-c051e3666b8c-cni-binary-copy\") pod \"multus-additional-cni-plugins-jnxqk\" (UID: \"25a96c44-032a-41ac-8ed7-c051e3666b8c\") " pod="openshift-multus/multus-additional-cni-plugins-jnxqk" Apr 23 14:56:54.802000 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.801712 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/838ed924-f431-4482-a37d-5836e1570c45-agent-certs\") pod \"konnectivity-agent-h6s52\" (UID: \"838ed924-f431-4482-a37d-5836e1570c45\") " pod="kube-system/konnectivity-agent-h6s52" Apr 23 14:56:54.802000 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.801834 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-ovn-node-metrics-cert\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:54.802000 ip-10-0-143-199 
kubenswrapper[2569]: I0423 14:56:54.801936 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5ec98e0e-ddcd-4e32-9712-870091ed1acb-tmp\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs" Apr 23 14:56:54.804569 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.804359 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/5ec98e0e-ddcd-4e32-9712-870091ed1acb-etc-tuned\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs" Apr 23 14:56:54.807010 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.806990 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l27sc\" (UniqueName: \"kubernetes.io/projected/8849fada-dac8-4194-8755-b18e59197a97-kube-api-access-l27sc\") pod \"iptables-alerter-84b9q\" (UID: \"8849fada-dac8-4194-8755-b18e59197a97\") " pod="openshift-network-operator/iptables-alerter-84b9q" Apr 23 14:56:54.807258 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.807242 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnxxw\" (UniqueName: \"kubernetes.io/projected/2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f-kube-api-access-bnxxw\") pod \"aws-ebs-csi-driver-node-9wnzd\" (UID: \"2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wnzd" Apr 23 14:56:54.808222 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.808196 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm4br\" (UniqueName: \"kubernetes.io/projected/a2c6ef5a-00fe-42a6-a297-8b67bf27ea78-kube-api-access-lm4br\") pod \"ovnkube-node-wgqdj\" (UID: \"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78\") " pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 
14:56:54.811752 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.811730 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjm26\" (UniqueName: \"kubernetes.io/projected/36b18c47-1676-4fda-b4e6-a7a9acee20a9-kube-api-access-qjm26\") pod \"network-metrics-daemon-x9gsg\" (UID: \"36b18c47-1676-4fda-b4e6-a7a9acee20a9\") " pod="openshift-multus/network-metrics-daemon-x9gsg" Apr 23 14:56:54.812027 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.812000 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh5gv\" (UniqueName: \"kubernetes.io/projected/2d6354ca-7ee1-46d6-b36a-0a9af17e4cc8-kube-api-access-mh5gv\") pod \"node-ca-bvjj4\" (UID: \"2d6354ca-7ee1-46d6-b36a-0a9af17e4cc8\") " pod="openshift-image-registry/node-ca-bvjj4" Apr 23 14:56:54.812196 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.812177 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc7hh\" (UniqueName: \"kubernetes.io/projected/5ec98e0e-ddcd-4e32-9712-870091ed1acb-kube-api-access-qc7hh\") pod \"tuned-7bjrs\" (UID: \"5ec98e0e-ddcd-4e32-9712-870091ed1acb\") " pod="openshift-cluster-node-tuning-operator/tuned-7bjrs" Apr 23 14:56:54.812373 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.812358 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvt52\" (UniqueName: \"kubernetes.io/projected/25a96c44-032a-41ac-8ed7-c051e3666b8c-kube-api-access-zvt52\") pod \"multus-additional-cni-plugins-jnxqk\" (UID: \"25a96c44-032a-41ac-8ed7-c051e3666b8c\") " pod="openshift-multus/multus-additional-cni-plugins-jnxqk" Apr 23 14:56:54.900510 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.900474 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-os-release\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " 
pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.900683 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.900520 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-host-run-k8s-cni-cncf-io\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.900683 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.900571 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-host-run-k8s-cni-cncf-io\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.900683 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.900607 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b6751e3c-d6ad-40a0-9acc-87ab97b33923-multus-daemon-config\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.900683 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.900648 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-cnibin\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.900683 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.900682 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lt5kh\" (UniqueName: \"kubernetes.io/projected/56883bac-9d1b-41b2-a97d-66c4b6485777-kube-api-access-lt5kh\") pod \"network-check-target-6w94x\" (UID: \"56883bac-9d1b-41b2-a97d-66c4b6485777\") " 
pod="openshift-network-diagnostics/network-check-target-6w94x" Apr 23 14:56:54.900912 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.900685 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-os-release\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.900912 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.900713 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-multus-socket-dir-parent\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.900912 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.900735 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-cnibin\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.900912 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.900738 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-host-var-lib-cni-multus\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.900912 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.900782 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-host-var-lib-cni-multus\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 
14:56:54.900912 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.900784 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-system-cni-dir\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.900912 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.900809 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-multus-socket-dir-parent\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.900912 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.900817 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b6751e3c-d6ad-40a0-9acc-87ab97b33923-cni-binary-copy\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.900912 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.900819 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-system-cni-dir\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.900912 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.900864 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-host-var-lib-kubelet\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.900912 ip-10-0-143-199 kubenswrapper[2569]: I0423 
14:56:54.900893 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-host-var-lib-kubelet\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.901419 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.900924 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-etc-kubernetes\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.901419 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.900964 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-host-run-multus-certs\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.901419 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.900987 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-host-var-lib-cni-bin\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.901419 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.900997 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-etc-kubernetes\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.901419 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.901011 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-host-run-netns\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.901419 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.901016 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-host-run-multus-certs\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.901419 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.901044 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-host-var-lib-cni-bin\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.901419 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.901048 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-host-run-netns\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.901419 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.901072 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-hostroot\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.901419 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.901112 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-multus-conf-dir\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.901419 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.901139 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tcbhb\" (UniqueName: \"kubernetes.io/projected/b6751e3c-d6ad-40a0-9acc-87ab97b33923-kube-api-access-tcbhb\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.901419 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.901142 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-hostroot\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.901419 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.901149 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-multus-conf-dir\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.901419 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.901166 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-multus-cni-dir\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.901419 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.901242 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b6751e3c-d6ad-40a0-9acc-87ab97b33923-multus-cni-dir\") pod \"multus-s4q5x\" 
(UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.901419 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.901288 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b6751e3c-d6ad-40a0-9acc-87ab97b33923-multus-daemon-config\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.901419 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.901303 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b6751e3c-d6ad-40a0-9acc-87ab97b33923-cni-binary-copy\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.902023 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.901656 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 14:56:54.907321 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:54.907299 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 14:56:54.907321 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:54.907325 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 14:56:54.907478 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:54.907338 2569 projected.go:194] Error preparing data for projected volume kube-api-access-lt5kh for pod openshift-network-diagnostics/network-check-target-6w94x: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 14:56:54.907478 ip-10-0-143-199 
kubenswrapper[2569]: E0423 14:56:54.907398 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/56883bac-9d1b-41b2-a97d-66c4b6485777-kube-api-access-lt5kh podName:56883bac-9d1b-41b2-a97d-66c4b6485777 nodeName:}" failed. No retries permitted until 2026-04-23 14:56:55.407383491 +0000 UTC m=+3.200736650 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-lt5kh" (UniqueName: "kubernetes.io/projected/56883bac-9d1b-41b2-a97d-66c4b6485777-kube-api-access-lt5kh") pod "network-check-target-6w94x" (UID: "56883bac-9d1b-41b2-a97d-66c4b6485777") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 14:56:54.910195 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.910118 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcbhb\" (UniqueName: \"kubernetes.io/projected/b6751e3c-d6ad-40a0-9acc-87ab97b33923-kube-api-access-tcbhb\") pod \"multus-s4q5x\" (UID: \"b6751e3c-d6ad-40a0-9acc-87ab97b33923\") " pod="openshift-multus/multus-s4q5x" Apr 23 14:56:54.982600 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.982562 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-h6s52" Apr 23 14:56:54.989521 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.989498 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7bjrs" Apr 23 14:56:54.999202 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:54.999185 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-84b9q" Apr 23 14:56:55.004888 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:55.004868 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:56:55.010454 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:55.010438 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wnzd" Apr 23 14:56:55.017923 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:55.017897 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-bvjj4" Apr 23 14:56:55.023448 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:55.023426 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jnxqk" Apr 23 14:56:55.028950 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:55.028930 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-s4q5x" Apr 23 14:56:55.300776 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:55.300744 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6751e3c_d6ad_40a0_9acc_87ab97b33923.slice/crio-2185d095f45f979791e891e0c4a27dc7f0d46eef2d162ea2935198123417bec4 WatchSource:0}: Error finding container 2185d095f45f979791e891e0c4a27dc7f0d46eef2d162ea2935198123417bec4: Status 404 returned error can't find the container with id 2185d095f45f979791e891e0c4a27dc7f0d46eef2d162ea2935198123417bec4 Apr 23 14:56:55.303473 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:55.303443 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8849fada_dac8_4194_8755_b18e59197a97.slice/crio-4cefb5af2b87132af75400f45869dbdddc545effd5fbb48f540fa32dc4e54ac1 WatchSource:0}: Error finding container 4cefb5af2b87132af75400f45869dbdddc545effd5fbb48f540fa32dc4e54ac1: Status 404 returned error can't find the container with id 
4cefb5af2b87132af75400f45869dbdddc545effd5fbb48f540fa32dc4e54ac1 Apr 23 14:56:55.303943 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:55.303914 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36b18c47-1676-4fda-b4e6-a7a9acee20a9-metrics-certs\") pod \"network-metrics-daemon-x9gsg\" (UID: \"36b18c47-1676-4fda-b4e6-a7a9acee20a9\") " pod="openshift-multus/network-metrics-daemon-x9gsg" Apr 23 14:56:55.304066 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:55.304021 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 14:56:55.304126 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:55.304073 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36b18c47-1676-4fda-b4e6-a7a9acee20a9-metrics-certs podName:36b18c47-1676-4fda-b4e6-a7a9acee20a9 nodeName:}" failed. No retries permitted until 2026-04-23 14:56:56.304049912 +0000 UTC m=+4.097403059 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/36b18c47-1676-4fda-b4e6-a7a9acee20a9-metrics-certs") pod "network-metrics-daemon-x9gsg" (UID: "36b18c47-1676-4fda-b4e6-a7a9acee20a9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 14:56:55.305679 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:55.305654 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ec98e0e_ddcd_4e32_9712_870091ed1acb.slice/crio-42373b1a9d34eaa69cbdb94156addffe95b6b14a5268f1bfaaaafaa999de2fe0 WatchSource:0}: Error finding container 42373b1a9d34eaa69cbdb94156addffe95b6b14a5268f1bfaaaafaa999de2fe0: Status 404 returned error can't find the container with id 42373b1a9d34eaa69cbdb94156addffe95b6b14a5268f1bfaaaafaa999de2fe0 Apr 23 14:56:55.307680 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:55.307560 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2c6ef5a_00fe_42a6_a297_8b67bf27ea78.slice/crio-e0e1efb00ac88b71f7b92ebc9ab7cd1570d11ce57de65178f251fbee78711fd0 WatchSource:0}: Error finding container e0e1efb00ac88b71f7b92ebc9ab7cd1570d11ce57de65178f251fbee78711fd0: Status 404 returned error can't find the container with id e0e1efb00ac88b71f7b92ebc9ab7cd1570d11ce57de65178f251fbee78711fd0 Apr 23 14:56:55.308722 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:55.308703 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e7dfccb_2c1b_4a01_a837_596ab4bc2e8f.slice/crio-a987dd23a825e45c129877e7e2648513c56e3b1b7bbc58048eb179ee2577b00d WatchSource:0}: Error finding container a987dd23a825e45c129877e7e2648513c56e3b1b7bbc58048eb179ee2577b00d: Status 404 returned error can't find the container with id a987dd23a825e45c129877e7e2648513c56e3b1b7bbc58048eb179ee2577b00d Apr 23 14:56:55.309832 
ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:55.309814 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod838ed924_f431_4482_a37d_5836e1570c45.slice/crio-8bcc08d67c76aa17d0f3ef16b1aa75ec52b513823a5fc7f233165cadc7b6303b WatchSource:0}: Error finding container 8bcc08d67c76aa17d0f3ef16b1aa75ec52b513823a5fc7f233165cadc7b6303b: Status 404 returned error can't find the container with id 8bcc08d67c76aa17d0f3ef16b1aa75ec52b513823a5fc7f233165cadc7b6303b Apr 23 14:56:55.331778 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:55.331748 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25a96c44_032a_41ac_8ed7_c051e3666b8c.slice/crio-77907fe316cdc84cda6e14b86b5119f38f017c5304a24a6b1409acfaf4928076 WatchSource:0}: Error finding container 77907fe316cdc84cda6e14b86b5119f38f017c5304a24a6b1409acfaf4928076: Status 404 returned error can't find the container with id 77907fe316cdc84cda6e14b86b5119f38f017c5304a24a6b1409acfaf4928076 Apr 23 14:56:55.332511 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:56:55.332493 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d6354ca_7ee1_46d6_b36a_0a9af17e4cc8.slice/crio-d54a4da9df1f2a3755fcdb9017ede6dbc0221a8fb88395824528ce409bf1c515 WatchSource:0}: Error finding container d54a4da9df1f2a3755fcdb9017ede6dbc0221a8fb88395824528ce409bf1c515: Status 404 returned error can't find the container with id d54a4da9df1f2a3755fcdb9017ede6dbc0221a8fb88395824528ce409bf1c515 Apr 23 14:56:55.506275 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:55.506239 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lt5kh\" (UniqueName: \"kubernetes.io/projected/56883bac-9d1b-41b2-a97d-66c4b6485777-kube-api-access-lt5kh\") pod \"network-check-target-6w94x\" (UID: 
\"56883bac-9d1b-41b2-a97d-66c4b6485777\") " pod="openshift-network-diagnostics/network-check-target-6w94x" Apr 23 14:56:55.506472 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:55.506454 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 14:56:55.506525 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:55.506483 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 14:56:55.506525 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:55.506497 2569 projected.go:194] Error preparing data for projected volume kube-api-access-lt5kh for pod openshift-network-diagnostics/network-check-target-6w94x: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 14:56:55.506588 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:55.506578 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/56883bac-9d1b-41b2-a97d-66c4b6485777-kube-api-access-lt5kh podName:56883bac-9d1b-41b2-a97d-66c4b6485777 nodeName:}" failed. No retries permitted until 2026-04-23 14:56:56.506555588 +0000 UTC m=+4.299908748 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-lt5kh" (UniqueName: "kubernetes.io/projected/56883bac-9d1b-41b2-a97d-66c4b6485777-kube-api-access-lt5kh") pod "network-check-target-6w94x" (UID: "56883bac-9d1b-41b2-a97d-66c4b6485777") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 14:56:55.745572 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:55.745483 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 14:51:53 +0000 UTC" deadline="2027-10-11 17:54:16.92808477 +0000 UTC" Apr 23 14:56:55.745572 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:55.745520 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12866h57m21.182568699s" Apr 23 14:56:55.771979 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:55.771902 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wnzd" event={"ID":"2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f","Type":"ContainerStarted","Data":"a987dd23a825e45c129877e7e2648513c56e3b1b7bbc58048eb179ee2577b00d"} Apr 23 14:56:55.777015 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:55.776971 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7bjrs" event={"ID":"5ec98e0e-ddcd-4e32-9712-870091ed1acb","Type":"ContainerStarted","Data":"42373b1a9d34eaa69cbdb94156addffe95b6b14a5268f1bfaaaafaa999de2fe0"} Apr 23 14:56:55.782452 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:55.782415 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-84b9q" event={"ID":"8849fada-dac8-4194-8755-b18e59197a97","Type":"ContainerStarted","Data":"4cefb5af2b87132af75400f45869dbdddc545effd5fbb48f540fa32dc4e54ac1"} Apr 23 14:56:55.785961 
ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:55.785908 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s4q5x" event={"ID":"b6751e3c-d6ad-40a0-9acc-87ab97b33923","Type":"ContainerStarted","Data":"2185d095f45f979791e891e0c4a27dc7f0d46eef2d162ea2935198123417bec4"} Apr 23 14:56:55.791828 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:55.791790 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-199.ec2.internal" event={"ID":"bff84cbb0dcf4f847d21e0c35b203751","Type":"ContainerStarted","Data":"771fbfa0bbb71ff7cf7624b106347d46fdc6c24379d7d2a773ae09267f5f3809"} Apr 23 14:56:55.793971 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:55.793922 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bvjj4" event={"ID":"2d6354ca-7ee1-46d6-b36a-0a9af17e4cc8","Type":"ContainerStarted","Data":"d54a4da9df1f2a3755fcdb9017ede6dbc0221a8fb88395824528ce409bf1c515"} Apr 23 14:56:55.798878 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:55.798839 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jnxqk" event={"ID":"25a96c44-032a-41ac-8ed7-c051e3666b8c","Type":"ContainerStarted","Data":"77907fe316cdc84cda6e14b86b5119f38f017c5304a24a6b1409acfaf4928076"} Apr 23 14:56:55.802856 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:55.802810 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-h6s52" event={"ID":"838ed924-f431-4482-a37d-5836e1570c45","Type":"ContainerStarted","Data":"8bcc08d67c76aa17d0f3ef16b1aa75ec52b513823a5fc7f233165cadc7b6303b"} Apr 23 14:56:55.805579 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:55.805536 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" 
event={"ID":"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78","Type":"ContainerStarted","Data":"e0e1efb00ac88b71f7b92ebc9ab7cd1570d11ce57de65178f251fbee78711fd0"} Apr 23 14:56:56.315120 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:56.314993 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36b18c47-1676-4fda-b4e6-a7a9acee20a9-metrics-certs\") pod \"network-metrics-daemon-x9gsg\" (UID: \"36b18c47-1676-4fda-b4e6-a7a9acee20a9\") " pod="openshift-multus/network-metrics-daemon-x9gsg" Apr 23 14:56:56.315290 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:56.315135 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 14:56:56.315290 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:56.315231 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36b18c47-1676-4fda-b4e6-a7a9acee20a9-metrics-certs podName:36b18c47-1676-4fda-b4e6-a7a9acee20a9 nodeName:}" failed. No retries permitted until 2026-04-23 14:56:58.315209842 +0000 UTC m=+6.108563010 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/36b18c47-1676-4fda-b4e6-a7a9acee20a9-metrics-certs") pod "network-metrics-daemon-x9gsg" (UID: "36b18c47-1676-4fda-b4e6-a7a9acee20a9") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 14:56:56.517027 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:56.516986 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lt5kh\" (UniqueName: \"kubernetes.io/projected/56883bac-9d1b-41b2-a97d-66c4b6485777-kube-api-access-lt5kh\") pod \"network-check-target-6w94x\" (UID: \"56883bac-9d1b-41b2-a97d-66c4b6485777\") " pod="openshift-network-diagnostics/network-check-target-6w94x"
Apr 23 14:56:56.517225 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:56.517207 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 14:56:56.517287 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:56.517227 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 14:56:56.517287 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:56.517239 2569 projected.go:194] Error preparing data for projected volume kube-api-access-lt5kh for pod openshift-network-diagnostics/network-check-target-6w94x: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 14:56:56.517379 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:56.517297 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/56883bac-9d1b-41b2-a97d-66c4b6485777-kube-api-access-lt5kh podName:56883bac-9d1b-41b2-a97d-66c4b6485777 nodeName:}" failed. No retries permitted until 2026-04-23 14:56:58.517277459 +0000 UTC m=+6.310630610 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-lt5kh" (UniqueName: "kubernetes.io/projected/56883bac-9d1b-41b2-a97d-66c4b6485777-kube-api-access-lt5kh") pod "network-check-target-6w94x" (UID: "56883bac-9d1b-41b2-a97d-66c4b6485777") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 14:56:56.764321 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:56.764250 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6w94x"
Apr 23 14:56:56.764721 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:56.764382 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6w94x" podUID="56883bac-9d1b-41b2-a97d-66c4b6485777"
Apr 23 14:56:56.764721 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:56.764438 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x9gsg"
Apr 23 14:56:56.764721 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:56.764619 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x9gsg" podUID="36b18c47-1676-4fda-b4e6-a7a9acee20a9"
Apr 23 14:56:56.813018 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:56.812979 2569 generic.go:358] "Generic (PLEG): container finished" podID="1dd5991561291eb60cf7712bf981919b" containerID="fb0b501ffee1159b947d96f67992e0744a61b6e4c839b7f917d5039c2f26223c" exitCode=0
Apr 23 14:56:56.813205 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:56.813153 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-199.ec2.internal" event={"ID":"1dd5991561291eb60cf7712bf981919b","Type":"ContainerDied","Data":"fb0b501ffee1159b947d96f67992e0744a61b6e4c839b7f917d5039c2f26223c"}
Apr 23 14:56:56.831420 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:56.831369 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-199.ec2.internal" podStartSLOduration=2.831349352 podStartE2EDuration="2.831349352s" podCreationTimestamp="2026-04-23 14:56:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:56:55.808091962 +0000 UTC m=+3.601445132" watchObservedRunningTime="2026-04-23 14:56:56.831349352 +0000 UTC m=+4.624702520"
Apr 23 14:56:57.819146 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:57.818449 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-199.ec2.internal" event={"ID":"1dd5991561291eb60cf7712bf981919b","Type":"ContainerStarted","Data":"9d1e53bf9918216bff4686d6b8e6be949d33dab8b3943fafb78ece6742f4d349"}
Apr 23 14:56:58.329930 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:58.329896 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36b18c47-1676-4fda-b4e6-a7a9acee20a9-metrics-certs\") pod \"network-metrics-daemon-x9gsg\" (UID: \"36b18c47-1676-4fda-b4e6-a7a9acee20a9\") " pod="openshift-multus/network-metrics-daemon-x9gsg"
Apr 23 14:56:58.330133 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:58.330052 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 14:56:58.330133 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:58.330127 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36b18c47-1676-4fda-b4e6-a7a9acee20a9-metrics-certs podName:36b18c47-1676-4fda-b4e6-a7a9acee20a9 nodeName:}" failed. No retries permitted until 2026-04-23 14:57:02.330092942 +0000 UTC m=+10.123446099 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/36b18c47-1676-4fda-b4e6-a7a9acee20a9-metrics-certs") pod "network-metrics-daemon-x9gsg" (UID: "36b18c47-1676-4fda-b4e6-a7a9acee20a9") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 14:56:58.531298 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:58.531265 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lt5kh\" (UniqueName: \"kubernetes.io/projected/56883bac-9d1b-41b2-a97d-66c4b6485777-kube-api-access-lt5kh\") pod \"network-check-target-6w94x\" (UID: \"56883bac-9d1b-41b2-a97d-66c4b6485777\") " pod="openshift-network-diagnostics/network-check-target-6w94x"
Apr 23 14:56:58.531475 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:58.531423 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 14:56:58.531475 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:58.531442 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 14:56:58.531475 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:58.531454 2569 projected.go:194] Error preparing data for projected volume kube-api-access-lt5kh for pod openshift-network-diagnostics/network-check-target-6w94x: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 14:56:58.531606 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:58.531513 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/56883bac-9d1b-41b2-a97d-66c4b6485777-kube-api-access-lt5kh podName:56883bac-9d1b-41b2-a97d-66c4b6485777 nodeName:}" failed. No retries permitted until 2026-04-23 14:57:02.531495361 +0000 UTC m=+10.324848505 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-lt5kh" (UniqueName: "kubernetes.io/projected/56883bac-9d1b-41b2-a97d-66c4b6485777-kube-api-access-lt5kh") pod "network-check-target-6w94x" (UID: "56883bac-9d1b-41b2-a97d-66c4b6485777") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 14:56:58.765749 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:58.763771 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6w94x"
Apr 23 14:56:58.765749 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:58.763903 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6w94x" podUID="56883bac-9d1b-41b2-a97d-66c4b6485777"
Apr 23 14:56:58.765749 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:56:58.763953 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x9gsg"
Apr 23 14:56:58.765749 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:56:58.764074 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x9gsg" podUID="36b18c47-1676-4fda-b4e6-a7a9acee20a9"
Apr 23 14:57:00.762683 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:00.762651 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x9gsg"
Apr 23 14:57:00.763136 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:00.762694 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6w94x"
Apr 23 14:57:00.763136 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:00.762793 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x9gsg" podUID="36b18c47-1676-4fda-b4e6-a7a9acee20a9"
Apr 23 14:57:00.763270 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:00.763246 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6w94x" podUID="56883bac-9d1b-41b2-a97d-66c4b6485777"
Apr 23 14:57:02.361541 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:02.361501 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36b18c47-1676-4fda-b4e6-a7a9acee20a9-metrics-certs\") pod \"network-metrics-daemon-x9gsg\" (UID: \"36b18c47-1676-4fda-b4e6-a7a9acee20a9\") " pod="openshift-multus/network-metrics-daemon-x9gsg"
Apr 23 14:57:02.361993 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:02.361640 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 14:57:02.361993 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:02.361712 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36b18c47-1676-4fda-b4e6-a7a9acee20a9-metrics-certs podName:36b18c47-1676-4fda-b4e6-a7a9acee20a9 nodeName:}" failed. No retries permitted until 2026-04-23 14:57:10.361692063 +0000 UTC m=+18.155045210 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/36b18c47-1676-4fda-b4e6-a7a9acee20a9-metrics-certs") pod "network-metrics-daemon-x9gsg" (UID: "36b18c47-1676-4fda-b4e6-a7a9acee20a9") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 14:57:02.563676 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:02.563579 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lt5kh\" (UniqueName: \"kubernetes.io/projected/56883bac-9d1b-41b2-a97d-66c4b6485777-kube-api-access-lt5kh\") pod \"network-check-target-6w94x\" (UID: \"56883bac-9d1b-41b2-a97d-66c4b6485777\") " pod="openshift-network-diagnostics/network-check-target-6w94x"
Apr 23 14:57:02.563862 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:02.563772 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 14:57:02.563862 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:02.563791 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 14:57:02.563862 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:02.563804 2569 projected.go:194] Error preparing data for projected volume kube-api-access-lt5kh for pod openshift-network-diagnostics/network-check-target-6w94x: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 14:57:02.563862 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:02.563862 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/56883bac-9d1b-41b2-a97d-66c4b6485777-kube-api-access-lt5kh podName:56883bac-9d1b-41b2-a97d-66c4b6485777 nodeName:}" failed. No retries permitted until 2026-04-23 14:57:10.563843499 +0000 UTC m=+18.357196656 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-lt5kh" (UniqueName: "kubernetes.io/projected/56883bac-9d1b-41b2-a97d-66c4b6485777-kube-api-access-lt5kh") pod "network-check-target-6w94x" (UID: "56883bac-9d1b-41b2-a97d-66c4b6485777") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 14:57:02.763459 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:02.763377 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6w94x"
Apr 23 14:57:02.763459 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:02.763425 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x9gsg"
Apr 23 14:57:02.763632 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:02.763512 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6w94x" podUID="56883bac-9d1b-41b2-a97d-66c4b6485777"
Apr 23 14:57:02.763665 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:02.763645 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x9gsg" podUID="36b18c47-1676-4fda-b4e6-a7a9acee20a9"
Apr 23 14:57:04.762543 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:04.762502 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6w94x"
Apr 23 14:57:04.763006 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:04.762554 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x9gsg"
Apr 23 14:57:04.763006 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:04.762614 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6w94x" podUID="56883bac-9d1b-41b2-a97d-66c4b6485777"
Apr 23 14:57:04.763006 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:04.762773 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x9gsg" podUID="36b18c47-1676-4fda-b4e6-a7a9acee20a9"
Apr 23 14:57:06.763255 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:06.763176 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6w94x"
Apr 23 14:57:06.763711 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:06.763176 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x9gsg"
Apr 23 14:57:06.763711 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:06.763310 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6w94x" podUID="56883bac-9d1b-41b2-a97d-66c4b6485777"
Apr 23 14:57:06.763711 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:06.763365 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x9gsg" podUID="36b18c47-1676-4fda-b4e6-a7a9acee20a9"
Apr 23 14:57:08.762899 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:08.762860 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x9gsg"
Apr 23 14:57:08.763386 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:08.762861 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6w94x"
Apr 23 14:57:08.763386 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:08.763005 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x9gsg" podUID="36b18c47-1676-4fda-b4e6-a7a9acee20a9"
Apr 23 14:57:08.763386 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:08.763084 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6w94x" podUID="56883bac-9d1b-41b2-a97d-66c4b6485777"
Apr 23 14:57:10.415432 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:10.415392 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36b18c47-1676-4fda-b4e6-a7a9acee20a9-metrics-certs\") pod \"network-metrics-daemon-x9gsg\" (UID: \"36b18c47-1676-4fda-b4e6-a7a9acee20a9\") " pod="openshift-multus/network-metrics-daemon-x9gsg"
Apr 23 14:57:10.415933 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:10.415568 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 14:57:10.415933 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:10.415647 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36b18c47-1676-4fda-b4e6-a7a9acee20a9-metrics-certs podName:36b18c47-1676-4fda-b4e6-a7a9acee20a9 nodeName:}" failed. No retries permitted until 2026-04-23 14:57:26.415624961 +0000 UTC m=+34.208978128 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/36b18c47-1676-4fda-b4e6-a7a9acee20a9-metrics-certs") pod "network-metrics-daemon-x9gsg" (UID: "36b18c47-1676-4fda-b4e6-a7a9acee20a9") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 14:57:10.616483 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:10.616446 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lt5kh\" (UniqueName: \"kubernetes.io/projected/56883bac-9d1b-41b2-a97d-66c4b6485777-kube-api-access-lt5kh\") pod \"network-check-target-6w94x\" (UID: \"56883bac-9d1b-41b2-a97d-66c4b6485777\") " pod="openshift-network-diagnostics/network-check-target-6w94x"
Apr 23 14:57:10.616667 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:10.616572 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 14:57:10.616667 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:10.616591 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 14:57:10.616667 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:10.616605 2569 projected.go:194] Error preparing data for projected volume kube-api-access-lt5kh for pod openshift-network-diagnostics/network-check-target-6w94x: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 14:57:10.616667 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:10.616665 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/56883bac-9d1b-41b2-a97d-66c4b6485777-kube-api-access-lt5kh podName:56883bac-9d1b-41b2-a97d-66c4b6485777 nodeName:}" failed. No retries permitted until 2026-04-23 14:57:26.616649806 +0000 UTC m=+34.410002949 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-lt5kh" (UniqueName: "kubernetes.io/projected/56883bac-9d1b-41b2-a97d-66c4b6485777-kube-api-access-lt5kh") pod "network-check-target-6w94x" (UID: "56883bac-9d1b-41b2-a97d-66c4b6485777") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 14:57:10.763225 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:10.763143 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6w94x"
Apr 23 14:57:10.763380 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:10.763159 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x9gsg"
Apr 23 14:57:10.763380 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:10.763282 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6w94x" podUID="56883bac-9d1b-41b2-a97d-66c4b6485777"
Apr 23 14:57:10.763480 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:10.763372 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x9gsg" podUID="36b18c47-1676-4fda-b4e6-a7a9acee20a9"
Apr 23 14:57:11.842857 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:11.842826 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-h6s52" event={"ID":"838ed924-f431-4482-a37d-5836e1570c45","Type":"ContainerStarted","Data":"03c1b219a6b1e24f617cf7d796bc289b6973cf42abbb7982b6b03caed468e3a0"}
Apr 23 14:57:11.846211 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:11.846140 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7bjrs" event={"ID":"5ec98e0e-ddcd-4e32-9712-870091ed1acb","Type":"ContainerStarted","Data":"68a0c8b171b1a3bed4c3f88a7004df0e17bf03ae6cc77a3ffb01eadb30cd6657"}
Apr 23 14:57:11.848840 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:11.848813 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s4q5x" event={"ID":"b6751e3c-d6ad-40a0-9acc-87ab97b33923","Type":"ContainerStarted","Data":"906a1e517c6bbdff6fe8071ac2ea31f9ea040b3cbc0f24f44e314c11c3167fe8"}
Apr 23 14:57:11.850389 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:11.850127 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bvjj4" event={"ID":"2d6354ca-7ee1-46d6-b36a-0a9af17e4cc8","Type":"ContainerStarted","Data":"47a7c461430d2bffa897b76cd870ce306717d7501d379a9b32d673d8bd65a3e3"}
Apr 23 14:57:11.857899 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:11.857618 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-199.ec2.internal" podStartSLOduration=17.857602699 podStartE2EDuration="17.857602699s" podCreationTimestamp="2026-04-23 14:56:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:56:57.839030252 +0000 UTC m=+5.632383462" watchObservedRunningTime="2026-04-23 14:57:11.857602699 +0000 UTC m=+19.650955865"
Apr 23 14:57:11.874010 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:11.873947 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-h6s52" podStartSLOduration=3.6297024589999998 podStartE2EDuration="19.873923173s" podCreationTimestamp="2026-04-23 14:56:52 +0000 UTC" firstStartedPulling="2026-04-23 14:56:55.330688578 +0000 UTC m=+3.124041722" lastFinishedPulling="2026-04-23 14:57:11.574909288 +0000 UTC m=+19.368262436" observedRunningTime="2026-04-23 14:57:11.857464511 +0000 UTC m=+19.650817676" watchObservedRunningTime="2026-04-23 14:57:11.873923173 +0000 UTC m=+19.667276340"
Apr 23 14:57:11.892488 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:11.892442 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-7bjrs" podStartSLOduration=3.594701895 podStartE2EDuration="19.892427377s" podCreationTimestamp="2026-04-23 14:56:52 +0000 UTC" firstStartedPulling="2026-04-23 14:56:55.307837004 +0000 UTC m=+3.101190153" lastFinishedPulling="2026-04-23 14:57:11.605562476 +0000 UTC m=+19.398915635" observedRunningTime="2026-04-23 14:57:11.875692838 +0000 UTC m=+19.669046002" watchObservedRunningTime="2026-04-23 14:57:11.892427377 +0000 UTC m=+19.685780553"
Apr 23 14:57:11.892694 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:11.892672 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-s4q5x" podStartSLOduration=2.586950569 podStartE2EDuration="18.892668135s" podCreationTimestamp="2026-04-23 14:56:53 +0000 UTC" firstStartedPulling="2026-04-23 14:56:55.304313653 +0000 UTC m=+3.097666798" lastFinishedPulling="2026-04-23 14:57:11.610031208 +0000 UTC m=+19.403384364" observedRunningTime="2026-04-23 14:57:11.892269556 +0000 UTC m=+19.685622722" watchObservedRunningTime="2026-04-23 14:57:11.892668135 +0000 UTC m=+19.686021298"
Apr 23 14:57:11.906289 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:11.906252 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-bvjj4" podStartSLOduration=2.66605066 podStartE2EDuration="18.906239917s" podCreationTimestamp="2026-04-23 14:56:53 +0000 UTC" firstStartedPulling="2026-04-23 14:56:55.334714519 +0000 UTC m=+3.128067664" lastFinishedPulling="2026-04-23 14:57:11.574903765 +0000 UTC m=+19.368256921" observedRunningTime="2026-04-23 14:57:11.90613017 +0000 UTC m=+19.699483337" watchObservedRunningTime="2026-04-23 14:57:11.906239917 +0000 UTC m=+19.699593080"
Apr 23 14:57:12.203245 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:12.202986 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-4665j"]
Apr 23 14:57:12.205756 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:12.205741 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4665j"
Apr 23 14:57:12.208155 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:12.208132 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 23 14:57:12.208421 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:12.208158 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-m88kl\""
Apr 23 14:57:12.208421 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:12.208132 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 23 14:57:12.327917 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:12.327884 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d27qs\" (UniqueName: \"kubernetes.io/projected/f947ddb6-797b-4afb-a2cf-6c8c70291f6d-kube-api-access-d27qs\") pod \"node-resolver-4665j\" (UID: \"f947ddb6-797b-4afb-a2cf-6c8c70291f6d\") " pod="openshift-dns/node-resolver-4665j"
Apr 23 14:57:12.327917 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:12.327929 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f947ddb6-797b-4afb-a2cf-6c8c70291f6d-tmp-dir\") pod \"node-resolver-4665j\" (UID: \"f947ddb6-797b-4afb-a2cf-6c8c70291f6d\") " pod="openshift-dns/node-resolver-4665j"
Apr 23 14:57:12.328134 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:12.327966 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f947ddb6-797b-4afb-a2cf-6c8c70291f6d-hosts-file\") pod \"node-resolver-4665j\" (UID: \"f947ddb6-797b-4afb-a2cf-6c8c70291f6d\") " pod="openshift-dns/node-resolver-4665j"
Apr 23 14:57:12.428908 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:12.428868 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f947ddb6-797b-4afb-a2cf-6c8c70291f6d-hosts-file\") pod \"node-resolver-4665j\" (UID: \"f947ddb6-797b-4afb-a2cf-6c8c70291f6d\") " pod="openshift-dns/node-resolver-4665j"
Apr 23 14:57:12.429062 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:12.428983 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f947ddb6-797b-4afb-a2cf-6c8c70291f6d-hosts-file\") pod \"node-resolver-4665j\" (UID: \"f947ddb6-797b-4afb-a2cf-6c8c70291f6d\") " pod="openshift-dns/node-resolver-4665j"
Apr 23 14:57:12.429062 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:12.428985 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d27qs\" (UniqueName: \"kubernetes.io/projected/f947ddb6-797b-4afb-a2cf-6c8c70291f6d-kube-api-access-d27qs\") pod \"node-resolver-4665j\" (UID: \"f947ddb6-797b-4afb-a2cf-6c8c70291f6d\") " pod="openshift-dns/node-resolver-4665j"
Apr 23 14:57:12.429062 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:12.429038 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f947ddb6-797b-4afb-a2cf-6c8c70291f6d-tmp-dir\") pod \"node-resolver-4665j\" (UID: \"f947ddb6-797b-4afb-a2cf-6c8c70291f6d\") " pod="openshift-dns/node-resolver-4665j"
Apr 23 14:57:12.429308 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:12.429295 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f947ddb6-797b-4afb-a2cf-6c8c70291f6d-tmp-dir\") pod \"node-resolver-4665j\" (UID: \"f947ddb6-797b-4afb-a2cf-6c8c70291f6d\") " pod="openshift-dns/node-resolver-4665j"
Apr 23 14:57:12.440128 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:12.440087 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d27qs\" (UniqueName: \"kubernetes.io/projected/f947ddb6-797b-4afb-a2cf-6c8c70291f6d-kube-api-access-d27qs\") pod \"node-resolver-4665j\" (UID: \"f947ddb6-797b-4afb-a2cf-6c8c70291f6d\") " pod="openshift-dns/node-resolver-4665j"
Apr 23 14:57:12.514268 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:12.514178 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4665j"
Apr 23 14:57:12.580236 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:57:12.580206 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf947ddb6_797b_4afb_a2cf_6c8c70291f6d.slice/crio-ed519139926264a62d5d6a94c0c48daa2c4847d58d846fe241dc75e4282f79e8 WatchSource:0}: Error finding container ed519139926264a62d5d6a94c0c48daa2c4847d58d846fe241dc75e4282f79e8: Status 404 returned error can't find the container with id ed519139926264a62d5d6a94c0c48daa2c4847d58d846fe241dc75e4282f79e8
Apr 23 14:57:12.740516 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:12.740490 2569 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 23 14:57:12.763013 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:12.762911 2569 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T14:57:12.740512554Z","UUID":"ec744a35-2803-4b50-952b-e0849c94084a","Handler":null,"Name":"","Endpoint":""}
Apr 23 14:57:12.763544 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:12.763525 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6w94x"
Apr 23 14:57:12.763621 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:12.763599 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6w94x" podUID="56883bac-9d1b-41b2-a97d-66c4b6485777"
Apr 23 14:57:12.763621 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:12.763606 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x9gsg"
Apr 23 14:57:12.763864 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:12.763730 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x9gsg" podUID="36b18c47-1676-4fda-b4e6-a7a9acee20a9"
Apr 23 14:57:12.766023 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:12.765972 2569 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 23 14:57:12.766023 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:12.765997 2569 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 23 14:57:12.853062 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:12.853027 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wnzd" event={"ID":"2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f","Type":"ContainerStarted","Data":"4251c4263f5dc5ffc13351a60bbe02754b91c61a2096ba544e341b8c6dcbe36d"}
Apr 23 14:57:12.853062 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:12.853063 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wnzd" event={"ID":"2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f","Type":"ContainerStarted","Data":"71f1ba0b8dadf71fdba42be4756e2fcf4c6d9769f3b7c65afe13b11f329d09a0"}
Apr 23 14:57:12.855618 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:12.854407 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4665j" event={"ID":"f947ddb6-797b-4afb-a2cf-6c8c70291f6d","Type":"ContainerStarted","Data":"df963eca3ec43f363bac6abc32d90215ebf0542b568c4c425fd9aa658ecb3dba"}
Apr 23 14:57:12.855618 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:12.854440 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4665j" event={"ID":"f947ddb6-797b-4afb-a2cf-6c8c70291f6d","Type":"ContainerStarted","Data":"ed519139926264a62d5d6a94c0c48daa2c4847d58d846fe241dc75e4282f79e8"}
Apr 23 14:57:12.856817 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:12.856796 2569 generic.go:358] "Generic (PLEG): container finished" podID="25a96c44-032a-41ac-8ed7-c051e3666b8c" containerID="217754b307c73bcbc211ed4af57f413db1c0693e08a7557db64ae221f475c15b" exitCode=0
Apr 23 14:57:12.856918 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:12.856880 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jnxqk" event={"ID":"25a96c44-032a-41ac-8ed7-c051e3666b8c","Type":"ContainerDied","Data":"217754b307c73bcbc211ed4af57f413db1c0693e08a7557db64ae221f475c15b"}
Apr 23 14:57:12.859824 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:12.859801 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wgqdj_a2c6ef5a-00fe-42a6-a297-8b67bf27ea78/ovn-acl-logging/0.log"
Apr 23 14:57:12.862724 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:12.862701 2569 generic.go:358] "Generic (PLEG): container finished" podID="a2c6ef5a-00fe-42a6-a297-8b67bf27ea78" containerID="5e7a4d8e2526277f7c46d6b55f08e8cf0e3e9a5a1335fa80e5a599406584dce5" exitCode=1
Apr 23 14:57:12.862817 ip-10-0-143-199
kubenswrapper[2569]: I0423 14:57:12.862729 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" event={"ID":"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78","Type":"ContainerStarted","Data":"ca3df444e6367f3f754c4195e747684a5e02b96d427ab528063da113dc74bd9a"} Apr 23 14:57:12.862817 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:12.862757 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" event={"ID":"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78","Type":"ContainerStarted","Data":"13090d36e9215dd6b31c2adfef3ff76b7b5da42349f64a0f00f9934023f4ba1a"} Apr 23 14:57:12.862817 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:12.862770 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" event={"ID":"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78","Type":"ContainerStarted","Data":"cb367165e3b819f693516c56242dae22c872d09a1cd9a40eb029acb580d44200"} Apr 23 14:57:12.862817 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:12.862782 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" event={"ID":"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78","Type":"ContainerStarted","Data":"3894d21407320903b407466cc328968783c08461abd10a5efd3b4ac7017bb5e8"} Apr 23 14:57:12.862817 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:12.862794 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" event={"ID":"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78","Type":"ContainerStarted","Data":"1a44c942a4bb1097f6a438884a15a6f4e569945a589c2456464177ff5fb40c87"} Apr 23 14:57:12.862817 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:12.862805 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" event={"ID":"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78","Type":"ContainerDied","Data":"5e7a4d8e2526277f7c46d6b55f08e8cf0e3e9a5a1335fa80e5a599406584dce5"} Apr 23 
14:57:12.869872 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:12.869839 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-4665j" podStartSLOduration=0.869829273 podStartE2EDuration="869.829273ms" podCreationTimestamp="2026-04-23 14:57:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:57:12.869295982 +0000 UTC m=+20.662649146" watchObservedRunningTime="2026-04-23 14:57:12.869829273 +0000 UTC m=+20.663182437" Apr 23 14:57:13.866430 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:13.866344 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wnzd" event={"ID":"2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f","Type":"ContainerStarted","Data":"f27f490d322d0f73ea4fd1d48b11eb08d9888ab30f8d8d8facbdc02c32fb3039"} Apr 23 14:57:13.867563 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:13.867538 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-84b9q" event={"ID":"8849fada-dac8-4194-8755-b18e59197a97","Type":"ContainerStarted","Data":"9660a3318d8cd411acae7b3467e5099b9aabd3f92bf25fe00ab56aafb7f58826"} Apr 23 14:57:13.882171 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:13.882126 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9wnzd" podStartSLOduration=2.716081199 podStartE2EDuration="20.882095487s" podCreationTimestamp="2026-04-23 14:56:53 +0000 UTC" firstStartedPulling="2026-04-23 14:56:55.330735 +0000 UTC m=+3.124088144" lastFinishedPulling="2026-04-23 14:57:13.496749275 +0000 UTC m=+21.290102432" observedRunningTime="2026-04-23 14:57:13.882006082 +0000 UTC m=+21.675359247" watchObservedRunningTime="2026-04-23 14:57:13.882095487 +0000 UTC m=+21.675448652" Apr 23 14:57:13.898094 ip-10-0-143-199 kubenswrapper[2569]: I0423 
14:57:13.898059 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-84b9q" podStartSLOduration=4.601737038 podStartE2EDuration="20.898050825s" podCreationTimestamp="2026-04-23 14:56:53 +0000 UTC" firstStartedPulling="2026-04-23 14:56:55.30607254 +0000 UTC m=+3.099425690" lastFinishedPulling="2026-04-23 14:57:11.602386321 +0000 UTC m=+19.395739477" observedRunningTime="2026-04-23 14:57:13.897889456 +0000 UTC m=+21.691242620" watchObservedRunningTime="2026-04-23 14:57:13.898050825 +0000 UTC m=+21.691403989" Apr 23 14:57:14.762742 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:14.762707 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6w94x" Apr 23 14:57:14.762931 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:14.762722 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x9gsg" Apr 23 14:57:14.762931 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:14.762823 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6w94x" podUID="56883bac-9d1b-41b2-a97d-66c4b6485777" Apr 23 14:57:14.762931 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:14.762902 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x9gsg" podUID="36b18c47-1676-4fda-b4e6-a7a9acee20a9" Apr 23 14:57:14.872283 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:14.872254 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wgqdj_a2c6ef5a-00fe-42a6-a297-8b67bf27ea78/ovn-acl-logging/0.log" Apr 23 14:57:14.872803 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:14.872598 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" event={"ID":"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78","Type":"ContainerStarted","Data":"141fa8784e3f0cb3348064407b3020dbc48151638183e30d6a745dd5c229cde7"} Apr 23 14:57:15.191323 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:15.191241 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-h6s52" Apr 23 14:57:15.191842 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:15.191825 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-h6s52" Apr 23 14:57:16.763017 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:16.762930 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6w94x" Apr 23 14:57:16.763017 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:16.762987 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x9gsg" Apr 23 14:57:16.763554 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:16.763074 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6w94x" podUID="56883bac-9d1b-41b2-a97d-66c4b6485777" Apr 23 14:57:16.763554 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:16.763200 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x9gsg" podUID="36b18c47-1676-4fda-b4e6-a7a9acee20a9" Apr 23 14:57:17.216511 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:17.216325 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-whs7j"] Apr 23 14:57:17.252928 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:17.252907 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-whs7j" Apr 23 14:57:17.253033 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:17.252975 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-whs7j" podUID="dce2b511-080e-42a7-a345-5e616c565b84" Apr 23 14:57:17.364542 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:17.364509 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/dce2b511-080e-42a7-a345-5e616c565b84-dbus\") pod \"global-pull-secret-syncer-whs7j\" (UID: \"dce2b511-080e-42a7-a345-5e616c565b84\") " pod="kube-system/global-pull-secret-syncer-whs7j" Apr 23 14:57:17.364705 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:17.364560 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/dce2b511-080e-42a7-a345-5e616c565b84-kubelet-config\") pod \"global-pull-secret-syncer-whs7j\" (UID: \"dce2b511-080e-42a7-a345-5e616c565b84\") " pod="kube-system/global-pull-secret-syncer-whs7j" Apr 23 14:57:17.364705 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:17.364576 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dce2b511-080e-42a7-a345-5e616c565b84-original-pull-secret\") pod \"global-pull-secret-syncer-whs7j\" (UID: \"dce2b511-080e-42a7-a345-5e616c565b84\") " pod="kube-system/global-pull-secret-syncer-whs7j" Apr 23 14:57:17.465681 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:17.465653 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/dce2b511-080e-42a7-a345-5e616c565b84-dbus\") pod \"global-pull-secret-syncer-whs7j\" (UID: \"dce2b511-080e-42a7-a345-5e616c565b84\") " pod="kube-system/global-pull-secret-syncer-whs7j" Apr 23 14:57:17.465822 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:17.465709 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/dce2b511-080e-42a7-a345-5e616c565b84-kubelet-config\") pod \"global-pull-secret-syncer-whs7j\" (UID: \"dce2b511-080e-42a7-a345-5e616c565b84\") " pod="kube-system/global-pull-secret-syncer-whs7j" Apr 23 14:57:17.465822 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:17.465731 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dce2b511-080e-42a7-a345-5e616c565b84-original-pull-secret\") pod \"global-pull-secret-syncer-whs7j\" (UID: \"dce2b511-080e-42a7-a345-5e616c565b84\") " pod="kube-system/global-pull-secret-syncer-whs7j" Apr 23 14:57:17.465917 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:17.465842 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 14:57:17.465917 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:17.465844 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/dce2b511-080e-42a7-a345-5e616c565b84-kubelet-config\") pod \"global-pull-secret-syncer-whs7j\" (UID: \"dce2b511-080e-42a7-a345-5e616c565b84\") " pod="kube-system/global-pull-secret-syncer-whs7j" Apr 23 14:57:17.465917 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:17.465900 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dce2b511-080e-42a7-a345-5e616c565b84-original-pull-secret podName:dce2b511-080e-42a7-a345-5e616c565b84 nodeName:}" failed. No retries permitted until 2026-04-23 14:57:17.965882145 +0000 UTC m=+25.759235308 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dce2b511-080e-42a7-a345-5e616c565b84-original-pull-secret") pod "global-pull-secret-syncer-whs7j" (UID: "dce2b511-080e-42a7-a345-5e616c565b84") : object "kube-system"/"original-pull-secret" not registered Apr 23 14:57:17.465917 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:17.465898 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/dce2b511-080e-42a7-a345-5e616c565b84-dbus\") pod \"global-pull-secret-syncer-whs7j\" (UID: \"dce2b511-080e-42a7-a345-5e616c565b84\") " pod="kube-system/global-pull-secret-syncer-whs7j" Apr 23 14:57:17.879299 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:17.879267 2569 generic.go:358] "Generic (PLEG): container finished" podID="25a96c44-032a-41ac-8ed7-c051e3666b8c" containerID="56fd21ef8adb9527c430a101fdf97ebc36f24703ecb43b8cde17e8e50ba578d0" exitCode=0 Apr 23 14:57:17.879714 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:17.879342 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jnxqk" event={"ID":"25a96c44-032a-41ac-8ed7-c051e3666b8c","Type":"ContainerDied","Data":"56fd21ef8adb9527c430a101fdf97ebc36f24703ecb43b8cde17e8e50ba578d0"} Apr 23 14:57:17.882056 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:17.882038 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wgqdj_a2c6ef5a-00fe-42a6-a297-8b67bf27ea78/ovn-acl-logging/0.log" Apr 23 14:57:17.882359 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:17.882337 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" event={"ID":"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78","Type":"ContainerStarted","Data":"2e31e88a9c45795321c5887e90dd0d19be204316b6cf9bdbfb0605dcac35117e"} Apr 23 14:57:17.882655 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:17.882635 2569 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:57:17.882751 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:17.882668 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:57:17.882870 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:17.882850 2569 scope.go:117] "RemoveContainer" containerID="5e7a4d8e2526277f7c46d6b55f08e8cf0e3e9a5a1335fa80e5a599406584dce5" Apr 23 14:57:17.898513 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:17.898494 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:57:17.970938 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:17.970904 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dce2b511-080e-42a7-a345-5e616c565b84-original-pull-secret\") pod \"global-pull-secret-syncer-whs7j\" (UID: \"dce2b511-080e-42a7-a345-5e616c565b84\") " pod="kube-system/global-pull-secret-syncer-whs7j" Apr 23 14:57:17.971130 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:17.971090 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 14:57:17.971198 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:17.971187 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dce2b511-080e-42a7-a345-5e616c565b84-original-pull-secret podName:dce2b511-080e-42a7-a345-5e616c565b84 nodeName:}" failed. No retries permitted until 2026-04-23 14:57:18.971167084 +0000 UTC m=+26.764520237 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dce2b511-080e-42a7-a345-5e616c565b84-original-pull-secret") pod "global-pull-secret-syncer-whs7j" (UID: "dce2b511-080e-42a7-a345-5e616c565b84") : object "kube-system"/"original-pull-secret" not registered Apr 23 14:57:18.763272 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:18.763076 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-whs7j" Apr 23 14:57:18.763411 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:18.763115 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x9gsg" Apr 23 14:57:18.763411 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:18.763382 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-whs7j" podUID="dce2b511-080e-42a7-a345-5e616c565b84" Apr 23 14:57:18.763522 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:18.763132 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6w94x" Apr 23 14:57:18.763522 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:18.763431 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x9gsg" podUID="36b18c47-1676-4fda-b4e6-a7a9acee20a9" Apr 23 14:57:18.763594 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:18.763518 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6w94x" podUID="56883bac-9d1b-41b2-a97d-66c4b6485777" Apr 23 14:57:18.887789 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:18.887758 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wgqdj_a2c6ef5a-00fe-42a6-a297-8b67bf27ea78/ovn-acl-logging/0.log" Apr 23 14:57:18.888231 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:18.888092 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" event={"ID":"a2c6ef5a-00fe-42a6-a297-8b67bf27ea78","Type":"ContainerStarted","Data":"4ea803440fea755ded667f9db278fe8bbd7f9d2bf5d4c47a70b993d248360e9b"} Apr 23 14:57:18.888498 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:18.888473 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:57:18.902843 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:18.902823 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:57:18.920071 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:18.920031 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" podStartSLOduration=9.584777734 podStartE2EDuration="25.920016872s" podCreationTimestamp="2026-04-23 14:56:53 +0000 UTC" firstStartedPulling="2026-04-23 14:56:55.309438308 +0000 UTC m=+3.102791457" 
lastFinishedPulling="2026-04-23 14:57:11.644677433 +0000 UTC m=+19.438030595" observedRunningTime="2026-04-23 14:57:18.919475153 +0000 UTC m=+26.712828345" watchObservedRunningTime="2026-04-23 14:57:18.920016872 +0000 UTC m=+26.713370037" Apr 23 14:57:18.979216 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:18.979180 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dce2b511-080e-42a7-a345-5e616c565b84-original-pull-secret\") pod \"global-pull-secret-syncer-whs7j\" (UID: \"dce2b511-080e-42a7-a345-5e616c565b84\") " pod="kube-system/global-pull-secret-syncer-whs7j" Apr 23 14:57:18.979365 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:18.979341 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 14:57:18.979441 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:18.979425 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dce2b511-080e-42a7-a345-5e616c565b84-original-pull-secret podName:dce2b511-080e-42a7-a345-5e616c565b84 nodeName:}" failed. No retries permitted until 2026-04-23 14:57:20.979406613 +0000 UTC m=+28.772759776 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dce2b511-080e-42a7-a345-5e616c565b84-original-pull-secret") pod "global-pull-secret-syncer-whs7j" (UID: "dce2b511-080e-42a7-a345-5e616c565b84") : object "kube-system"/"original-pull-secret" not registered Apr 23 14:57:19.165575 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:19.165543 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-x9gsg"] Apr 23 14:57:19.165725 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:19.165635 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-x9gsg" Apr 23 14:57:19.165770 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:19.165719 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x9gsg" podUID="36b18c47-1676-4fda-b4e6-a7a9acee20a9" Apr 23 14:57:19.172292 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:19.172248 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-6w94x"] Apr 23 14:57:19.172414 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:19.172353 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6w94x" Apr 23 14:57:19.172483 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:19.172454 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6w94x" podUID="56883bac-9d1b-41b2-a97d-66c4b6485777" Apr 23 14:57:19.172539 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:19.172496 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-h6s52" Apr 23 14:57:19.172594 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:19.172579 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 23 14:57:19.172842 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:19.172817 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-whs7j"] Apr 23 14:57:19.172942 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:19.172884 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-whs7j" Apr 23 14:57:19.172992 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:19.172969 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-whs7j" podUID="dce2b511-080e-42a7-a345-5e616c565b84"
Apr 23 14:57:19.173037 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:19.172995 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-h6s52"
Apr 23 14:57:19.891459 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:19.891428 2569 generic.go:358] "Generic (PLEG): container finished" podID="25a96c44-032a-41ac-8ed7-c051e3666b8c" containerID="92275073e6895e07844c57cd10d5d4576f3345578505c5051597051ae1c63f1f" exitCode=0
Apr 23 14:57:19.891811 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:19.891510 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jnxqk" event={"ID":"25a96c44-032a-41ac-8ed7-c051e3666b8c","Type":"ContainerDied","Data":"92275073e6895e07844c57cd10d5d4576f3345578505c5051597051ae1c63f1f"}
Apr 23 14:57:20.763543 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:20.763470 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6w94x"
Apr 23 14:57:20.763735 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:20.763579 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x9gsg"
Apr 23 14:57:20.763735 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:20.763638 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6w94x" podUID="56883bac-9d1b-41b2-a97d-66c4b6485777"
Apr 23 14:57:20.763838 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:20.763755 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-whs7j"
Apr 23 14:57:20.763886 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:20.763840 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-whs7j" podUID="dce2b511-080e-42a7-a345-5e616c565b84"
Apr 23 14:57:20.763991 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:20.763895 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x9gsg" podUID="36b18c47-1676-4fda-b4e6-a7a9acee20a9"
Apr 23 14:57:20.997564 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:20.997528 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dce2b511-080e-42a7-a345-5e616c565b84-original-pull-secret\") pod \"global-pull-secret-syncer-whs7j\" (UID: \"dce2b511-080e-42a7-a345-5e616c565b84\") " pod="kube-system/global-pull-secret-syncer-whs7j"
Apr 23 14:57:20.998119 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:20.997760 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 14:57:20.998119 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:20.997828 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dce2b511-080e-42a7-a345-5e616c565b84-original-pull-secret podName:dce2b511-080e-42a7-a345-5e616c565b84 nodeName:}" failed. No retries permitted until 2026-04-23 14:57:24.997808213 +0000 UTC m=+32.791161565 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/dce2b511-080e-42a7-a345-5e616c565b84-original-pull-secret") pod "global-pull-secret-syncer-whs7j" (UID: "dce2b511-080e-42a7-a345-5e616c565b84") : object "kube-system"/"original-pull-secret" not registered
Apr 23 14:57:21.897313 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:21.897273 2569 generic.go:358] "Generic (PLEG): container finished" podID="25a96c44-032a-41ac-8ed7-c051e3666b8c" containerID="9629d68341cd6f68890422aa1c0426f1dd4794f40f84e21f142837b3a6b01626" exitCode=0
Apr 23 14:57:21.897538 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:21.897310 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jnxqk" event={"ID":"25a96c44-032a-41ac-8ed7-c051e3666b8c","Type":"ContainerDied","Data":"9629d68341cd6f68890422aa1c0426f1dd4794f40f84e21f142837b3a6b01626"}
Apr 23 14:57:22.764921 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:22.764840 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-whs7j"
Apr 23 14:57:22.765596 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:22.764966 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-whs7j" podUID="dce2b511-080e-42a7-a345-5e616c565b84"
Apr 23 14:57:22.766341 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:22.766319 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6w94x"
Apr 23 14:57:22.766472 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:22.766451 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6w94x" podUID="56883bac-9d1b-41b2-a97d-66c4b6485777"
Apr 23 14:57:22.766524 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:22.766513 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x9gsg"
Apr 23 14:57:22.766648 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:22.766632 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x9gsg" podUID="36b18c47-1676-4fda-b4e6-a7a9acee20a9"
Apr 23 14:57:24.537616 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.537540 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-199.ec2.internal" event="NodeReady"
Apr 23 14:57:24.538039 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.537699 2569 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 23 14:57:24.578487 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.578454 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6cdd955d68-f2g47"]
Apr 23 14:57:24.596916 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.596509 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g2wkp"]
Apr 23 14:57:24.598244 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.597240 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6cdd955d68-f2g47"
Apr 23 14:57:24.600632 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.600591 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 23 14:57:24.600852 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.600827 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-dxrvj\""
Apr 23 14:57:24.601258 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.601094 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 23 14:57:24.601258 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.601134 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 23 14:57:24.610600 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.610416 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 23 14:57:24.611042 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.611018 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b7nsn"]
Apr 23 14:57:24.611211 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.611182 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g2wkp"
Apr 23 14:57:24.615366 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.615288 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 23 14:57:24.615366 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.615332 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 23 14:57:24.615366 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.615290 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-5tpjt\""
Apr 23 14:57:24.615567 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.615471 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 23 14:57:24.615567 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.615542 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 23 14:57:24.632701 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.632673 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-94tsp"]
Apr 23 14:57:24.632849 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.632834 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b7nsn"
Apr 23 14:57:24.635314 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.635258 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 23 14:57:24.635314 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.635275 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 23 14:57:24.635314 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.635282 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 23 14:57:24.635314 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.635282 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-c42hj\""
Apr 23 14:57:24.635604 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.635362 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 23 14:57:24.652828 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.652805 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6cdd955d68-f2g47"]
Apr 23 14:57:24.652957 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.652836 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g2wkp"]
Apr 23 14:57:24.652957 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.652849 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b7nsn"]
Apr 23 14:57:24.652957 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.652862 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-94tsp"]
Apr 23 14:57:24.652957 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.652875 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-stmr6"]
Apr 23 14:57:24.652957 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.652898 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-94tsp"
Apr 23 14:57:24.656228 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.656195 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 23 14:57:24.656458 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.656443 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 23 14:57:24.656603 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.656586 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 23 14:57:24.656644 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.656625 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-gnq68\""
Apr 23 14:57:24.656883 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.656838 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 23 14:57:24.661812 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.661793 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 23 14:57:24.667670 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.667652 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-4mhjz"]
Apr 23 14:57:24.667823 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.667808 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-stmr6"
Apr 23 14:57:24.670327 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.670309 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 23 14:57:24.670418 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.670351 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 23 14:57:24.670583 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.670568 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zm2h4\""
Apr 23 14:57:24.685679 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.685656 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-fjp4b"]
Apr 23 14:57:24.685801 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.685783 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4mhjz"
Apr 23 14:57:24.688293 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.688273 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 23 14:57:24.688632 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.688489 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-mqxw7\""
Apr 23 14:57:24.688632 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.688527 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 23 14:57:24.700974 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.700952 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fjp4b"]
Apr 23 14:57:24.700974 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.700976 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-stmr6"]
Apr 23 14:57:24.701214 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.700986 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-4mhjz"]
Apr 23 14:57:24.701214 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.701088 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fjp4b"
Apr 23 14:57:24.703540 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.703518 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 23 14:57:24.703638 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.703540 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 23 14:57:24.703638 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.703579 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 23 14:57:24.703638 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.703527 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-nnkvd\""
Apr 23 14:57:24.726843 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.726820 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5nh8\" (UniqueName: \"kubernetes.io/projected/5ee82d4e-8f5a-4e80-bed4-26f905b2deef-kube-api-access-k5nh8\") pod \"kube-storage-version-migrator-operator-6769c5d45-b7nsn\" (UID: \"5ee82d4e-8f5a-4e80-bed4-26f905b2deef\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b7nsn"
Apr 23 14:57:24.726992 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.726852 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b31c555-d556-43a6-ab35-99fdd111e5a5-serving-cert\") pod \"console-operator-9d4b6777b-94tsp\" (UID: \"8b31c555-d556-43a6-ab35-99fdd111e5a5\") " pod="openshift-console-operator/console-operator-9d4b6777b-94tsp"
Apr 23 14:57:24.726992 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.726876 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14dc1b3f-026b-4412-831d-49a1eaa9b470-ca-trust-extracted\") pod \"image-registry-6cdd955d68-f2g47\" (UID: \"14dc1b3f-026b-4412-831d-49a1eaa9b470\") " pod="openshift-image-registry/image-registry-6cdd955d68-f2g47"
Apr 23 14:57:24.726992 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.726902 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6fcac8f-1e41-4ff0-8acb-e3115431f3e9-serving-cert\") pod \"service-ca-operator-d6fc45fc5-g2wkp\" (UID: \"b6fcac8f-1e41-4ff0-8acb-e3115431f3e9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g2wkp"
Apr 23 14:57:24.726992 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.726951 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dcql\" (UniqueName: \"kubernetes.io/projected/b6fcac8f-1e41-4ff0-8acb-e3115431f3e9-kube-api-access-8dcql\") pod \"service-ca-operator-d6fc45fc5-g2wkp\" (UID: \"b6fcac8f-1e41-4ff0-8acb-e3115431f3e9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g2wkp"
Apr 23 14:57:24.727229 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.726997 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14dc1b3f-026b-4412-831d-49a1eaa9b470-registry-tls\") pod \"image-registry-6cdd955d68-f2g47\" (UID: \"14dc1b3f-026b-4412-831d-49a1eaa9b470\") " pod="openshift-image-registry/image-registry-6cdd955d68-f2g47"
Apr 23 14:57:24.727229 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.727021 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4fwg\" (UniqueName: \"kubernetes.io/projected/14dc1b3f-026b-4412-831d-49a1eaa9b470-kube-api-access-j4fwg\") pod \"image-registry-6cdd955d68-f2g47\" (UID: \"14dc1b3f-026b-4412-831d-49a1eaa9b470\") " pod="openshift-image-registry/image-registry-6cdd955d68-f2g47"
Apr 23 14:57:24.727229 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.727045 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6fcac8f-1e41-4ff0-8acb-e3115431f3e9-config\") pod \"service-ca-operator-d6fc45fc5-g2wkp\" (UID: \"b6fcac8f-1e41-4ff0-8acb-e3115431f3e9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g2wkp"
Apr 23 14:57:24.727229 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.727074 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ee82d4e-8f5a-4e80-bed4-26f905b2deef-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-b7nsn\" (UID: \"5ee82d4e-8f5a-4e80-bed4-26f905b2deef\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b7nsn"
Apr 23 14:57:24.727229 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.727131 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8b31c555-d556-43a6-ab35-99fdd111e5a5-trusted-ca\") pod \"console-operator-9d4b6777b-94tsp\" (UID: \"8b31c555-d556-43a6-ab35-99fdd111e5a5\") " pod="openshift-console-operator/console-operator-9d4b6777b-94tsp"
Apr 23 14:57:24.727229 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.727153 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14dc1b3f-026b-4412-831d-49a1eaa9b470-trusted-ca\") pod \"image-registry-6cdd955d68-f2g47\" (UID: \"14dc1b3f-026b-4412-831d-49a1eaa9b470\") " pod="openshift-image-registry/image-registry-6cdd955d68-f2g47"
Apr 23 14:57:24.727229 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.727176 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14dc1b3f-026b-4412-831d-49a1eaa9b470-bound-sa-token\") pod \"image-registry-6cdd955d68-f2g47\" (UID: \"14dc1b3f-026b-4412-831d-49a1eaa9b470\") " pod="openshift-image-registry/image-registry-6cdd955d68-f2g47"
Apr 23 14:57:24.727486 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.727249 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b31c555-d556-43a6-ab35-99fdd111e5a5-config\") pod \"console-operator-9d4b6777b-94tsp\" (UID: \"8b31c555-d556-43a6-ab35-99fdd111e5a5\") " pod="openshift-console-operator/console-operator-9d4b6777b-94tsp"
Apr 23 14:57:24.727486 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.727290 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14dc1b3f-026b-4412-831d-49a1eaa9b470-registry-certificates\") pod \"image-registry-6cdd955d68-f2g47\" (UID: \"14dc1b3f-026b-4412-831d-49a1eaa9b470\") " pod="openshift-image-registry/image-registry-6cdd955d68-f2g47"
Apr 23 14:57:24.727486 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.727317 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14dc1b3f-026b-4412-831d-49a1eaa9b470-installation-pull-secrets\") pod \"image-registry-6cdd955d68-f2g47\" (UID: \"14dc1b3f-026b-4412-831d-49a1eaa9b470\") " pod="openshift-image-registry/image-registry-6cdd955d68-f2g47"
Apr 23 14:57:24.727486 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.727360 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/14dc1b3f-026b-4412-831d-49a1eaa9b470-image-registry-private-configuration\") pod \"image-registry-6cdd955d68-f2g47\" (UID: \"14dc1b3f-026b-4412-831d-49a1eaa9b470\") " pod="openshift-image-registry/image-registry-6cdd955d68-f2g47"
Apr 23 14:57:24.727486 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.727387 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ee82d4e-8f5a-4e80-bed4-26f905b2deef-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-b7nsn\" (UID: \"5ee82d4e-8f5a-4e80-bed4-26f905b2deef\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b7nsn"
Apr 23 14:57:24.727486 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.727411 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jflm\" (UniqueName: \"kubernetes.io/projected/8b31c555-d556-43a6-ab35-99fdd111e5a5-kube-api-access-9jflm\") pod \"console-operator-9d4b6777b-94tsp\" (UID: \"8b31c555-d556-43a6-ab35-99fdd111e5a5\") " pod="openshift-console-operator/console-operator-9d4b6777b-94tsp"
Apr 23 14:57:24.762747 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.762718 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x9gsg"
Apr 23 14:57:24.762871 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.762722 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-whs7j"
Apr 23 14:57:24.762937 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.762722 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6w94x"
Apr 23 14:57:24.765741 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.765719 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 23 14:57:24.765875 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.765742 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qqjqd\""
Apr 23 14:57:24.765875 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.765838 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-shvg2\""
Apr 23 14:57:24.766072 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.766053 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 23 14:57:24.828692 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.828615 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ee82d4e-8f5a-4e80-bed4-26f905b2deef-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-b7nsn\" (UID: \"5ee82d4e-8f5a-4e80-bed4-26f905b2deef\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b7nsn"
Apr 23 14:57:24.828692 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.828651 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8b31c555-d556-43a6-ab35-99fdd111e5a5-trusted-ca\") pod \"console-operator-9d4b6777b-94tsp\" (UID: \"8b31c555-d556-43a6-ab35-99fdd111e5a5\") " pod="openshift-console-operator/console-operator-9d4b6777b-94tsp"
Apr 23 14:57:24.828692 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.828675 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14dc1b3f-026b-4412-831d-49a1eaa9b470-trusted-ca\") pod \"image-registry-6cdd955d68-f2g47\" (UID: \"14dc1b3f-026b-4412-831d-49a1eaa9b470\") " pod="openshift-image-registry/image-registry-6cdd955d68-f2g47"
Apr 23 14:57:24.828935 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.828698 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14dc1b3f-026b-4412-831d-49a1eaa9b470-bound-sa-token\") pod \"image-registry-6cdd955d68-f2g47\" (UID: \"14dc1b3f-026b-4412-831d-49a1eaa9b470\") " pod="openshift-image-registry/image-registry-6cdd955d68-f2g47"
Apr 23 14:57:24.828935 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.828729 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b31c555-d556-43a6-ab35-99fdd111e5a5-config\") pod \"console-operator-9d4b6777b-94tsp\" (UID: \"8b31c555-d556-43a6-ab35-99fdd111e5a5\") " pod="openshift-console-operator/console-operator-9d4b6777b-94tsp"
Apr 23 14:57:24.829340 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.829320 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ee82d4e-8f5a-4e80-bed4-26f905b2deef-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-b7nsn\" (UID: \"5ee82d4e-8f5a-4e80-bed4-26f905b2deef\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b7nsn"
Apr 23 14:57:24.829398 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.829383 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14dc1b3f-026b-4412-831d-49a1eaa9b470-registry-certificates\") pod \"image-registry-6cdd955d68-f2g47\" (UID: \"14dc1b3f-026b-4412-831d-49a1eaa9b470\") " pod="openshift-image-registry/image-registry-6cdd955d68-f2g47"
Apr 23 14:57:24.829442 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.829417 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14dc1b3f-026b-4412-831d-49a1eaa9b470-installation-pull-secrets\") pod \"image-registry-6cdd955d68-f2g47\" (UID: \"14dc1b3f-026b-4412-831d-49a1eaa9b470\") " pod="openshift-image-registry/image-registry-6cdd955d68-f2g47"
Apr 23 14:57:24.829482 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.829448 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8-config-volume\") pod \"dns-default-stmr6\" (UID: \"fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8\") " pod="openshift-dns/dns-default-stmr6"
Apr 23 14:57:24.829517 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.829477 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4qq8\" (UniqueName: \"kubernetes.io/projected/fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8-kube-api-access-s4qq8\") pod \"dns-default-stmr6\" (UID: \"fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8\") " pod="openshift-dns/dns-default-stmr6"
Apr 23 14:57:24.829517 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.829502 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfhtz\" (UniqueName: \"kubernetes.io/projected/0b89e5a1-6854-4d53-971b-35ff7c61be50-kube-api-access-sfhtz\") pod \"ingress-canary-fjp4b\" (UID: \"0b89e5a1-6854-4d53-971b-35ff7c61be50\") " pod="openshift-ingress-canary/ingress-canary-fjp4b"
Apr 23 14:57:24.829574 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.829546 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/14dc1b3f-026b-4412-831d-49a1eaa9b470-image-registry-private-configuration\") pod \"image-registry-6cdd955d68-f2g47\" (UID: \"14dc1b3f-026b-4412-831d-49a1eaa9b470\") " pod="openshift-image-registry/image-registry-6cdd955d68-f2g47"
Apr 23 14:57:24.829574 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.829557 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14dc1b3f-026b-4412-831d-49a1eaa9b470-registry-certificates\") pod \"image-registry-6cdd955d68-f2g47\" (UID: \"14dc1b3f-026b-4412-831d-49a1eaa9b470\") " pod="openshift-image-registry/image-registry-6cdd955d68-f2g47"
Apr 23 14:57:24.829631 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.829575 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ee82d4e-8f5a-4e80-bed4-26f905b2deef-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-b7nsn\" (UID: \"5ee82d4e-8f5a-4e80-bed4-26f905b2deef\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b7nsn"
Apr 23 14:57:24.829631 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.829593 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b31c555-d556-43a6-ab35-99fdd111e5a5-config\") pod \"console-operator-9d4b6777b-94tsp\" (UID: \"8b31c555-d556-43a6-ab35-99fdd111e5a5\") " pod="openshift-console-operator/console-operator-9d4b6777b-94tsp"
Apr 23 14:57:24.829631 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.829607 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9jflm\" (UniqueName: \"kubernetes.io/projected/8b31c555-d556-43a6-ab35-99fdd111e5a5-kube-api-access-9jflm\") pod \"console-operator-9d4b6777b-94tsp\" (UID: \"8b31c555-d556-43a6-ab35-99fdd111e5a5\") " pod="openshift-console-operator/console-operator-9d4b6777b-94tsp"
Apr 23 14:57:24.829720 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.829633 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8-tmp-dir\") pod \"dns-default-stmr6\" (UID: \"fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8\") " pod="openshift-dns/dns-default-stmr6"
Apr 23 14:57:24.829720 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.829664 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14dc1b3f-026b-4412-831d-49a1eaa9b470-trusted-ca\") pod \"image-registry-6cdd955d68-f2g47\" (UID: \"14dc1b3f-026b-4412-831d-49a1eaa9b470\") " pod="openshift-image-registry/image-registry-6cdd955d68-f2g47"
Apr 23 14:57:24.829720 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.829675 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8b31c555-d556-43a6-ab35-99fdd111e5a5-trusted-ca\") pod \"console-operator-9d4b6777b-94tsp\" (UID: \"8b31c555-d556-43a6-ab35-99fdd111e5a5\") " pod="openshift-console-operator/console-operator-9d4b6777b-94tsp"
Apr 23 14:57:24.829857 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.829797 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0b89e5a1-6854-4d53-971b-35ff7c61be50-cert\") pod \"ingress-canary-fjp4b\" (UID: \"0b89e5a1-6854-4d53-971b-35ff7c61be50\") " pod="openshift-ingress-canary/ingress-canary-fjp4b"
Apr 23 14:57:24.829904 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.829862 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5nh8\" (UniqueName: \"kubernetes.io/projected/5ee82d4e-8f5a-4e80-bed4-26f905b2deef-kube-api-access-k5nh8\") pod \"kube-storage-version-migrator-operator-6769c5d45-b7nsn\" (UID: \"5ee82d4e-8f5a-4e80-bed4-26f905b2deef\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b7nsn"
Apr 23 14:57:24.829904 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.829888 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b31c555-d556-43a6-ab35-99fdd111e5a5-serving-cert\") pod \"console-operator-9d4b6777b-94tsp\" (UID: \"8b31c555-d556-43a6-ab35-99fdd111e5a5\") " pod="openshift-console-operator/console-operator-9d4b6777b-94tsp"
Apr 23 14:57:24.830012 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.829910 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14dc1b3f-026b-4412-831d-49a1eaa9b470-ca-trust-extracted\") pod \"image-registry-6cdd955d68-f2g47\" (UID: \"14dc1b3f-026b-4412-831d-49a1eaa9b470\") " pod="openshift-image-registry/image-registry-6cdd955d68-f2g47"
Apr 23 14:57:24.830012 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.829938 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6fcac8f-1e41-4ff0-8acb-e3115431f3e9-serving-cert\") pod \"service-ca-operator-d6fc45fc5-g2wkp\" (UID: \"b6fcac8f-1e41-4ff0-8acb-e3115431f3e9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g2wkp"
Apr 23 14:57:24.830012 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.829958 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8dcql\" (UniqueName: \"kubernetes.io/projected/b6fcac8f-1e41-4ff0-8acb-e3115431f3e9-kube-api-access-8dcql\") pod \"service-ca-operator-d6fc45fc5-g2wkp\" (UID: \"b6fcac8f-1e41-4ff0-8acb-e3115431f3e9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g2wkp"
Apr 23 14:57:24.830012 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.829996 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48czx\" (UniqueName: \"kubernetes.io/projected/23156db7-0193-437f-a4c9-bc8b886b91e9-kube-api-access-48czx\") pod \"network-check-source-8894fc9bd-4mhjz\" (UID: \"23156db7-0193-437f-a4c9-bc8b886b91e9\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4mhjz"
Apr 23 14:57:24.830257 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.830018 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8-metrics-tls\") pod \"dns-default-stmr6\" (UID: \"fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8\") " pod="openshift-dns/dns-default-stmr6"
Apr 23 14:57:24.830257 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.830048 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14dc1b3f-026b-4412-831d-49a1eaa9b470-registry-tls\") pod \"image-registry-6cdd955d68-f2g47\" (UID: \"14dc1b3f-026b-4412-831d-49a1eaa9b470\") " pod="openshift-image-registry/image-registry-6cdd955d68-f2g47"
Apr 23 14:57:24.830257 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.830072 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4fwg\" (UniqueName: \"kubernetes.io/projected/14dc1b3f-026b-4412-831d-49a1eaa9b470-kube-api-access-j4fwg\") pod \"image-registry-6cdd955d68-f2g47\" (UID: \"14dc1b3f-026b-4412-831d-49a1eaa9b470\") " pod="openshift-image-registry/image-registry-6cdd955d68-f2g47"
Apr 23 14:57:24.830257 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.830114 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6fcac8f-1e41-4ff0-8acb-e3115431f3e9-config\") pod
\"service-ca-operator-d6fc45fc5-g2wkp\" (UID: \"b6fcac8f-1e41-4ff0-8acb-e3115431f3e9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g2wkp" Apr 23 14:57:24.830609 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.830590 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6fcac8f-1e41-4ff0-8acb-e3115431f3e9-config\") pod \"service-ca-operator-d6fc45fc5-g2wkp\" (UID: \"b6fcac8f-1e41-4ff0-8acb-e3115431f3e9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g2wkp" Apr 23 14:57:24.830704 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:24.830689 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 14:57:24.830753 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:24.830708 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6cdd955d68-f2g47: secret "image-registry-tls" not found Apr 23 14:57:24.830800 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:24.830777 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14dc1b3f-026b-4412-831d-49a1eaa9b470-registry-tls podName:14dc1b3f-026b-4412-831d-49a1eaa9b470 nodeName:}" failed. No retries permitted until 2026-04-23 14:57:25.330759988 +0000 UTC m=+33.124113146 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/14dc1b3f-026b-4412-831d-49a1eaa9b470-registry-tls") pod "image-registry-6cdd955d68-f2g47" (UID: "14dc1b3f-026b-4412-831d-49a1eaa9b470") : secret "image-registry-tls" not found Apr 23 14:57:24.830985 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.830952 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14dc1b3f-026b-4412-831d-49a1eaa9b470-ca-trust-extracted\") pod \"image-registry-6cdd955d68-f2g47\" (UID: \"14dc1b3f-026b-4412-831d-49a1eaa9b470\") " pod="openshift-image-registry/image-registry-6cdd955d68-f2g47" Apr 23 14:57:24.834749 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.834717 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ee82d4e-8f5a-4e80-bed4-26f905b2deef-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-b7nsn\" (UID: \"5ee82d4e-8f5a-4e80-bed4-26f905b2deef\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b7nsn" Apr 23 14:57:24.834860 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.834793 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6fcac8f-1e41-4ff0-8acb-e3115431f3e9-serving-cert\") pod \"service-ca-operator-d6fc45fc5-g2wkp\" (UID: \"b6fcac8f-1e41-4ff0-8acb-e3115431f3e9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g2wkp" Apr 23 14:57:24.834860 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.834822 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b31c555-d556-43a6-ab35-99fdd111e5a5-serving-cert\") pod \"console-operator-9d4b6777b-94tsp\" (UID: \"8b31c555-d556-43a6-ab35-99fdd111e5a5\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-94tsp" Apr 23 14:57:24.834943 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.834923 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/14dc1b3f-026b-4412-831d-49a1eaa9b470-image-registry-private-configuration\") pod \"image-registry-6cdd955d68-f2g47\" (UID: \"14dc1b3f-026b-4412-831d-49a1eaa9b470\") " pod="openshift-image-registry/image-registry-6cdd955d68-f2g47" Apr 23 14:57:24.835011 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.834991 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14dc1b3f-026b-4412-831d-49a1eaa9b470-installation-pull-secrets\") pod \"image-registry-6cdd955d68-f2g47\" (UID: \"14dc1b3f-026b-4412-831d-49a1eaa9b470\") " pod="openshift-image-registry/image-registry-6cdd955d68-f2g47" Apr 23 14:57:24.851151 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.851128 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jflm\" (UniqueName: \"kubernetes.io/projected/8b31c555-d556-43a6-ab35-99fdd111e5a5-kube-api-access-9jflm\") pod \"console-operator-9d4b6777b-94tsp\" (UID: \"8b31c555-d556-43a6-ab35-99fdd111e5a5\") " pod="openshift-console-operator/console-operator-9d4b6777b-94tsp" Apr 23 14:57:24.853383 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.853356 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5nh8\" (UniqueName: \"kubernetes.io/projected/5ee82d4e-8f5a-4e80-bed4-26f905b2deef-kube-api-access-k5nh8\") pod \"kube-storage-version-migrator-operator-6769c5d45-b7nsn\" (UID: \"5ee82d4e-8f5a-4e80-bed4-26f905b2deef\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b7nsn" Apr 23 14:57:24.853495 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.853484 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4fwg\" (UniqueName: \"kubernetes.io/projected/14dc1b3f-026b-4412-831d-49a1eaa9b470-kube-api-access-j4fwg\") pod \"image-registry-6cdd955d68-f2g47\" (UID: \"14dc1b3f-026b-4412-831d-49a1eaa9b470\") " pod="openshift-image-registry/image-registry-6cdd955d68-f2g47" Apr 23 14:57:24.853862 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.853845 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14dc1b3f-026b-4412-831d-49a1eaa9b470-bound-sa-token\") pod \"image-registry-6cdd955d68-f2g47\" (UID: \"14dc1b3f-026b-4412-831d-49a1eaa9b470\") " pod="openshift-image-registry/image-registry-6cdd955d68-f2g47" Apr 23 14:57:24.854636 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.854615 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dcql\" (UniqueName: \"kubernetes.io/projected/b6fcac8f-1e41-4ff0-8acb-e3115431f3e9-kube-api-access-8dcql\") pod \"service-ca-operator-d6fc45fc5-g2wkp\" (UID: \"b6fcac8f-1e41-4ff0-8acb-e3115431f3e9\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g2wkp" Apr 23 14:57:24.922030 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.922002 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g2wkp" Apr 23 14:57:24.930946 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.930911 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-48czx\" (UniqueName: \"kubernetes.io/projected/23156db7-0193-437f-a4c9-bc8b886b91e9-kube-api-access-48czx\") pod \"network-check-source-8894fc9bd-4mhjz\" (UID: \"23156db7-0193-437f-a4c9-bc8b886b91e9\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4mhjz" Apr 23 14:57:24.931071 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.930957 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8-metrics-tls\") pod \"dns-default-stmr6\" (UID: \"fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8\") " pod="openshift-dns/dns-default-stmr6" Apr 23 14:57:24.931071 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.931062 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8-config-volume\") pod \"dns-default-stmr6\" (UID: \"fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8\") " pod="openshift-dns/dns-default-stmr6" Apr 23 14:57:24.931207 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.931094 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s4qq8\" (UniqueName: \"kubernetes.io/projected/fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8-kube-api-access-s4qq8\") pod \"dns-default-stmr6\" (UID: \"fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8\") " pod="openshift-dns/dns-default-stmr6" Apr 23 14:57:24.931207 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.931141 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sfhtz\" (UniqueName: 
\"kubernetes.io/projected/0b89e5a1-6854-4d53-971b-35ff7c61be50-kube-api-access-sfhtz\") pod \"ingress-canary-fjp4b\" (UID: \"0b89e5a1-6854-4d53-971b-35ff7c61be50\") " pod="openshift-ingress-canary/ingress-canary-fjp4b" Apr 23 14:57:24.931207 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.931186 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8-tmp-dir\") pod \"dns-default-stmr6\" (UID: \"fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8\") " pod="openshift-dns/dns-default-stmr6" Apr 23 14:57:24.931349 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.931209 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0b89e5a1-6854-4d53-971b-35ff7c61be50-cert\") pod \"ingress-canary-fjp4b\" (UID: \"0b89e5a1-6854-4d53-971b-35ff7c61be50\") " pod="openshift-ingress-canary/ingress-canary-fjp4b" Apr 23 14:57:24.931404 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:24.931368 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 14:57:24.931456 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:24.931441 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b89e5a1-6854-4d53-971b-35ff7c61be50-cert podName:0b89e5a1-6854-4d53-971b-35ff7c61be50 nodeName:}" failed. No retries permitted until 2026-04-23 14:57:25.431422527 +0000 UTC m=+33.224775670 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0b89e5a1-6854-4d53-971b-35ff7c61be50-cert") pod "ingress-canary-fjp4b" (UID: "0b89e5a1-6854-4d53-971b-35ff7c61be50") : secret "canary-serving-cert" not found Apr 23 14:57:24.931814 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:24.931793 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 14:57:24.931901 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:24.931845 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8-metrics-tls podName:fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8 nodeName:}" failed. No retries permitted until 2026-04-23 14:57:25.431832224 +0000 UTC m=+33.225185371 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8-metrics-tls") pod "dns-default-stmr6" (UID: "fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8") : secret "dns-default-metrics-tls" not found Apr 23 14:57:24.942532 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.942253 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfhtz\" (UniqueName: \"kubernetes.io/projected/0b89e5a1-6854-4d53-971b-35ff7c61be50-kube-api-access-sfhtz\") pod \"ingress-canary-fjp4b\" (UID: \"0b89e5a1-6854-4d53-971b-35ff7c61be50\") " pod="openshift-ingress-canary/ingress-canary-fjp4b" Apr 23 14:57:24.942532 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.942383 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b7nsn" Apr 23 14:57:24.942532 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.942392 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-48czx\" (UniqueName: \"kubernetes.io/projected/23156db7-0193-437f-a4c9-bc8b886b91e9-kube-api-access-48czx\") pod \"network-check-source-8894fc9bd-4mhjz\" (UID: \"23156db7-0193-437f-a4c9-bc8b886b91e9\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4mhjz" Apr 23 14:57:24.942532 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.942454 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8-tmp-dir\") pod \"dns-default-stmr6\" (UID: \"fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8\") " pod="openshift-dns/dns-default-stmr6" Apr 23 14:57:24.942761 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.942702 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8-config-volume\") pod \"dns-default-stmr6\" (UID: \"fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8\") " pod="openshift-dns/dns-default-stmr6" Apr 23 14:57:24.945154 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.945136 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4qq8\" (UniqueName: \"kubernetes.io/projected/fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8-kube-api-access-s4qq8\") pod \"dns-default-stmr6\" (UID: \"fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8\") " pod="openshift-dns/dns-default-stmr6" Apr 23 14:57:24.963714 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.963673 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-94tsp" Apr 23 14:57:24.995749 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:24.995713 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4mhjz" Apr 23 14:57:25.034600 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:25.034558 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dce2b511-080e-42a7-a345-5e616c565b84-original-pull-secret\") pod \"global-pull-secret-syncer-whs7j\" (UID: \"dce2b511-080e-42a7-a345-5e616c565b84\") " pod="kube-system/global-pull-secret-syncer-whs7j" Apr 23 14:57:25.043412 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:25.043378 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dce2b511-080e-42a7-a345-5e616c565b84-original-pull-secret\") pod \"global-pull-secret-syncer-whs7j\" (UID: \"dce2b511-080e-42a7-a345-5e616c565b84\") " pod="kube-system/global-pull-secret-syncer-whs7j" Apr 23 14:57:25.091178 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:25.090709 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-whs7j" Apr 23 14:57:25.124569 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:25.124542 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g2wkp"] Apr 23 14:57:25.132203 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:25.132166 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b7nsn"] Apr 23 14:57:25.138825 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:57:25.138785 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ee82d4e_8f5a_4e80_bed4_26f905b2deef.slice/crio-60df4645e7b84b409873b4bb0fa68a14595e74c434cab13dbd67808957026581 WatchSource:0}: Error finding container 60df4645e7b84b409873b4bb0fa68a14595e74c434cab13dbd67808957026581: Status 404 returned error can't find the container with id 60df4645e7b84b409873b4bb0fa68a14595e74c434cab13dbd67808957026581 Apr 23 14:57:25.145925 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:25.145869 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-94tsp"] Apr 23 14:57:25.150618 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:57:25.150589 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b31c555_d556_43a6_ab35_99fdd111e5a5.slice/crio-6d75445531d8954383e0133b82de04db9d3c83486566a46559b33448c036366b WatchSource:0}: Error finding container 6d75445531d8954383e0133b82de04db9d3c83486566a46559b33448c036366b: Status 404 returned error can't find the container with id 6d75445531d8954383e0133b82de04db9d3c83486566a46559b33448c036366b Apr 23 14:57:25.182314 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:25.182258 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-4mhjz"] Apr 23 14:57:25.185631 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:57:25.185598 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23156db7_0193_437f_a4c9_bc8b886b91e9.slice/crio-f0021b18cb118160842dfddd97db0060b7a9d98d6b4c4bf605a2e57eccbd3a15 WatchSource:0}: Error finding container f0021b18cb118160842dfddd97db0060b7a9d98d6b4c4bf605a2e57eccbd3a15: Status 404 returned error can't find the container with id f0021b18cb118160842dfddd97db0060b7a9d98d6b4c4bf605a2e57eccbd3a15 Apr 23 14:57:25.242499 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:25.242305 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-whs7j"] Apr 23 14:57:25.245188 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:57:25.245156 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddce2b511_080e_42a7_a345_5e616c565b84.slice/crio-39f41f7daf0fc03cc54e2c44ecd1fba17d106a6f0714304f89bf33d8a3ba24d0 WatchSource:0}: Error finding container 39f41f7daf0fc03cc54e2c44ecd1fba17d106a6f0714304f89bf33d8a3ba24d0: Status 404 returned error can't find the container with id 39f41f7daf0fc03cc54e2c44ecd1fba17d106a6f0714304f89bf33d8a3ba24d0 Apr 23 14:57:25.336625 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:25.336540 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14dc1b3f-026b-4412-831d-49a1eaa9b470-registry-tls\") pod \"image-registry-6cdd955d68-f2g47\" (UID: \"14dc1b3f-026b-4412-831d-49a1eaa9b470\") " pod="openshift-image-registry/image-registry-6cdd955d68-f2g47" Apr 23 14:57:25.336775 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:25.336672 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret 
"image-registry-tls" not found Apr 23 14:57:25.336775 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:25.336693 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6cdd955d68-f2g47: secret "image-registry-tls" not found Apr 23 14:57:25.336775 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:25.336750 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14dc1b3f-026b-4412-831d-49a1eaa9b470-registry-tls podName:14dc1b3f-026b-4412-831d-49a1eaa9b470 nodeName:}" failed. No retries permitted until 2026-04-23 14:57:26.336734318 +0000 UTC m=+34.130087481 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/14dc1b3f-026b-4412-831d-49a1eaa9b470-registry-tls") pod "image-registry-6cdd955d68-f2g47" (UID: "14dc1b3f-026b-4412-831d-49a1eaa9b470") : secret "image-registry-tls" not found Apr 23 14:57:25.437349 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:25.437314 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0b89e5a1-6854-4d53-971b-35ff7c61be50-cert\") pod \"ingress-canary-fjp4b\" (UID: \"0b89e5a1-6854-4d53-971b-35ff7c61be50\") " pod="openshift-ingress-canary/ingress-canary-fjp4b" Apr 23 14:57:25.437494 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:25.437366 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8-metrics-tls\") pod \"dns-default-stmr6\" (UID: \"fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8\") " pod="openshift-dns/dns-default-stmr6" Apr 23 14:57:25.437494 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:25.437467 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 14:57:25.437494 ip-10-0-143-199 kubenswrapper[2569]: E0423 
14:57:25.437484 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 14:57:25.437650 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:25.437532 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b89e5a1-6854-4d53-971b-35ff7c61be50-cert podName:0b89e5a1-6854-4d53-971b-35ff7c61be50 nodeName:}" failed. No retries permitted until 2026-04-23 14:57:26.437516037 +0000 UTC m=+34.230869184 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0b89e5a1-6854-4d53-971b-35ff7c61be50-cert") pod "ingress-canary-fjp4b" (UID: "0b89e5a1-6854-4d53-971b-35ff7c61be50") : secret "canary-serving-cert" not found Apr 23 14:57:25.437650 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:25.437551 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8-metrics-tls podName:fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8 nodeName:}" failed. No retries permitted until 2026-04-23 14:57:26.437542933 +0000 UTC m=+34.230896097 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8-metrics-tls") pod "dns-default-stmr6" (UID: "fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8") : secret "dns-default-metrics-tls" not found Apr 23 14:57:25.907355 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:25.907316 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b7nsn" event={"ID":"5ee82d4e-8f5a-4e80-bed4-26f905b2deef","Type":"ContainerStarted","Data":"60df4645e7b84b409873b4bb0fa68a14595e74c434cab13dbd67808957026581"} Apr 23 14:57:25.908946 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:25.908916 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-94tsp" event={"ID":"8b31c555-d556-43a6-ab35-99fdd111e5a5","Type":"ContainerStarted","Data":"6d75445531d8954383e0133b82de04db9d3c83486566a46559b33448c036366b"} Apr 23 14:57:25.910498 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:25.910468 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-whs7j" event={"ID":"dce2b511-080e-42a7-a345-5e616c565b84","Type":"ContainerStarted","Data":"39f41f7daf0fc03cc54e2c44ecd1fba17d106a6f0714304f89bf33d8a3ba24d0"} Apr 23 14:57:25.911519 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:25.911474 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4mhjz" event={"ID":"23156db7-0193-437f-a4c9-bc8b886b91e9","Type":"ContainerStarted","Data":"f0021b18cb118160842dfddd97db0060b7a9d98d6b4c4bf605a2e57eccbd3a15"} Apr 23 14:57:25.912534 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:25.912498 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g2wkp" 
event={"ID":"b6fcac8f-1e41-4ff0-8acb-e3115431f3e9","Type":"ContainerStarted","Data":"43d52cf538a4ebd61ce92b871e6c338befbf0f1d00b78776343ee701160253f2"} Apr 23 14:57:26.343984 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:26.343948 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14dc1b3f-026b-4412-831d-49a1eaa9b470-registry-tls\") pod \"image-registry-6cdd955d68-f2g47\" (UID: \"14dc1b3f-026b-4412-831d-49a1eaa9b470\") " pod="openshift-image-registry/image-registry-6cdd955d68-f2g47" Apr 23 14:57:26.344141 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:26.344096 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 14:57:26.344141 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:26.344131 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6cdd955d68-f2g47: secret "image-registry-tls" not found Apr 23 14:57:26.344227 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:26.344185 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14dc1b3f-026b-4412-831d-49a1eaa9b470-registry-tls podName:14dc1b3f-026b-4412-831d-49a1eaa9b470 nodeName:}" failed. No retries permitted until 2026-04-23 14:57:28.344169867 +0000 UTC m=+36.137523014 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/14dc1b3f-026b-4412-831d-49a1eaa9b470-registry-tls") pod "image-registry-6cdd955d68-f2g47" (UID: "14dc1b3f-026b-4412-831d-49a1eaa9b470") : secret "image-registry-tls" not found
Apr 23 14:57:26.445314 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:26.445272 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0b89e5a1-6854-4d53-971b-35ff7c61be50-cert\") pod \"ingress-canary-fjp4b\" (UID: \"0b89e5a1-6854-4d53-971b-35ff7c61be50\") " pod="openshift-ingress-canary/ingress-canary-fjp4b"
Apr 23 14:57:26.445495 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:26.445336 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8-metrics-tls\") pod \"dns-default-stmr6\" (UID: \"fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8\") " pod="openshift-dns/dns-default-stmr6"
Apr 23 14:57:26.445495 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:26.445392 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36b18c47-1676-4fda-b4e6-a7a9acee20a9-metrics-certs\") pod \"network-metrics-daemon-x9gsg\" (UID: \"36b18c47-1676-4fda-b4e6-a7a9acee20a9\") " pod="openshift-multus/network-metrics-daemon-x9gsg"
Apr 23 14:57:26.445495 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:26.445431 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 14:57:26.445495 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:26.445488 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 23 14:57:26.445495 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:26.445493 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b89e5a1-6854-4d53-971b-35ff7c61be50-cert podName:0b89e5a1-6854-4d53-971b-35ff7c61be50 nodeName:}" failed. No retries permitted until 2026-04-23 14:57:28.445476 +0000 UTC m=+36.238829163 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0b89e5a1-6854-4d53-971b-35ff7c61be50-cert") pod "ingress-canary-fjp4b" (UID: "0b89e5a1-6854-4d53-971b-35ff7c61be50") : secret "canary-serving-cert" not found
Apr 23 14:57:26.445760 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:26.445507 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 14:57:26.445760 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:26.445518 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36b18c47-1676-4fda-b4e6-a7a9acee20a9-metrics-certs podName:36b18c47-1676-4fda-b4e6-a7a9acee20a9 nodeName:}" failed. No retries permitted until 2026-04-23 14:57:58.445509423 +0000 UTC m=+66.238862566 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/36b18c47-1676-4fda-b4e6-a7a9acee20a9-metrics-certs") pod "network-metrics-daemon-x9gsg" (UID: "36b18c47-1676-4fda-b4e6-a7a9acee20a9") : secret "metrics-daemon-secret" not found
Apr 23 14:57:26.445760 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:26.445588 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8-metrics-tls podName:fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8 nodeName:}" failed. No retries permitted until 2026-04-23 14:57:28.445570677 +0000 UTC m=+36.238923835 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8-metrics-tls") pod "dns-default-stmr6" (UID: "fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8") : secret "dns-default-metrics-tls" not found
Apr 23 14:57:26.647134 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:26.647029 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lt5kh\" (UniqueName: \"kubernetes.io/projected/56883bac-9d1b-41b2-a97d-66c4b6485777-kube-api-access-lt5kh\") pod \"network-check-target-6w94x\" (UID: \"56883bac-9d1b-41b2-a97d-66c4b6485777\") " pod="openshift-network-diagnostics/network-check-target-6w94x"
Apr 23 14:57:26.650728 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:26.650701 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt5kh\" (UniqueName: \"kubernetes.io/projected/56883bac-9d1b-41b2-a97d-66c4b6485777-kube-api-access-lt5kh\") pod \"network-check-target-6w94x\" (UID: \"56883bac-9d1b-41b2-a97d-66c4b6485777\") " pod="openshift-network-diagnostics/network-check-target-6w94x"
Apr 23 14:57:26.886863 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:26.886831 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6w94x"
Apr 23 14:57:27.035062 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:27.034989 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-bdftp"]
Apr 23 14:57:27.058912 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:27.058883 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-bdftp"]
Apr 23 14:57:27.059057 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:27.059008 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-bdftp"
Apr 23 14:57:27.061501 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:27.061433 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 23 14:57:27.061501 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:27.061471 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 23 14:57:27.061697 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:27.061581 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-mccj5\""
Apr 23 14:57:27.150842 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:27.150794 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4ffefc2c-3199-4193-9958-9394681000af-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bdftp\" (UID: \"4ffefc2c-3199-4193-9958-9394681000af\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bdftp"
Apr 23 14:57:27.150842 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:27.150845 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/4ffefc2c-3199-4193-9958-9394681000af-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-bdftp\" (UID: \"4ffefc2c-3199-4193-9958-9394681000af\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bdftp"
Apr 23 14:57:27.251966 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:27.251927 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4ffefc2c-3199-4193-9958-9394681000af-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bdftp\" (UID: \"4ffefc2c-3199-4193-9958-9394681000af\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bdftp"
Apr 23 14:57:27.251966 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:27.251967 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/4ffefc2c-3199-4193-9958-9394681000af-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-bdftp\" (UID: \"4ffefc2c-3199-4193-9958-9394681000af\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bdftp"
Apr 23 14:57:27.252229 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:27.252096 2569 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 23 14:57:27.252229 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:27.252194 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ffefc2c-3199-4193-9958-9394681000af-networking-console-plugin-cert podName:4ffefc2c-3199-4193-9958-9394681000af nodeName:}" failed. No retries permitted until 2026-04-23 14:57:27.752172534 +0000 UTC m=+35.545525682 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/4ffefc2c-3199-4193-9958-9394681000af-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bdftp" (UID: "4ffefc2c-3199-4193-9958-9394681000af") : secret "networking-console-plugin-cert" not found
Apr 23 14:57:27.252669 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:27.252638 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/4ffefc2c-3199-4193-9958-9394681000af-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-bdftp\" (UID: \"4ffefc2c-3199-4193-9958-9394681000af\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bdftp"
Apr 23 14:57:27.755999 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:27.755961 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4ffefc2c-3199-4193-9958-9394681000af-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bdftp\" (UID: \"4ffefc2c-3199-4193-9958-9394681000af\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bdftp"
Apr 23 14:57:27.756183 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:27.756149 2569 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 23 14:57:27.756246 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:27.756232 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ffefc2c-3199-4193-9958-9394681000af-networking-console-plugin-cert podName:4ffefc2c-3199-4193-9958-9394681000af nodeName:}" failed. No retries permitted until 2026-04-23 14:57:28.756210767 +0000 UTC m=+36.549563920 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/4ffefc2c-3199-4193-9958-9394681000af-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bdftp" (UID: "4ffefc2c-3199-4193-9958-9394681000af") : secret "networking-console-plugin-cert" not found
Apr 23 14:57:28.362375 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:28.361720 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14dc1b3f-026b-4412-831d-49a1eaa9b470-registry-tls\") pod \"image-registry-6cdd955d68-f2g47\" (UID: \"14dc1b3f-026b-4412-831d-49a1eaa9b470\") " pod="openshift-image-registry/image-registry-6cdd955d68-f2g47"
Apr 23 14:57:28.362375 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:28.361922 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 14:57:28.362375 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:28.361938 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6cdd955d68-f2g47: secret "image-registry-tls" not found
Apr 23 14:57:28.362375 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:28.361995 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14dc1b3f-026b-4412-831d-49a1eaa9b470-registry-tls podName:14dc1b3f-026b-4412-831d-49a1eaa9b470 nodeName:}" failed. No retries permitted until 2026-04-23 14:57:32.361975422 +0000 UTC m=+40.155328571 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/14dc1b3f-026b-4412-831d-49a1eaa9b470-registry-tls") pod "image-registry-6cdd955d68-f2g47" (UID: "14dc1b3f-026b-4412-831d-49a1eaa9b470") : secret "image-registry-tls" not found
Apr 23 14:57:28.462486 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:28.462439 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0b89e5a1-6854-4d53-971b-35ff7c61be50-cert\") pod \"ingress-canary-fjp4b\" (UID: \"0b89e5a1-6854-4d53-971b-35ff7c61be50\") " pod="openshift-ingress-canary/ingress-canary-fjp4b"
Apr 23 14:57:28.462646 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:28.462500 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8-metrics-tls\") pod \"dns-default-stmr6\" (UID: \"fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8\") " pod="openshift-dns/dns-default-stmr6"
Apr 23 14:57:28.462702 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:28.462684 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 14:57:28.462766 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:28.462744 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8-metrics-tls podName:fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8 nodeName:}" failed. No retries permitted until 2026-04-23 14:57:32.462725432 +0000 UTC m=+40.256078576 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8-metrics-tls") pod "dns-default-stmr6" (UID: "fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8") : secret "dns-default-metrics-tls" not found
Apr 23 14:57:28.462827 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:28.462803 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 14:57:28.462878 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:28.462835 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b89e5a1-6854-4d53-971b-35ff7c61be50-cert podName:0b89e5a1-6854-4d53-971b-35ff7c61be50 nodeName:}" failed. No retries permitted until 2026-04-23 14:57:32.462825269 +0000 UTC m=+40.256178420 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0b89e5a1-6854-4d53-971b-35ff7c61be50-cert") pod "ingress-canary-fjp4b" (UID: "0b89e5a1-6854-4d53-971b-35ff7c61be50") : secret "canary-serving-cert" not found
Apr 23 14:57:28.766048 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:28.765973 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4ffefc2c-3199-4193-9958-9394681000af-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bdftp\" (UID: \"4ffefc2c-3199-4193-9958-9394681000af\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bdftp"
Apr 23 14:57:28.766401 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:28.766155 2569 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 23 14:57:28.766401 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:28.766287 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ffefc2c-3199-4193-9958-9394681000af-networking-console-plugin-cert podName:4ffefc2c-3199-4193-9958-9394681000af nodeName:}" failed. No retries permitted until 2026-04-23 14:57:30.766266423 +0000 UTC m=+38.559619586 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/4ffefc2c-3199-4193-9958-9394681000af-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bdftp" (UID: "4ffefc2c-3199-4193-9958-9394681000af") : secret "networking-console-plugin-cert" not found
Apr 23 14:57:28.885927 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:28.885873 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-6w94x"]
Apr 23 14:57:28.933373 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:28.933337 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-6w94x" event={"ID":"56883bac-9d1b-41b2-a97d-66c4b6485777","Type":"ContainerStarted","Data":"34967f1d54776cc15c91a9c014cdf21e6b2ba67bfa8de30071fd16b1bf7908ad"}
Apr 23 14:57:29.938805 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:29.938452 2569 generic.go:358] "Generic (PLEG): container finished" podID="25a96c44-032a-41ac-8ed7-c051e3666b8c" containerID="ed9f18a9ff38b71bc0d395d80d2a8aa40b11a2eeddeddc0f8c22681076c8ae80" exitCode=0
Apr 23 14:57:29.938805 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:29.938757 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jnxqk" event={"ID":"25a96c44-032a-41ac-8ed7-c051e3666b8c","Type":"ContainerDied","Data":"ed9f18a9ff38b71bc0d395d80d2a8aa40b11a2eeddeddc0f8c22681076c8ae80"}
Apr 23 14:57:29.940613 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:29.940507 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g2wkp" event={"ID":"b6fcac8f-1e41-4ff0-8acb-e3115431f3e9","Type":"ContainerStarted","Data":"dd927d18dce0852965c4a9aa2f8274d3273bb654678a866bd2361e4a5cee73a5"}
Apr 23 14:57:30.787014 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:30.786981 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4ffefc2c-3199-4193-9958-9394681000af-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bdftp\" (UID: \"4ffefc2c-3199-4193-9958-9394681000af\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bdftp"
Apr 23 14:57:30.787206 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:30.787152 2569 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 23 14:57:30.787275 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:30.787216 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ffefc2c-3199-4193-9958-9394681000af-networking-console-plugin-cert podName:4ffefc2c-3199-4193-9958-9394681000af nodeName:}" failed. No retries permitted until 2026-04-23 14:57:34.787199822 +0000 UTC m=+42.580552970 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/4ffefc2c-3199-4193-9958-9394681000af-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bdftp" (UID: "4ffefc2c-3199-4193-9958-9394681000af") : secret "networking-console-plugin-cert" not found
Apr 23 14:57:32.404058 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:32.404018 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14dc1b3f-026b-4412-831d-49a1eaa9b470-registry-tls\") pod \"image-registry-6cdd955d68-f2g47\" (UID: \"14dc1b3f-026b-4412-831d-49a1eaa9b470\") " pod="openshift-image-registry/image-registry-6cdd955d68-f2g47"
Apr 23 14:57:32.404421 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:32.404169 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 14:57:32.404421 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:32.404191 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6cdd955d68-f2g47: secret "image-registry-tls" not found
Apr 23 14:57:32.404421 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:32.404258 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/14dc1b3f-026b-4412-831d-49a1eaa9b470-registry-tls podName:14dc1b3f-026b-4412-831d-49a1eaa9b470 nodeName:}" failed. No retries permitted until 2026-04-23 14:57:40.404241694 +0000 UTC m=+48.197594837 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/14dc1b3f-026b-4412-831d-49a1eaa9b470-registry-tls") pod "image-registry-6cdd955d68-f2g47" (UID: "14dc1b3f-026b-4412-831d-49a1eaa9b470") : secret "image-registry-tls" not found
Apr 23 14:57:32.504942 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:32.504904 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0b89e5a1-6854-4d53-971b-35ff7c61be50-cert\") pod \"ingress-canary-fjp4b\" (UID: \"0b89e5a1-6854-4d53-971b-35ff7c61be50\") " pod="openshift-ingress-canary/ingress-canary-fjp4b"
Apr 23 14:57:32.505124 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:32.504957 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8-metrics-tls\") pod \"dns-default-stmr6\" (UID: \"fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8\") " pod="openshift-dns/dns-default-stmr6"
Apr 23 14:57:32.505124 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:32.505067 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 14:57:32.505124 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:32.505082 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 14:57:32.505245 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:32.505157 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b89e5a1-6854-4d53-971b-35ff7c61be50-cert podName:0b89e5a1-6854-4d53-971b-35ff7c61be50 nodeName:}" failed. No retries permitted until 2026-04-23 14:57:40.505134294 +0000 UTC m=+48.298487461 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0b89e5a1-6854-4d53-971b-35ff7c61be50-cert") pod "ingress-canary-fjp4b" (UID: "0b89e5a1-6854-4d53-971b-35ff7c61be50") : secret "canary-serving-cert" not found
Apr 23 14:57:32.505245 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:32.505179 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8-metrics-tls podName:fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8 nodeName:}" failed. No retries permitted until 2026-04-23 14:57:40.505169423 +0000 UTC m=+48.298522567 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8-metrics-tls") pod "dns-default-stmr6" (UID: "fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8") : secret "dns-default-metrics-tls" not found
Apr 23 14:57:32.795938 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:32.795879 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g2wkp" podStartSLOduration=32.179366445 podStartE2EDuration="35.795861947s" podCreationTimestamp="2026-04-23 14:56:57 +0000 UTC" firstStartedPulling="2026-04-23 14:57:25.131312926 +0000 UTC m=+32.924666082" lastFinishedPulling="2026-04-23 14:57:28.74780842 +0000 UTC m=+36.541161584" observedRunningTime="2026-04-23 14:57:29.994653647 +0000 UTC m=+37.788006828" watchObservedRunningTime="2026-04-23 14:57:32.795861947 +0000 UTC m=+40.589215113"
Apr 23 14:57:33.717521 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:33.717489 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-xl547"]
Apr 23 14:57:33.749933 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:33.749903 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-xl547"]
Apr 23 14:57:33.750090 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:33.750019 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-xl547"
Apr 23 14:57:33.753010 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:33.752980 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 23 14:57:33.753172 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:33.753073 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 23 14:57:33.753236 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:33.753178 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 23 14:57:33.757636 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:33.754232 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 23 14:57:33.757636 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:33.754320 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-mp7ss\""
Apr 23 14:57:33.816649 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:33.816613 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8c0d3f6a-dc3f-468c-b4e4-20234fca5855-signing-key\") pod \"service-ca-865cb79987-xl547\" (UID: \"8c0d3f6a-dc3f-468c-b4e4-20234fca5855\") " pod="openshift-service-ca/service-ca-865cb79987-xl547"
Apr 23 14:57:33.816783 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:33.816681 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8c0d3f6a-dc3f-468c-b4e4-20234fca5855-signing-cabundle\") pod \"service-ca-865cb79987-xl547\" (UID: \"8c0d3f6a-dc3f-468c-b4e4-20234fca5855\") " pod="openshift-service-ca/service-ca-865cb79987-xl547"
Apr 23 14:57:33.816783 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:33.816707 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktg4w\" (UniqueName: \"kubernetes.io/projected/8c0d3f6a-dc3f-468c-b4e4-20234fca5855-kube-api-access-ktg4w\") pod \"service-ca-865cb79987-xl547\" (UID: \"8c0d3f6a-dc3f-468c-b4e4-20234fca5855\") " pod="openshift-service-ca/service-ca-865cb79987-xl547"
Apr 23 14:57:33.916974 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:33.916953 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8c0d3f6a-dc3f-468c-b4e4-20234fca5855-signing-key\") pod \"service-ca-865cb79987-xl547\" (UID: \"8c0d3f6a-dc3f-468c-b4e4-20234fca5855\") " pod="openshift-service-ca/service-ca-865cb79987-xl547"
Apr 23 14:57:33.917089 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:33.917011 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8c0d3f6a-dc3f-468c-b4e4-20234fca5855-signing-cabundle\") pod \"service-ca-865cb79987-xl547\" (UID: \"8c0d3f6a-dc3f-468c-b4e4-20234fca5855\") " pod="openshift-service-ca/service-ca-865cb79987-xl547"
Apr 23 14:57:33.917170 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:33.917143 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ktg4w\" (UniqueName: \"kubernetes.io/projected/8c0d3f6a-dc3f-468c-b4e4-20234fca5855-kube-api-access-ktg4w\") pod \"service-ca-865cb79987-xl547\" (UID: \"8c0d3f6a-dc3f-468c-b4e4-20234fca5855\") " pod="openshift-service-ca/service-ca-865cb79987-xl547"
Apr 23 14:57:33.917630 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:33.917613 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8c0d3f6a-dc3f-468c-b4e4-20234fca5855-signing-cabundle\") pod \"service-ca-865cb79987-xl547\" (UID: \"8c0d3f6a-dc3f-468c-b4e4-20234fca5855\") " pod="openshift-service-ca/service-ca-865cb79987-xl547"
Apr 23 14:57:33.920405 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:33.920376 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8c0d3f6a-dc3f-468c-b4e4-20234fca5855-signing-key\") pod \"service-ca-865cb79987-xl547\" (UID: \"8c0d3f6a-dc3f-468c-b4e4-20234fca5855\") " pod="openshift-service-ca/service-ca-865cb79987-xl547"
Apr 23 14:57:33.925263 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:33.925241 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktg4w\" (UniqueName: \"kubernetes.io/projected/8c0d3f6a-dc3f-468c-b4e4-20234fca5855-kube-api-access-ktg4w\") pod \"service-ca-865cb79987-xl547\" (UID: \"8c0d3f6a-dc3f-468c-b4e4-20234fca5855\") " pod="openshift-service-ca/service-ca-865cb79987-xl547"
Apr 23 14:57:34.061717 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:34.061688 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-xl547"
Apr 23 14:57:34.372067 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:34.372038 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-xl547"]
Apr 23 14:57:34.384618 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:57:34.384560 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c0d3f6a_dc3f_468c_b4e4_20234fca5855.slice/crio-163c4a9dc25b0b93bad092ae85f98e700ece85953e685d975f0509632f55c22e WatchSource:0}: Error finding container 163c4a9dc25b0b93bad092ae85f98e700ece85953e685d975f0509632f55c22e: Status 404 returned error can't find the container with id 163c4a9dc25b0b93bad092ae85f98e700ece85953e685d975f0509632f55c22e
Apr 23 14:57:34.825607 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:34.825568 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4ffefc2c-3199-4193-9958-9394681000af-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bdftp\" (UID: \"4ffefc2c-3199-4193-9958-9394681000af\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bdftp"
Apr 23 14:57:34.826090 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:34.825733 2569 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 23 14:57:34.826090 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:34.825791 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ffefc2c-3199-4193-9958-9394681000af-networking-console-plugin-cert podName:4ffefc2c-3199-4193-9958-9394681000af nodeName:}" failed. No retries permitted until 2026-04-23 14:57:42.82577608 +0000 UTC m=+50.619129246 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/4ffefc2c-3199-4193-9958-9394681000af-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bdftp" (UID: "4ffefc2c-3199-4193-9958-9394681000af") : secret "networking-console-plugin-cert" not found
Apr 23 14:57:34.953676 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:34.953639 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b7nsn" event={"ID":"5ee82d4e-8f5a-4e80-bed4-26f905b2deef","Type":"ContainerStarted","Data":"a7c6c5fa822a34c6ce62a5dfe4eeb27b16c55c6ab74c40bbbf4c48235f69bd7d"}
Apr 23 14:57:34.955077 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:34.955055 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94tsp_8b31c555-d556-43a6-ab35-99fdd111e5a5/console-operator/0.log"
Apr 23 14:57:34.955236 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:34.955093 2569 generic.go:358] "Generic (PLEG): container finished" podID="8b31c555-d556-43a6-ab35-99fdd111e5a5" containerID="0522a66f988b33eef25615af256d7d53d542313b96111f16da6b050f34d3dfb8" exitCode=255
Apr 23 14:57:34.955236 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:34.955185 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-94tsp" event={"ID":"8b31c555-d556-43a6-ab35-99fdd111e5a5","Type":"ContainerDied","Data":"0522a66f988b33eef25615af256d7d53d542313b96111f16da6b050f34d3dfb8"}
Apr 23 14:57:34.955364 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:34.955348 2569 scope.go:117] "RemoveContainer" containerID="0522a66f988b33eef25615af256d7d53d542313b96111f16da6b050f34d3dfb8"
Apr 23 14:57:34.956554 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:34.956531 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-xl547" event={"ID":"8c0d3f6a-dc3f-468c-b4e4-20234fca5855","Type":"ContainerStarted","Data":"7e5bc280259bc61a58db827e4a47e86581794eff0a18e89cc0c16d365aaf6e43"}
Apr 23 14:57:34.956656 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:34.956560 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-xl547" event={"ID":"8c0d3f6a-dc3f-468c-b4e4-20234fca5855","Type":"ContainerStarted","Data":"163c4a9dc25b0b93bad092ae85f98e700ece85953e685d975f0509632f55c22e"}
Apr 23 14:57:34.964173 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:34.963809 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-94tsp"
Apr 23 14:57:34.964278 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:34.964185 2569 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-94tsp"
Apr 23 14:57:34.965120 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:34.964988 2569 generic.go:358] "Generic (PLEG): container finished" podID="25a96c44-032a-41ac-8ed7-c051e3666b8c" containerID="d4245f493b280652aee303c1549b4f5703ad5ac97c3c273c514d2289f504a1cc" exitCode=0
Apr 23 14:57:34.965120 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:34.965057 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jnxqk" event={"ID":"25a96c44-032a-41ac-8ed7-c051e3666b8c","Type":"ContainerDied","Data":"d4245f493b280652aee303c1549b4f5703ad5ac97c3c273c514d2289f504a1cc"}
Apr 23 14:57:34.966488 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:34.966466 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-whs7j" event={"ID":"dce2b511-080e-42a7-a345-5e616c565b84","Type":"ContainerStarted","Data":"48a290bf92a214f09c528b185b3dc03cfbf8d025fd7f2bbde7dcb3b039544945"}
Apr 23 14:57:34.968807 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:34.968778 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4mhjz" event={"ID":"23156db7-0193-437f-a4c9-bc8b886b91e9","Type":"ContainerStarted","Data":"3f8ac88f86010f2f68a86e1ba0272bee92e3ebb41ce3d5451be907da4bc793a7"}
Apr 23 14:57:34.970084 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:34.970067 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-6w94x" event={"ID":"56883bac-9d1b-41b2-a97d-66c4b6485777","Type":"ContainerStarted","Data":"33ea6a25062046e7c74a3510eb9b48f1207cf1a9fb6edffd01888a64174e6321"}
Apr 23 14:57:34.970227 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:34.970211 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-6w94x"
Apr 23 14:57:34.972514 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:34.971955 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b7nsn" podStartSLOduration=29.21474274 podStartE2EDuration="37.971941981s" podCreationTimestamp="2026-04-23 14:56:57 +0000 UTC" firstStartedPulling="2026-04-23 14:57:25.141870307 +0000 UTC m=+32.935223454" lastFinishedPulling="2026-04-23 14:57:33.899069538 +0000 UTC m=+41.692422695" observedRunningTime="2026-04-23 14:57:34.970978745 +0000 UTC m=+42.764331903" watchObservedRunningTime="2026-04-23 14:57:34.971941981 +0000 UTC m=+42.765295147"
Apr 23 14:57:35.003931 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:35.003881 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-whs7j" podStartSLOduration=9.001158348 podStartE2EDuration="18.003866685s" podCreationTimestamp="2026-04-23 14:57:17 +0000 UTC" firstStartedPulling="2026-04-23 14:57:25.247151134 +0000 UTC m=+33.040504282" lastFinishedPulling="2026-04-23 14:57:34.249859471 +0000 UTC m=+42.043212619" observedRunningTime="2026-04-23 14:57:34.987875734 +0000 UTC m=+42.781228900" watchObservedRunningTime="2026-04-23 14:57:35.003866685 +0000 UTC m=+42.797219851"
Apr 23 14:57:35.004078 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:35.004047 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4mhjz" podStartSLOduration=26.282509009 podStartE2EDuration="35.004040996s" podCreationTimestamp="2026-04-23 14:57:00 +0000 UTC" firstStartedPulling="2026-04-23 14:57:25.187202125 +0000 UTC m=+32.980555269" lastFinishedPulling="2026-04-23 14:57:33.908734101 +0000 UTC m=+41.702087256" observedRunningTime="2026-04-23 14:57:35.002634974 +0000 UTC m=+42.795988142" watchObservedRunningTime="2026-04-23 14:57:35.004040996 +0000 UTC m=+42.797394160"
Apr 23 14:57:35.047484 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:35.047433 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-6w94x" podStartSLOduration=36.539050072 podStartE2EDuration="42.047415257s" podCreationTimestamp="2026-04-23 14:56:53 +0000 UTC" firstStartedPulling="2026-04-23 14:57:28.898914165 +0000 UTC m=+36.692267320" lastFinishedPulling="2026-04-23 14:57:34.407279347 +0000 UTC m=+42.200632505" observedRunningTime="2026-04-23 14:57:35.046359133 +0000 UTC m=+42.839712303" watchObservedRunningTime="2026-04-23 14:57:35.047415257 +0000 UTC m=+42.840768423"
Apr 23 14:57:35.119576 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:35.119150 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-xl547" podStartSLOduration=2.119130555 podStartE2EDuration="2.119130555s" podCreationTimestamp="2026-04-23 14:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01
00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:57:35.117691434 +0000 UTC m=+42.911044600" watchObservedRunningTime="2026-04-23 14:57:35.119130555 +0000 UTC m=+42.912483718" Apr 23 14:57:35.975679 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:35.975650 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94tsp_8b31c555-d556-43a6-ab35-99fdd111e5a5/console-operator/1.log" Apr 23 14:57:35.976140 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:35.976126 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94tsp_8b31c555-d556-43a6-ab35-99fdd111e5a5/console-operator/0.log" Apr 23 14:57:35.976205 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:35.976165 2569 generic.go:358] "Generic (PLEG): container finished" podID="8b31c555-d556-43a6-ab35-99fdd111e5a5" containerID="1cff9993367c83e0d29d476453d8bcb81ed48004de289e313856087fd9f249e8" exitCode=255 Apr 23 14:57:35.976257 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:35.976230 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-94tsp" event={"ID":"8b31c555-d556-43a6-ab35-99fdd111e5a5","Type":"ContainerDied","Data":"1cff9993367c83e0d29d476453d8bcb81ed48004de289e313856087fd9f249e8"} Apr 23 14:57:35.976308 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:35.976283 2569 scope.go:117] "RemoveContainer" containerID="0522a66f988b33eef25615af256d7d53d542313b96111f16da6b050f34d3dfb8" Apr 23 14:57:35.976528 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:35.976514 2569 scope.go:117] "RemoveContainer" containerID="1cff9993367c83e0d29d476453d8bcb81ed48004de289e313856087fd9f249e8" Apr 23 14:57:35.976808 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:35.976742 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=console-operator pod=console-operator-9d4b6777b-94tsp_openshift-console-operator(8b31c555-d556-43a6-ab35-99fdd111e5a5)\"" pod="openshift-console-operator/console-operator-9d4b6777b-94tsp" podUID="8b31c555-d556-43a6-ab35-99fdd111e5a5" Apr 23 14:57:35.980078 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:35.980043 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jnxqk" event={"ID":"25a96c44-032a-41ac-8ed7-c051e3666b8c","Type":"ContainerStarted","Data":"97bc6c4c0a6f7b7335ee102418bd85f25eb1705ab961a2269ce0ddf69e45376a"} Apr 23 14:57:36.036292 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:36.036232 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-jnxqk" podStartSLOduration=9.622085746 podStartE2EDuration="43.03621661s" podCreationTimestamp="2026-04-23 14:56:53 +0000 UTC" firstStartedPulling="2026-04-23 14:56:55.334883322 +0000 UTC m=+3.128236468" lastFinishedPulling="2026-04-23 14:57:28.749014189 +0000 UTC m=+36.542367332" observedRunningTime="2026-04-23 14:57:36.034065287 +0000 UTC m=+43.827418452" watchObservedRunningTime="2026-04-23 14:57:36.03621661 +0000 UTC m=+43.829569780" Apr 23 14:57:36.983982 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:36.983953 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94tsp_8b31c555-d556-43a6-ab35-99fdd111e5a5/console-operator/1.log" Apr 23 14:57:36.984432 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:36.984291 2569 scope.go:117] "RemoveContainer" containerID="1cff9993367c83e0d29d476453d8bcb81ed48004de289e313856087fd9f249e8" Apr 23 14:57:36.984472 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:36.984451 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator 
pod=console-operator-9d4b6777b-94tsp_openshift-console-operator(8b31c555-d556-43a6-ab35-99fdd111e5a5)\"" pod="openshift-console-operator/console-operator-9d4b6777b-94tsp" podUID="8b31c555-d556-43a6-ab35-99fdd111e5a5" Apr 23 14:57:40.472324 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:40.472281 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14dc1b3f-026b-4412-831d-49a1eaa9b470-registry-tls\") pod \"image-registry-6cdd955d68-f2g47\" (UID: \"14dc1b3f-026b-4412-831d-49a1eaa9b470\") " pod="openshift-image-registry/image-registry-6cdd955d68-f2g47" Apr 23 14:57:40.474811 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:40.474781 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14dc1b3f-026b-4412-831d-49a1eaa9b470-registry-tls\") pod \"image-registry-6cdd955d68-f2g47\" (UID: \"14dc1b3f-026b-4412-831d-49a1eaa9b470\") " pod="openshift-image-registry/image-registry-6cdd955d68-f2g47" Apr 23 14:57:40.511665 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:40.511630 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6cdd955d68-f2g47" Apr 23 14:57:40.573204 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:40.573121 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0b89e5a1-6854-4d53-971b-35ff7c61be50-cert\") pod \"ingress-canary-fjp4b\" (UID: \"0b89e5a1-6854-4d53-971b-35ff7c61be50\") " pod="openshift-ingress-canary/ingress-canary-fjp4b" Apr 23 14:57:40.573204 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:40.573185 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8-metrics-tls\") pod \"dns-default-stmr6\" (UID: \"fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8\") " pod="openshift-dns/dns-default-stmr6" Apr 23 14:57:40.576029 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:40.575971 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8-metrics-tls\") pod \"dns-default-stmr6\" (UID: \"fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8\") " pod="openshift-dns/dns-default-stmr6" Apr 23 14:57:40.578316 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:40.577933 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-stmr6" Apr 23 14:57:40.578441 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:40.578314 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0b89e5a1-6854-4d53-971b-35ff7c61be50-cert\") pod \"ingress-canary-fjp4b\" (UID: \"0b89e5a1-6854-4d53-971b-35ff7c61be50\") " pod="openshift-ingress-canary/ingress-canary-fjp4b" Apr 23 14:57:40.611191 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:40.611157 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fjp4b" Apr 23 14:57:40.648843 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:40.648777 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6cdd955d68-f2g47"] Apr 23 14:57:40.652948 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:57:40.652918 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14dc1b3f_026b_4412_831d_49a1eaa9b470.slice/crio-e98889a56edd66cfc11cf50823585555e02bc4e7052bbd73884a25b3caa1fb5f WatchSource:0}: Error finding container e98889a56edd66cfc11cf50823585555e02bc4e7052bbd73884a25b3caa1fb5f: Status 404 returned error can't find the container with id e98889a56edd66cfc11cf50823585555e02bc4e7052bbd73884a25b3caa1fb5f Apr 23 14:57:40.714573 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:40.714546 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-stmr6"] Apr 23 14:57:40.718090 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:57:40.718062 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb0ff8bf_abfa_447e_b6cb_c0cac33c7ea8.slice/crio-8550dfe50c201f5390a3c89e061643bd8583b25b37884858fab925ee8ca8ab4a WatchSource:0}: Error finding container 8550dfe50c201f5390a3c89e061643bd8583b25b37884858fab925ee8ca8ab4a: Status 404 returned error can't find the container with id 8550dfe50c201f5390a3c89e061643bd8583b25b37884858fab925ee8ca8ab4a Apr 23 14:57:40.749178 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:40.749151 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fjp4b"] Apr 23 14:57:40.752807 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:57:40.752782 2569 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b89e5a1_6854_4d53_971b_35ff7c61be50.slice/crio-d8a39edb9444630287a86d3082effe91233f7a843dcbd01cec125a5783f8a88e WatchSource:0}: Error finding container d8a39edb9444630287a86d3082effe91233f7a843dcbd01cec125a5783f8a88e: Status 404 returned error can't find the container with id d8a39edb9444630287a86d3082effe91233f7a843dcbd01cec125a5783f8a88e Apr 23 14:57:40.996418 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:40.996333 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6cdd955d68-f2g47" event={"ID":"14dc1b3f-026b-4412-831d-49a1eaa9b470","Type":"ContainerStarted","Data":"88f58c1823c57ad2390f7a296ac4f48139d6dcfeea07c1e85b23c5bc90dca78a"} Apr 23 14:57:40.996418 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:40.996368 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6cdd955d68-f2g47" event={"ID":"14dc1b3f-026b-4412-831d-49a1eaa9b470","Type":"ContainerStarted","Data":"e98889a56edd66cfc11cf50823585555e02bc4e7052bbd73884a25b3caa1fb5f"} Apr 23 14:57:40.996627 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:40.996420 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6cdd955d68-f2g47" Apr 23 14:57:40.997291 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:40.997270 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fjp4b" event={"ID":"0b89e5a1-6854-4d53-971b-35ff7c61be50","Type":"ContainerStarted","Data":"d8a39edb9444630287a86d3082effe91233f7a843dcbd01cec125a5783f8a88e"} Apr 23 14:57:40.998234 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:40.998214 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-stmr6" 
event={"ID":"fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8","Type":"ContainerStarted","Data":"8550dfe50c201f5390a3c89e061643bd8583b25b37884858fab925ee8ca8ab4a"} Apr 23 14:57:41.022153 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:41.022091 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6cdd955d68-f2g47" podStartSLOduration=48.022077545 podStartE2EDuration="48.022077545s" podCreationTimestamp="2026-04-23 14:56:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:57:41.021924927 +0000 UTC m=+48.815278127" watchObservedRunningTime="2026-04-23 14:57:41.022077545 +0000 UTC m=+48.815430709" Apr 23 14:57:42.891131 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:42.891079 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4ffefc2c-3199-4193-9958-9394681000af-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bdftp\" (UID: \"4ffefc2c-3199-4193-9958-9394681000af\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bdftp" Apr 23 14:57:42.891495 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:42.891228 2569 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 14:57:42.891495 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:42.891293 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ffefc2c-3199-4193-9958-9394681000af-networking-console-plugin-cert podName:4ffefc2c-3199-4193-9958-9394681000af nodeName:}" failed. No retries permitted until 2026-04-23 14:57:58.891278596 +0000 UTC m=+66.684631743 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/4ffefc2c-3199-4193-9958-9394681000af-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-bdftp" (UID: "4ffefc2c-3199-4193-9958-9394681000af") : secret "networking-console-plugin-cert" not found Apr 23 14:57:44.006559 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:44.006523 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fjp4b" event={"ID":"0b89e5a1-6854-4d53-971b-35ff7c61be50","Type":"ContainerStarted","Data":"a194d820a18da3bce2ce41fb2fc08553dad38305338c30bd0aaaceab5897c101"} Apr 23 14:57:44.008192 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:44.008169 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-stmr6" event={"ID":"fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8","Type":"ContainerStarted","Data":"7b77bfca497f54c59e4f0c3479ffe08fbadcdb04767ffdcc8a0bb7b905f666ed"} Apr 23 14:57:44.008192 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:44.008194 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-stmr6" event={"ID":"fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8","Type":"ContainerStarted","Data":"8b3722d3d55cbd15fa3cbf678e32174d3208fc95ac198c6f3770873f30c2c556"} Apr 23 14:57:44.008326 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:44.008306 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-stmr6" Apr 23 14:57:44.034815 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:44.034759 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-fjp4b" podStartSLOduration=17.508576493 podStartE2EDuration="20.034744984s" podCreationTimestamp="2026-04-23 14:57:24 +0000 UTC" firstStartedPulling="2026-04-23 14:57:40.75461608 +0000 UTC m=+48.547969227" lastFinishedPulling="2026-04-23 14:57:43.280784563 +0000 UTC 
m=+51.074137718" observedRunningTime="2026-04-23 14:57:44.034087535 +0000 UTC m=+51.827440702" watchObservedRunningTime="2026-04-23 14:57:44.034744984 +0000 UTC m=+51.828098151" Apr 23 14:57:44.052423 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:44.052383 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-stmr6" podStartSLOduration=17.496603742 podStartE2EDuration="20.052370039s" podCreationTimestamp="2026-04-23 14:57:24 +0000 UTC" firstStartedPulling="2026-04-23 14:57:40.720381075 +0000 UTC m=+48.513734218" lastFinishedPulling="2026-04-23 14:57:43.276147368 +0000 UTC m=+51.069500515" observedRunningTime="2026-04-23 14:57:44.051703302 +0000 UTC m=+51.845056481" watchObservedRunningTime="2026-04-23 14:57:44.052370039 +0000 UTC m=+51.845723204" Apr 23 14:57:44.964910 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:44.964864 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-94tsp" Apr 23 14:57:44.964910 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:44.964902 2569 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-94tsp" Apr 23 14:57:44.965315 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:44.965298 2569 scope.go:117] "RemoveContainer" containerID="1cff9993367c83e0d29d476453d8bcb81ed48004de289e313856087fd9f249e8" Apr 23 14:57:44.965485 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:57:44.965468 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-94tsp_openshift-console-operator(8b31c555-d556-43a6-ab35-99fdd111e5a5)\"" pod="openshift-console-operator/console-operator-9d4b6777b-94tsp" podUID="8b31c555-d556-43a6-ab35-99fdd111e5a5" Apr 23 14:57:50.905998 ip-10-0-143-199 
kubenswrapper[2569]: I0423 14:57:50.905972 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wgqdj" Apr 23 14:57:54.012590 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.012558 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-stmr6" Apr 23 14:57:54.858952 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.858917 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9cf77885-8q5d2"] Apr 23 14:57:54.901136 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.901087 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7885h"] Apr 23 14:57:54.901290 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.901227 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9cf77885-8q5d2" Apr 23 14:57:54.904938 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.904911 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 23 14:57:54.905094 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.904991 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 23 14:57:54.906524 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.906506 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-lh55q\"" Apr 23 14:57:54.909346 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.909329 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 23 14:57:54.910982 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.910964 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 23 14:57:54.919061 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.919036 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76c98fcdbd-xqd7r"] Apr 23 14:57:54.919160 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.919086 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7885h" Apr 23 14:57:54.921681 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.921613 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 23 14:57:54.921681 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.921630 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-txt5r\"" Apr 23 14:57:54.921853 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.921810 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 23 14:57:54.936265 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.936245 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-84865c7d88-mzj5h"] Apr 23 14:57:54.936368 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.936358 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76c98fcdbd-xqd7r" Apr 23 14:57:54.939306 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.939276 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 23 14:57:54.939426 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.939408 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 23 14:57:54.939669 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.939651 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 23 14:57:54.939743 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.939702 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 23 14:57:54.960093 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.960066 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7th2"] Apr 23 14:57:54.960236 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.960199 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84865c7d88-mzj5h" Apr 23 14:57:54.962878 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.962855 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 23 14:57:54.978698 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.978676 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9cf77885-8q5d2"] Apr 23 14:57:54.978698 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.978702 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7885h"] Apr 23 14:57:54.978831 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.978712 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-84865c7d88-mzj5h"] Apr 23 14:57:54.978831 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.978719 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76c98fcdbd-xqd7r"] Apr 23 14:57:54.978831 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.978730 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7th2"] Apr 23 14:57:54.978831 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.978746 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-vfgk4"] Apr 23 14:57:54.978831 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.978817 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7th2" Apr 23 14:57:54.985616 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.985594 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 23 14:57:54.985944 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.985929 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-z6csn\"" Apr 23 14:57:54.986025 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.985935 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 23 14:57:54.990750 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.990728 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqvkj\" (UniqueName: \"kubernetes.io/projected/9bd29e85-7dd4-452d-b9c1-1fa81de42b0e-kube-api-access-sqvkj\") pod \"cluster-proxy-proxy-agent-76c98fcdbd-xqd7r\" (UID: \"9bd29e85-7dd4-452d-b9c1-1fa81de42b0e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76c98fcdbd-xqd7r" Apr 23 14:57:54.990844 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.990765 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/9bd29e85-7dd4-452d-b9c1-1fa81de42b0e-hub\") pod \"cluster-proxy-proxy-agent-76c98fcdbd-xqd7r\" (UID: \"9bd29e85-7dd4-452d-b9c1-1fa81de42b0e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76c98fcdbd-xqd7r" Apr 23 14:57:54.990844 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.990798 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: 
\"kubernetes.io/secret/9bd29e85-7dd4-452d-b9c1-1fa81de42b0e-ca\") pod \"cluster-proxy-proxy-agent-76c98fcdbd-xqd7r\" (UID: \"9bd29e85-7dd4-452d-b9c1-1fa81de42b0e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76c98fcdbd-xqd7r" Apr 23 14:57:54.990929 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.990843 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/9bd29e85-7dd4-452d-b9c1-1fa81de42b0e-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-76c98fcdbd-xqd7r\" (UID: \"9bd29e85-7dd4-452d-b9c1-1fa81de42b0e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76c98fcdbd-xqd7r" Apr 23 14:57:54.990929 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.990878 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/9bd29e85-7dd4-452d-b9c1-1fa81de42b0e-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-76c98fcdbd-xqd7r\" (UID: \"9bd29e85-7dd4-452d-b9c1-1fa81de42b0e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76c98fcdbd-xqd7r" Apr 23 14:57:54.990929 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.990898 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzwl4\" (UniqueName: \"kubernetes.io/projected/f2978812-2e84-4b8a-8e49-dc1cbff4e316-kube-api-access-tzwl4\") pod \"managed-serviceaccount-addon-agent-9cf77885-8q5d2\" (UID: \"f2978812-2e84-4b8a-8e49-dc1cbff4e316\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9cf77885-8q5d2" Apr 23 14:57:54.990929 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.990914 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/9bd29e85-7dd4-452d-b9c1-1fa81de42b0e-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-76c98fcdbd-xqd7r\" (UID: \"9bd29e85-7dd4-452d-b9c1-1fa81de42b0e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76c98fcdbd-xqd7r" Apr 23 14:57:54.991063 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.990935 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f2978812-2e84-4b8a-8e49-dc1cbff4e316-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-9cf77885-8q5d2\" (UID: \"f2978812-2e84-4b8a-8e49-dc1cbff4e316\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9cf77885-8q5d2" Apr 23 14:57:54.991063 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.990951 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d4cj\" (UniqueName: \"kubernetes.io/projected/01df512f-cd3b-4a38-8f33-2a3002d931ad-kube-api-access-7d4cj\") pod \"volume-data-source-validator-7c6cbb6c87-7885h\" (UID: \"01df512f-cd3b-4a38-8f33-2a3002d931ad\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7885h" Apr 23 14:57:54.991797 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:54.991781 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 23 14:57:55.011625 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.011603 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-xxhr9"] Apr 23 14:57:55.011772 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.011750 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vfgk4" Apr 23 14:57:55.021388 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.021369 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 23 14:57:55.021873 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.021855 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-45svp\"" Apr 23 14:57:55.022183 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.022161 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 23 14:57:55.022406 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.022381 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 23 14:57:55.023588 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.023574 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 23 14:57:55.026413 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.026397 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7b45597d46-j8vck"] Apr 23 14:57:55.026560 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.026542 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-xxhr9" Apr 23 14:57:55.038513 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.038491 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-vfgk4"] Apr 23 14:57:55.038513 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.038510 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-xxhr9"] Apr 23 14:57:55.038639 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.038519 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7b45597d46-j8vck"] Apr 23 14:57:55.038639 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.038626 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7b45597d46-j8vck" Apr 23 14:57:55.040798 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.040780 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 23 14:57:55.040982 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.040962 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 14:57:55.041075 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.040967 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 14:57:55.041075 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.041047 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 23 14:57:55.041219 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.041080 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-z79td\"" Apr 23 14:57:55.041508 
ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.041491 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-sh6mh\"" Apr 23 14:57:55.042060 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.042040 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 23 14:57:55.042176 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.042040 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 23 14:57:55.042176 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.042140 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 23 14:57:55.042613 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.042551 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 23 14:57:55.042613 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.042553 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 23 14:57:55.042763 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.042619 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 23 14:57:55.071116 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.071081 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 23 14:57:55.080558 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.080536 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-qgmst"] Apr 23 14:57:55.091863 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.091837 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-sqvkj\" (UniqueName: \"kubernetes.io/projected/9bd29e85-7dd4-452d-b9c1-1fa81de42b0e-kube-api-access-sqvkj\") pod \"cluster-proxy-proxy-agent-76c98fcdbd-xqd7r\" (UID: \"9bd29e85-7dd4-452d-b9c1-1fa81de42b0e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76c98fcdbd-xqd7r" Apr 23 14:57:55.091973 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.091868 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swlfq\" (UniqueName: \"kubernetes.io/projected/74963a76-1095-4633-aaed-e3687b7d7235-kube-api-access-swlfq\") pod \"router-default-7b45597d46-j8vck\" (UID: \"74963a76-1095-4633-aaed-e3687b7d7235\") " pod="openshift-ingress/router-default-7b45597d46-j8vck" Apr 23 14:57:55.091973 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.091895 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/9bd29e85-7dd4-452d-b9c1-1fa81de42b0e-hub\") pod \"cluster-proxy-proxy-agent-76c98fcdbd-xqd7r\" (UID: \"9bd29e85-7dd4-452d-b9c1-1fa81de42b0e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76c98fcdbd-xqd7r" Apr 23 14:57:55.091973 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.091918 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1ab27f30-a65a-421c-a94f-8664b3f00bbc-tmp\") pod \"insights-operator-585dfdc468-xxhr9\" (UID: \"1ab27f30-a65a-421c-a94f-8664b3f00bbc\") " pod="openshift-insights/insights-operator-585dfdc468-xxhr9" Apr 23 14:57:55.091973 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.091944 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ab27f30-a65a-421c-a94f-8664b3f00bbc-service-ca-bundle\") pod 
\"insights-operator-585dfdc468-xxhr9\" (UID: \"1ab27f30-a65a-421c-a94f-8664b3f00bbc\") " pod="openshift-insights/insights-operator-585dfdc468-xxhr9" Apr 23 14:57:55.092168 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.092061 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/c323d479-2889-4b77-9580-23359b689f90-klusterlet-config\") pod \"klusterlet-addon-workmgr-84865c7d88-mzj5h\" (UID: \"c323d479-2889-4b77-9580-23359b689f90\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84865c7d88-mzj5h" Apr 23 14:57:55.092168 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.092140 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2psz\" (UniqueName: \"kubernetes.io/projected/c323d479-2889-4b77-9580-23359b689f90-kube-api-access-b2psz\") pod \"klusterlet-addon-workmgr-84865c7d88-mzj5h\" (UID: \"c323d479-2889-4b77-9580-23359b689f90\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84865c7d88-mzj5h" Apr 23 14:57:55.092246 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.092183 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d47dcf67-0b9f-4953-b0ed-f4bbd07274f3-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-k7th2\" (UID: \"d47dcf67-0b9f-4953-b0ed-f4bbd07274f3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7th2" Apr 23 14:57:55.092246 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.092202 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74963a76-1095-4633-aaed-e3687b7d7235-service-ca-bundle\") pod \"router-default-7b45597d46-j8vck\" (UID: \"74963a76-1095-4633-aaed-e3687b7d7235\") 
" pod="openshift-ingress/router-default-7b45597d46-j8vck" Apr 23 14:57:55.092246 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.092224 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/9bd29e85-7dd4-452d-b9c1-1fa81de42b0e-ca\") pod \"cluster-proxy-proxy-agent-76c98fcdbd-xqd7r\" (UID: \"9bd29e85-7dd4-452d-b9c1-1fa81de42b0e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76c98fcdbd-xqd7r" Apr 23 14:57:55.092246 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.092244 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/9bd29e85-7dd4-452d-b9c1-1fa81de42b0e-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-76c98fcdbd-xqd7r\" (UID: \"9bd29e85-7dd4-452d-b9c1-1fa81de42b0e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76c98fcdbd-xqd7r" Apr 23 14:57:55.092406 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.092273 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/9bd29e85-7dd4-452d-b9c1-1fa81de42b0e-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-76c98fcdbd-xqd7r\" (UID: \"9bd29e85-7dd4-452d-b9c1-1fa81de42b0e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76c98fcdbd-xqd7r" Apr 23 14:57:55.092406 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.092301 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/74963a76-1095-4633-aaed-e3687b7d7235-metrics-certs\") pod \"router-default-7b45597d46-j8vck\" (UID: \"74963a76-1095-4633-aaed-e3687b7d7235\") " pod="openshift-ingress/router-default-7b45597d46-j8vck" Apr 23 14:57:55.092406 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.092325 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/1ab27f30-a65a-421c-a94f-8664b3f00bbc-snapshots\") pod \"insights-operator-585dfdc468-xxhr9\" (UID: \"1ab27f30-a65a-421c-a94f-8664b3f00bbc\") " pod="openshift-insights/insights-operator-585dfdc468-xxhr9" Apr 23 14:57:55.092406 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.092348 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ab27f30-a65a-421c-a94f-8664b3f00bbc-serving-cert\") pod \"insights-operator-585dfdc468-xxhr9\" (UID: \"1ab27f30-a65a-421c-a94f-8664b3f00bbc\") " pod="openshift-insights/insights-operator-585dfdc468-xxhr9" Apr 23 14:57:55.092406 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.092395 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzwl4\" (UniqueName: \"kubernetes.io/projected/f2978812-2e84-4b8a-8e49-dc1cbff4e316-kube-api-access-tzwl4\") pod \"managed-serviceaccount-addon-agent-9cf77885-8q5d2\" (UID: \"f2978812-2e84-4b8a-8e49-dc1cbff4e316\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9cf77885-8q5d2" Apr 23 14:57:55.092634 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.092460 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9bd29e85-7dd4-452d-b9c1-1fa81de42b0e-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-76c98fcdbd-xqd7r\" (UID: \"9bd29e85-7dd4-452d-b9c1-1fa81de42b0e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76c98fcdbd-xqd7r" Apr 23 14:57:55.092634 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.092494 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1ab27f30-a65a-421c-a94f-8664b3f00bbc-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-xxhr9\" (UID: \"1ab27f30-a65a-421c-a94f-8664b3f00bbc\") " pod="openshift-insights/insights-operator-585dfdc468-xxhr9" Apr 23 14:57:55.092634 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.092531 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/74963a76-1095-4633-aaed-e3687b7d7235-stats-auth\") pod \"router-default-7b45597d46-j8vck\" (UID: \"74963a76-1095-4633-aaed-e3687b7d7235\") " pod="openshift-ingress/router-default-7b45597d46-j8vck" Apr 23 14:57:55.092634 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.092567 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f2978812-2e84-4b8a-8e49-dc1cbff4e316-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-9cf77885-8q5d2\" (UID: \"f2978812-2e84-4b8a-8e49-dc1cbff4e316\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9cf77885-8q5d2" Apr 23 14:57:55.092634 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.092594 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7d4cj\" (UniqueName: \"kubernetes.io/projected/01df512f-cd3b-4a38-8f33-2a3002d931ad-kube-api-access-7d4cj\") pod \"volume-data-source-validator-7c6cbb6c87-7885h\" (UID: \"01df512f-cd3b-4a38-8f33-2a3002d931ad\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7885h" Apr 23 14:57:55.093005 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.092737 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-qgmst"] Apr 23 14:57:55.093005 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.092823 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cfa4e428-dfaf-4150-9c9d-cbb0efd324e8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vfgk4\" (UID: \"cfa4e428-dfaf-4150-9c9d-cbb0efd324e8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vfgk4" Apr 23 14:57:55.093005 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.092861 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cggsn\" (UniqueName: \"kubernetes.io/projected/cfa4e428-dfaf-4150-9c9d-cbb0efd324e8-kube-api-access-cggsn\") pod \"cluster-monitoring-operator-75587bd455-vfgk4\" (UID: \"cfa4e428-dfaf-4150-9c9d-cbb0efd324e8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vfgk4" Apr 23 14:57:55.093005 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.092915 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pb9f\" (UniqueName: \"kubernetes.io/projected/d47dcf67-0b9f-4953-b0ed-f4bbd07274f3-kube-api-access-5pb9f\") pod \"cluster-samples-operator-6dc5bdb6b4-k7th2\" (UID: \"d47dcf67-0b9f-4953-b0ed-f4bbd07274f3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7th2" Apr 23 14:57:55.093005 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.092932 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qgmst" Apr 23 14:57:55.093005 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.092947 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dzjt\" (UniqueName: \"kubernetes.io/projected/1ab27f30-a65a-421c-a94f-8664b3f00bbc-kube-api-access-8dzjt\") pod \"insights-operator-585dfdc468-xxhr9\" (UID: \"1ab27f30-a65a-421c-a94f-8664b3f00bbc\") " pod="openshift-insights/insights-operator-585dfdc468-xxhr9" Apr 23 14:57:55.093005 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.092972 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/cfa4e428-dfaf-4150-9c9d-cbb0efd324e8-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-vfgk4\" (UID: \"cfa4e428-dfaf-4150-9c9d-cbb0efd324e8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vfgk4" Apr 23 14:57:55.093005 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.092990 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/9bd29e85-7dd4-452d-b9c1-1fa81de42b0e-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-76c98fcdbd-xqd7r\" (UID: \"9bd29e85-7dd4-452d-b9c1-1fa81de42b0e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76c98fcdbd-xqd7r" Apr 23 14:57:55.093005 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.093006 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c323d479-2889-4b77-9580-23359b689f90-tmp\") pod \"klusterlet-addon-workmgr-84865c7d88-mzj5h\" (UID: \"c323d479-2889-4b77-9580-23359b689f90\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84865c7d88-mzj5h" Apr 23 14:57:55.093510 ip-10-0-143-199 
kubenswrapper[2569]: I0423 14:57:55.093058 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/74963a76-1095-4633-aaed-e3687b7d7235-default-certificate\") pod \"router-default-7b45597d46-j8vck\" (UID: \"74963a76-1095-4633-aaed-e3687b7d7235\") " pod="openshift-ingress/router-default-7b45597d46-j8vck" Apr 23 14:57:55.094854 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.094830 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/9bd29e85-7dd4-452d-b9c1-1fa81de42b0e-ca\") pod \"cluster-proxy-proxy-agent-76c98fcdbd-xqd7r\" (UID: \"9bd29e85-7dd4-452d-b9c1-1fa81de42b0e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76c98fcdbd-xqd7r" Apr 23 14:57:55.094965 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.094884 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/9bd29e85-7dd4-452d-b9c1-1fa81de42b0e-hub\") pod \"cluster-proxy-proxy-agent-76c98fcdbd-xqd7r\" (UID: \"9bd29e85-7dd4-452d-b9c1-1fa81de42b0e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76c98fcdbd-xqd7r" Apr 23 14:57:55.095390 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.095372 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f2978812-2e84-4b8a-8e49-dc1cbff4e316-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-9cf77885-8q5d2\" (UID: \"f2978812-2e84-4b8a-8e49-dc1cbff4e316\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9cf77885-8q5d2" Apr 23 14:57:55.095390 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.095374 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9bd29e85-7dd4-452d-b9c1-1fa81de42b0e-hub-kubeconfig\") pod 
\"cluster-proxy-proxy-agent-76c98fcdbd-xqd7r\" (UID: \"9bd29e85-7dd4-452d-b9c1-1fa81de42b0e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76c98fcdbd-xqd7r" Apr 23 14:57:55.096026 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.095896 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 23 14:57:55.096026 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.095908 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-n4ps5\"" Apr 23 14:57:55.096026 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.095931 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 23 14:57:55.096229 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.096057 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/9bd29e85-7dd4-452d-b9c1-1fa81de42b0e-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-76c98fcdbd-xqd7r\" (UID: \"9bd29e85-7dd4-452d-b9c1-1fa81de42b0e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76c98fcdbd-xqd7r" Apr 23 14:57:55.106748 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.106731 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzwl4\" (UniqueName: \"kubernetes.io/projected/f2978812-2e84-4b8a-8e49-dc1cbff4e316-kube-api-access-tzwl4\") pod \"managed-serviceaccount-addon-agent-9cf77885-8q5d2\" (UID: \"f2978812-2e84-4b8a-8e49-dc1cbff4e316\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9cf77885-8q5d2" Apr 23 14:57:55.119494 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.119468 2569 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-7d4cj\" (UniqueName: \"kubernetes.io/projected/01df512f-cd3b-4a38-8f33-2a3002d931ad-kube-api-access-7d4cj\") pod \"volume-data-source-validator-7c6cbb6c87-7885h\" (UID: \"01df512f-cd3b-4a38-8f33-2a3002d931ad\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7885h" Apr 23 14:57:55.119621 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.119603 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqvkj\" (UniqueName: \"kubernetes.io/projected/9bd29e85-7dd4-452d-b9c1-1fa81de42b0e-kube-api-access-sqvkj\") pod \"cluster-proxy-proxy-agent-76c98fcdbd-xqd7r\" (UID: \"9bd29e85-7dd4-452d-b9c1-1fa81de42b0e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76c98fcdbd-xqd7r" Apr 23 14:57:55.193490 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.193458 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8dzjt\" (UniqueName: \"kubernetes.io/projected/1ab27f30-a65a-421c-a94f-8664b3f00bbc-kube-api-access-8dzjt\") pod \"insights-operator-585dfdc468-xxhr9\" (UID: \"1ab27f30-a65a-421c-a94f-8664b3f00bbc\") " pod="openshift-insights/insights-operator-585dfdc468-xxhr9" Apr 23 14:57:55.193490 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.193495 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/cfa4e428-dfaf-4150-9c9d-cbb0efd324e8-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-vfgk4\" (UID: \"cfa4e428-dfaf-4150-9c9d-cbb0efd324e8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vfgk4" Apr 23 14:57:55.193732 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.193517 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c323d479-2889-4b77-9580-23359b689f90-tmp\") pod 
\"klusterlet-addon-workmgr-84865c7d88-mzj5h\" (UID: \"c323d479-2889-4b77-9580-23359b689f90\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84865c7d88-mzj5h" Apr 23 14:57:55.193732 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.193539 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/74963a76-1095-4633-aaed-e3687b7d7235-default-certificate\") pod \"router-default-7b45597d46-j8vck\" (UID: \"74963a76-1095-4633-aaed-e3687b7d7235\") " pod="openshift-ingress/router-default-7b45597d46-j8vck" Apr 23 14:57:55.193836 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.193745 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-swlfq\" (UniqueName: \"kubernetes.io/projected/74963a76-1095-4633-aaed-e3687b7d7235-kube-api-access-swlfq\") pod \"router-default-7b45597d46-j8vck\" (UID: \"74963a76-1095-4633-aaed-e3687b7d7235\") " pod="openshift-ingress/router-default-7b45597d46-j8vck" Apr 23 14:57:55.193836 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.193789 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1ab27f30-a65a-421c-a94f-8664b3f00bbc-tmp\") pod \"insights-operator-585dfdc468-xxhr9\" (UID: \"1ab27f30-a65a-421c-a94f-8664b3f00bbc\") " pod="openshift-insights/insights-operator-585dfdc468-xxhr9" Apr 23 14:57:55.193836 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.193823 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ab27f30-a65a-421c-a94f-8664b3f00bbc-service-ca-bundle\") pod \"insights-operator-585dfdc468-xxhr9\" (UID: \"1ab27f30-a65a-421c-a94f-8664b3f00bbc\") " pod="openshift-insights/insights-operator-585dfdc468-xxhr9" Apr 23 14:57:55.193989 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.193854 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/c323d479-2889-4b77-9580-23359b689f90-klusterlet-config\") pod \"klusterlet-addon-workmgr-84865c7d88-mzj5h\" (UID: \"c323d479-2889-4b77-9580-23359b689f90\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84865c7d88-mzj5h" Apr 23 14:57:55.193989 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.193878 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2psz\" (UniqueName: \"kubernetes.io/projected/c323d479-2889-4b77-9580-23359b689f90-kube-api-access-b2psz\") pod \"klusterlet-addon-workmgr-84865c7d88-mzj5h\" (UID: \"c323d479-2889-4b77-9580-23359b689f90\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84865c7d88-mzj5h" Apr 23 14:57:55.193989 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.193915 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d47dcf67-0b9f-4953-b0ed-f4bbd07274f3-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-k7th2\" (UID: \"d47dcf67-0b9f-4953-b0ed-f4bbd07274f3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7th2" Apr 23 14:57:55.193989 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.193947 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74963a76-1095-4633-aaed-e3687b7d7235-service-ca-bundle\") pod \"router-default-7b45597d46-j8vck\" (UID: \"74963a76-1095-4633-aaed-e3687b7d7235\") " pod="openshift-ingress/router-default-7b45597d46-j8vck" Apr 23 14:57:55.193989 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.193955 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c323d479-2889-4b77-9580-23359b689f90-tmp\") pod 
\"klusterlet-addon-workmgr-84865c7d88-mzj5h\" (UID: \"c323d479-2889-4b77-9580-23359b689f90\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84865c7d88-mzj5h" Apr 23 14:57:55.194280 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.194020 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/74963a76-1095-4633-aaed-e3687b7d7235-metrics-certs\") pod \"router-default-7b45597d46-j8vck\" (UID: \"74963a76-1095-4633-aaed-e3687b7d7235\") " pod="openshift-ingress/router-default-7b45597d46-j8vck" Apr 23 14:57:55.194280 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.194049 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/1ab27f30-a65a-421c-a94f-8664b3f00bbc-snapshots\") pod \"insights-operator-585dfdc468-xxhr9\" (UID: \"1ab27f30-a65a-421c-a94f-8664b3f00bbc\") " pod="openshift-insights/insights-operator-585dfdc468-xxhr9" Apr 23 14:57:55.194280 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.194073 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ab27f30-a65a-421c-a94f-8664b3f00bbc-serving-cert\") pod \"insights-operator-585dfdc468-xxhr9\" (UID: \"1ab27f30-a65a-421c-a94f-8664b3f00bbc\") " pod="openshift-insights/insights-operator-585dfdc468-xxhr9" Apr 23 14:57:55.194280 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.194133 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ab27f30-a65a-421c-a94f-8664b3f00bbc-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-xxhr9\" (UID: \"1ab27f30-a65a-421c-a94f-8664b3f00bbc\") " pod="openshift-insights/insights-operator-585dfdc468-xxhr9" Apr 23 14:57:55.194280 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.194162 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/74963a76-1095-4633-aaed-e3687b7d7235-stats-auth\") pod \"router-default-7b45597d46-j8vck\" (UID: \"74963a76-1095-4633-aaed-e3687b7d7235\") " pod="openshift-ingress/router-default-7b45597d46-j8vck" Apr 23 14:57:55.194280 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.194193 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1ab27f30-a65a-421c-a94f-8664b3f00bbc-tmp\") pod \"insights-operator-585dfdc468-xxhr9\" (UID: \"1ab27f30-a65a-421c-a94f-8664b3f00bbc\") " pod="openshift-insights/insights-operator-585dfdc468-xxhr9" Apr 23 14:57:55.194280 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.194199 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cfa4e428-dfaf-4150-9c9d-cbb0efd324e8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vfgk4\" (UID: \"cfa4e428-dfaf-4150-9c9d-cbb0efd324e8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vfgk4" Apr 23 14:57:55.194280 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.194254 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cggsn\" (UniqueName: \"kubernetes.io/projected/cfa4e428-dfaf-4150-9c9d-cbb0efd324e8-kube-api-access-cggsn\") pod \"cluster-monitoring-operator-75587bd455-vfgk4\" (UID: \"cfa4e428-dfaf-4150-9c9d-cbb0efd324e8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vfgk4" Apr 23 14:57:55.194716 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.194279 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/cfa4e428-dfaf-4150-9c9d-cbb0efd324e8-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-vfgk4\" (UID: 
\"cfa4e428-dfaf-4150-9c9d-cbb0efd324e8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vfgk4" Apr 23 14:57:55.194716 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.194286 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdlsd\" (UniqueName: \"kubernetes.io/projected/aaf6d463-ca39-434f-8c5f-8c0376af8d82-kube-api-access-rdlsd\") pod \"migrator-74bb7799d9-qgmst\" (UID: \"aaf6d463-ca39-434f-8c5f-8c0376af8d82\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qgmst" Apr 23 14:57:55.194716 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.194344 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5pb9f\" (UniqueName: \"kubernetes.io/projected/d47dcf67-0b9f-4953-b0ed-f4bbd07274f3-kube-api-access-5pb9f\") pod \"cluster-samples-operator-6dc5bdb6b4-k7th2\" (UID: \"d47dcf67-0b9f-4953-b0ed-f4bbd07274f3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7th2" Apr 23 14:57:55.194716 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.194625 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74963a76-1095-4633-aaed-e3687b7d7235-service-ca-bundle\") pod \"router-default-7b45597d46-j8vck\" (UID: \"74963a76-1095-4633-aaed-e3687b7d7235\") " pod="openshift-ingress/router-default-7b45597d46-j8vck" Apr 23 14:57:55.195362 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.195334 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ab27f30-a65a-421c-a94f-8664b3f00bbc-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-xxhr9\" (UID: \"1ab27f30-a65a-421c-a94f-8664b3f00bbc\") " pod="openshift-insights/insights-operator-585dfdc468-xxhr9" Apr 23 14:57:55.195552 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.195523 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/1ab27f30-a65a-421c-a94f-8664b3f00bbc-snapshots\") pod \"insights-operator-585dfdc468-xxhr9\" (UID: \"1ab27f30-a65a-421c-a94f-8664b3f00bbc\") " pod="openshift-insights/insights-operator-585dfdc468-xxhr9" Apr 23 14:57:55.195728 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.195704 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ab27f30-a65a-421c-a94f-8664b3f00bbc-service-ca-bundle\") pod \"insights-operator-585dfdc468-xxhr9\" (UID: \"1ab27f30-a65a-421c-a94f-8664b3f00bbc\") " pod="openshift-insights/insights-operator-585dfdc468-xxhr9" Apr 23 14:57:55.197059 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.197032 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cfa4e428-dfaf-4150-9c9d-cbb0efd324e8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vfgk4\" (UID: \"cfa4e428-dfaf-4150-9c9d-cbb0efd324e8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vfgk4" Apr 23 14:57:55.197190 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.197093 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/74963a76-1095-4633-aaed-e3687b7d7235-stats-auth\") pod \"router-default-7b45597d46-j8vck\" (UID: \"74963a76-1095-4633-aaed-e3687b7d7235\") " pod="openshift-ingress/router-default-7b45597d46-j8vck" Apr 23 14:57:55.197264 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.197201 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/74963a76-1095-4633-aaed-e3687b7d7235-default-certificate\") pod \"router-default-7b45597d46-j8vck\" (UID: \"74963a76-1095-4633-aaed-e3687b7d7235\") " 
pod="openshift-ingress/router-default-7b45597d46-j8vck" Apr 23 14:57:55.197378 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.197362 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d47dcf67-0b9f-4953-b0ed-f4bbd07274f3-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-k7th2\" (UID: \"d47dcf67-0b9f-4953-b0ed-f4bbd07274f3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7th2" Apr 23 14:57:55.197580 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.197561 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/74963a76-1095-4633-aaed-e3687b7d7235-metrics-certs\") pod \"router-default-7b45597d46-j8vck\" (UID: \"74963a76-1095-4633-aaed-e3687b7d7235\") " pod="openshift-ingress/router-default-7b45597d46-j8vck" Apr 23 14:57:55.197838 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.197818 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ab27f30-a65a-421c-a94f-8664b3f00bbc-serving-cert\") pod \"insights-operator-585dfdc468-xxhr9\" (UID: \"1ab27f30-a65a-421c-a94f-8664b3f00bbc\") " pod="openshift-insights/insights-operator-585dfdc468-xxhr9" Apr 23 14:57:55.197910 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.197897 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/c323d479-2889-4b77-9580-23359b689f90-klusterlet-config\") pod \"klusterlet-addon-workmgr-84865c7d88-mzj5h\" (UID: \"c323d479-2889-4b77-9580-23359b689f90\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84865c7d88-mzj5h" Apr 23 14:57:55.202379 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.202360 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dzjt\" (UniqueName: 
\"kubernetes.io/projected/1ab27f30-a65a-421c-a94f-8664b3f00bbc-kube-api-access-8dzjt\") pod \"insights-operator-585dfdc468-xxhr9\" (UID: \"1ab27f30-a65a-421c-a94f-8664b3f00bbc\") " pod="openshift-insights/insights-operator-585dfdc468-xxhr9" Apr 23 14:57:55.209289 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.209268 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-swlfq\" (UniqueName: \"kubernetes.io/projected/74963a76-1095-4633-aaed-e3687b7d7235-kube-api-access-swlfq\") pod \"router-default-7b45597d46-j8vck\" (UID: \"74963a76-1095-4633-aaed-e3687b7d7235\") " pod="openshift-ingress/router-default-7b45597d46-j8vck" Apr 23 14:57:55.209389 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.209354 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pb9f\" (UniqueName: \"kubernetes.io/projected/d47dcf67-0b9f-4953-b0ed-f4bbd07274f3-kube-api-access-5pb9f\") pod \"cluster-samples-operator-6dc5bdb6b4-k7th2\" (UID: \"d47dcf67-0b9f-4953-b0ed-f4bbd07274f3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7th2" Apr 23 14:57:55.209450 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.209412 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2psz\" (UniqueName: \"kubernetes.io/projected/c323d479-2889-4b77-9580-23359b689f90-kube-api-access-b2psz\") pod \"klusterlet-addon-workmgr-84865c7d88-mzj5h\" (UID: \"c323d479-2889-4b77-9580-23359b689f90\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84865c7d88-mzj5h" Apr 23 14:57:55.210626 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.210608 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cggsn\" (UniqueName: \"kubernetes.io/projected/cfa4e428-dfaf-4150-9c9d-cbb0efd324e8-kube-api-access-cggsn\") pod \"cluster-monitoring-operator-75587bd455-vfgk4\" (UID: \"cfa4e428-dfaf-4150-9c9d-cbb0efd324e8\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vfgk4" Apr 23 14:57:55.220369 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.220354 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9cf77885-8q5d2" Apr 23 14:57:55.228345 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.228328 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7885h" Apr 23 14:57:55.246044 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.246019 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76c98fcdbd-xqd7r" Apr 23 14:57:55.287147 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.284595 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84865c7d88-mzj5h" Apr 23 14:57:55.290279 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.290252 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7th2" Apr 23 14:57:55.295420 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.295170 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rdlsd\" (UniqueName: \"kubernetes.io/projected/aaf6d463-ca39-434f-8c5f-8c0376af8d82-kube-api-access-rdlsd\") pod \"migrator-74bb7799d9-qgmst\" (UID: \"aaf6d463-ca39-434f-8c5f-8c0376af8d82\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qgmst" Apr 23 14:57:55.304376 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.304354 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdlsd\" (UniqueName: \"kubernetes.io/projected/aaf6d463-ca39-434f-8c5f-8c0376af8d82-kube-api-access-rdlsd\") pod \"migrator-74bb7799d9-qgmst\" (UID: \"aaf6d463-ca39-434f-8c5f-8c0376af8d82\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qgmst" Apr 23 14:57:55.319657 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.319241 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vfgk4" Apr 23 14:57:55.351809 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.348672 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7b45597d46-j8vck" Apr 23 14:57:55.351809 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.349250 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-xxhr9" Apr 23 14:57:55.410204 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.410144 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9cf77885-8q5d2"] Apr 23 14:57:55.419966 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.419632 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qgmst" Apr 23 14:57:55.437239 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.436699 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7885h"] Apr 23 14:57:55.453137 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:57:55.453084 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01df512f_cd3b_4a38_8f33_2a3002d931ad.slice/crio-ca210a404de0aa9d309f4c7ef195ef777ea227a7d92b5c4fd78a688a60d4bb51 WatchSource:0}: Error finding container ca210a404de0aa9d309f4c7ef195ef777ea227a7d92b5c4fd78a688a60d4bb51: Status 404 returned error can't find the container with id ca210a404de0aa9d309f4c7ef195ef777ea227a7d92b5c4fd78a688a60d4bb51 Apr 23 14:57:55.482492 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.476760 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76c98fcdbd-xqd7r"] Apr 23 14:57:55.498673 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:57:55.498619 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9bd29e85_7dd4_452d_b9c1_1fa81de42b0e.slice/crio-fd5fa11e12c661a857420cb48cea188d80b16b9239090ff08f15540ff4f63999 WatchSource:0}: Error finding container fd5fa11e12c661a857420cb48cea188d80b16b9239090ff08f15540ff4f63999: Status 404 
returned error can't find the container with id fd5fa11e12c661a857420cb48cea188d80b16b9239090ff08f15540ff4f63999 Apr 23 14:57:55.506605 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.505959 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7th2"] Apr 23 14:57:55.535933 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.535879 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-84865c7d88-mzj5h"] Apr 23 14:57:55.548204 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:57:55.545521 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc323d479_2889_4b77_9580_23359b689f90.slice/crio-83639c8a4a24eb096a6af21000b851ff453e691c04dff63589e6e1755d55e70d WatchSource:0}: Error finding container 83639c8a4a24eb096a6af21000b851ff453e691c04dff63589e6e1755d55e70d: Status 404 returned error can't find the container with id 83639c8a4a24eb096a6af21000b851ff453e691c04dff63589e6e1755d55e70d Apr 23 14:57:55.578470 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.578437 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-vfgk4"] Apr 23 14:57:55.587765 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:57:55.587732 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfa4e428_dfaf_4150_9c9d_cbb0efd324e8.slice/crio-dab073a6611e52f848487f09fb87a884d25fa0eecc37f6c3ce099b9a07faec11 WatchSource:0}: Error finding container dab073a6611e52f848487f09fb87a884d25fa0eecc37f6c3ce099b9a07faec11: Status 404 returned error can't find the container with id dab073a6611e52f848487f09fb87a884d25fa0eecc37f6c3ce099b9a07faec11 Apr 23 14:57:55.596546 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.596516 2569 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-xxhr9"] Apr 23 14:57:55.599332 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:57:55.599303 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ab27f30_a65a_421c_a94f_8664b3f00bbc.slice/crio-4b7e7b19dbaa14ac596ea7963f960e9bc4dd4d9e14d0476fc17ffb37b1b33225 WatchSource:0}: Error finding container 4b7e7b19dbaa14ac596ea7963f960e9bc4dd4d9e14d0476fc17ffb37b1b33225: Status 404 returned error can't find the container with id 4b7e7b19dbaa14ac596ea7963f960e9bc4dd4d9e14d0476fc17ffb37b1b33225 Apr 23 14:57:55.611060 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.611038 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7b45597d46-j8vck"] Apr 23 14:57:55.614009 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:57:55.613977 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74963a76_1095_4633_aaed_e3687b7d7235.slice/crio-3c8abd2da467c23d7f2030bdf3a81e10421e9e87f14ea6a710e2f3f219a90083 WatchSource:0}: Error finding container 3c8abd2da467c23d7f2030bdf3a81e10421e9e87f14ea6a710e2f3f219a90083: Status 404 returned error can't find the container with id 3c8abd2da467c23d7f2030bdf3a81e10421e9e87f14ea6a710e2f3f219a90083 Apr 23 14:57:55.629844 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:55.629817 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-qgmst"] Apr 23 14:57:55.632552 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:57:55.632522 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaaf6d463_ca39_434f_8c5f_8c0376af8d82.slice/crio-0b81a72bfbc145a823fce1b2a498d925b2773a01628615ad5b147f932fe738c2 WatchSource:0}: Error finding container 
0b81a72bfbc145a823fce1b2a498d925b2773a01628615ad5b147f932fe738c2: Status 404 returned error can't find the container with id 0b81a72bfbc145a823fce1b2a498d925b2773a01628615ad5b147f932fe738c2 Apr 23 14:57:56.045461 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:56.045378 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vfgk4" event={"ID":"cfa4e428-dfaf-4150-9c9d-cbb0efd324e8","Type":"ContainerStarted","Data":"dab073a6611e52f848487f09fb87a884d25fa0eecc37f6c3ce099b9a07faec11"} Apr 23 14:57:56.046567 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:56.046519 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-xxhr9" event={"ID":"1ab27f30-a65a-421c-a94f-8664b3f00bbc","Type":"ContainerStarted","Data":"4b7e7b19dbaa14ac596ea7963f960e9bc4dd4d9e14d0476fc17ffb37b1b33225"} Apr 23 14:57:56.048015 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:56.047949 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7885h" event={"ID":"01df512f-cd3b-4a38-8f33-2a3002d931ad","Type":"ContainerStarted","Data":"ca210a404de0aa9d309f4c7ef195ef777ea227a7d92b5c4fd78a688a60d4bb51"} Apr 23 14:57:56.049868 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:56.049820 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7b45597d46-j8vck" event={"ID":"74963a76-1095-4633-aaed-e3687b7d7235","Type":"ContainerStarted","Data":"081bb1d16d22fe6c6fcdefb2cc04ee40ec647d5aa4c84f19aaa63623e0ca8b78"} Apr 23 14:57:56.049868 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:56.049850 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7b45597d46-j8vck" event={"ID":"74963a76-1095-4633-aaed-e3687b7d7235","Type":"ContainerStarted","Data":"3c8abd2da467c23d7f2030bdf3a81e10421e9e87f14ea6a710e2f3f219a90083"} Apr 23 14:57:56.051140 
ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:56.051091 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9cf77885-8q5d2" event={"ID":"f2978812-2e84-4b8a-8e49-dc1cbff4e316","Type":"ContainerStarted","Data":"4a8c333e88165b888f650e373f7385bb9fc8801bda8460e54c23a97190a504e0"} Apr 23 14:57:56.052376 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:56.052336 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qgmst" event={"ID":"aaf6d463-ca39-434f-8c5f-8c0376af8d82","Type":"ContainerStarted","Data":"0b81a72bfbc145a823fce1b2a498d925b2773a01628615ad5b147f932fe738c2"} Apr 23 14:57:56.053546 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:56.053501 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84865c7d88-mzj5h" event={"ID":"c323d479-2889-4b77-9580-23359b689f90","Type":"ContainerStarted","Data":"83639c8a4a24eb096a6af21000b851ff453e691c04dff63589e6e1755d55e70d"} Apr 23 14:57:56.054717 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:56.054693 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7th2" event={"ID":"d47dcf67-0b9f-4953-b0ed-f4bbd07274f3","Type":"ContainerStarted","Data":"bb610125d4cbad03c3352df1ed3dfc82bacfd12a6f4466931428f12445c371ad"} Apr 23 14:57:56.056089 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:56.056063 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76c98fcdbd-xqd7r" event={"ID":"9bd29e85-7dd4-452d-b9c1-1fa81de42b0e","Type":"ContainerStarted","Data":"fd5fa11e12c661a857420cb48cea188d80b16b9239090ff08f15540ff4f63999"} Apr 23 14:57:56.074474 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:56.074425 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ingress/router-default-7b45597d46-j8vck" podStartSLOduration=2.074408016 podStartE2EDuration="2.074408016s" podCreationTimestamp="2026-04-23 14:57:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:57:56.073400782 +0000 UTC m=+63.866753949" watchObservedRunningTime="2026-04-23 14:57:56.074408016 +0000 UTC m=+63.867761182" Apr 23 14:57:56.349746 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:56.349709 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7b45597d46-j8vck" Apr 23 14:57:56.352671 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:56.352424 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-7b45597d46-j8vck" Apr 23 14:57:57.069263 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:57.069165 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-7b45597d46-j8vck" Apr 23 14:57:57.071596 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:57.071572 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-7b45597d46-j8vck" Apr 23 14:57:58.528121 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:58.528051 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36b18c47-1676-4fda-b4e6-a7a9acee20a9-metrics-certs\") pod \"network-metrics-daemon-x9gsg\" (UID: \"36b18c47-1676-4fda-b4e6-a7a9acee20a9\") " pod="openshift-multus/network-metrics-daemon-x9gsg" Apr 23 14:57:58.531370 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:58.531312 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36b18c47-1676-4fda-b4e6-a7a9acee20a9-metrics-certs\") pod 
\"network-metrics-daemon-x9gsg\" (UID: \"36b18c47-1676-4fda-b4e6-a7a9acee20a9\") " pod="openshift-multus/network-metrics-daemon-x9gsg" Apr 23 14:57:58.676977 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:58.676946 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qqjqd\"" Apr 23 14:57:58.684646 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:58.684619 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x9gsg" Apr 23 14:57:58.763036 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:58.763000 2569 scope.go:117] "RemoveContainer" containerID="1cff9993367c83e0d29d476453d8bcb81ed48004de289e313856087fd9f249e8" Apr 23 14:57:58.931479 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:58.931437 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4ffefc2c-3199-4193-9958-9394681000af-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bdftp\" (UID: \"4ffefc2c-3199-4193-9958-9394681000af\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bdftp" Apr 23 14:57:58.933982 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:58.933950 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4ffefc2c-3199-4193-9958-9394681000af-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-bdftp\" (UID: \"4ffefc2c-3199-4193-9958-9394681000af\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-bdftp" Apr 23 14:57:59.172883 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:57:59.172674 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-mccj5\"" Apr 23 14:57:59.179493 ip-10-0-143-199 kubenswrapper[2569]: I0423 
14:57:59.179456 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-bdftp" Apr 23 14:58:00.516093 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:00.516054 2569 patch_prober.go:28] interesting pod/image-registry-6cdd955d68-f2g47 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 14:58:00.516555 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:00.516135 2569 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6cdd955d68-f2g47" podUID="14dc1b3f-026b-4412-831d-49a1eaa9b470" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 14:58:02.006014 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:02.005986 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6cdd955d68-f2g47" Apr 23 14:58:03.067365 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:03.066737 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-x9gsg"] Apr 23 14:58:03.098457 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:03.097655 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94tsp_8b31c555-d556-43a6-ab35-99fdd111e5a5/console-operator/1.log" Apr 23 14:58:03.098457 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:03.097752 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-94tsp" event={"ID":"8b31c555-d556-43a6-ab35-99fdd111e5a5","Type":"ContainerStarted","Data":"454dc57ef9ca1ec3eae4e6da1f7dc4ba80efba6027e36798c1f06df8b2d65bc1"} Apr 23 14:58:03.100368 
ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:03.099994 2569 patch_prober.go:28] interesting pod/console-operator-9d4b6777b-94tsp container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.132.0.9:8443/readyz\": dial tcp 10.132.0.9:8443: connect: connection refused" start-of-body= Apr 23 14:58:03.100368 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:03.100035 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-9d4b6777b-94tsp" podUID="8b31c555-d556-43a6-ab35-99fdd111e5a5" containerName="console-operator" probeResult="failure" output="Get \"https://10.132.0.9:8443/readyz\": dial tcp 10.132.0.9:8443: connect: connection refused" Apr 23 14:58:03.100368 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:03.100151 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-94tsp" Apr 23 14:58:03.102951 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:03.102811 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-bdftp"] Apr 23 14:58:03.103343 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:03.103313 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-x9gsg" event={"ID":"36b18c47-1676-4fda-b4e6-a7a9acee20a9","Type":"ContainerStarted","Data":"16561b1de8ebafd0567ba40f67b3cf61ff7f899a7fb65691b3042cb2bf6f0aa2"} Apr 23 14:58:03.113057 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:03.112982 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vfgk4" event={"ID":"cfa4e428-dfaf-4150-9c9d-cbb0efd324e8","Type":"ContainerStarted","Data":"8448dcd3c8a8d24b66d2acc2af65fe2f01f1cbd6b126a39b18c36bc632d46578"} Apr 23 14:58:03.115082 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:03.115011 2569 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7885h" event={"ID":"01df512f-cd3b-4a38-8f33-2a3002d931ad","Type":"ContainerStarted","Data":"2e73570bbedfb1413812d2c9e22d562789d7f19c39dfa6fbb3c3dccbb700d72d"} Apr 23 14:58:03.123863 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:03.117290 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-94tsp" podStartSLOduration=61.369429222 podStartE2EDuration="1m10.117275434s" podCreationTimestamp="2026-04-23 14:56:53 +0000 UTC" firstStartedPulling="2026-04-23 14:57:25.152633199 +0000 UTC m=+32.945986344" lastFinishedPulling="2026-04-23 14:57:33.900479389 +0000 UTC m=+41.693832556" observedRunningTime="2026-04-23 14:58:03.116475095 +0000 UTC m=+70.909828261" watchObservedRunningTime="2026-04-23 14:58:03.117275434 +0000 UTC m=+70.910628599" Apr 23 14:58:03.169554 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:03.169279 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7885h" podStartSLOduration=1.8033098600000002 podStartE2EDuration="9.16926034s" podCreationTimestamp="2026-04-23 14:57:54 +0000 UTC" firstStartedPulling="2026-04-23 14:57:55.460862193 +0000 UTC m=+63.254215341" lastFinishedPulling="2026-04-23 14:58:02.826812663 +0000 UTC m=+70.620165821" observedRunningTime="2026-04-23 14:58:03.167640476 +0000 UTC m=+70.960993642" watchObservedRunningTime="2026-04-23 14:58:03.16926034 +0000 UTC m=+70.962613506" Apr 23 14:58:03.169554 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:03.169428 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vfgk4" podStartSLOduration=1.932744046 podStartE2EDuration="9.169422776s" podCreationTimestamp="2026-04-23 14:57:54 +0000 UTC" firstStartedPulling="2026-04-23 
14:57:55.590198968 +0000 UTC m=+63.383552112" lastFinishedPulling="2026-04-23 14:58:02.826877699 +0000 UTC m=+70.620230842" observedRunningTime="2026-04-23 14:58:03.148718533 +0000 UTC m=+70.942071695" watchObservedRunningTime="2026-04-23 14:58:03.169422776 +0000 UTC m=+70.962775937" Apr 23 14:58:04.127330 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:04.127206 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9cf77885-8q5d2" event={"ID":"f2978812-2e84-4b8a-8e49-dc1cbff4e316","Type":"ContainerStarted","Data":"c2822c5d2ad1fe86de2b838a6692113ae651fdd3d72ed0d115d76eb68a67dda2"} Apr 23 14:58:04.129292 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:04.129239 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-bdftp" event={"ID":"4ffefc2c-3199-4193-9958-9394681000af","Type":"ContainerStarted","Data":"aa87a3f80e712a747d1c1595c4125bde4023d2b2128ff11d0464b21235ffe0a9"} Apr 23 14:58:04.133334 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:04.133058 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qgmst" event={"ID":"aaf6d463-ca39-434f-8c5f-8c0376af8d82","Type":"ContainerStarted","Data":"f0a4f7767d371f5af0f751b49937eece8aee284c0dffc930aaf1e7fa884729d4"} Apr 23 14:58:04.133334 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:04.133089 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qgmst" event={"ID":"aaf6d463-ca39-434f-8c5f-8c0376af8d82","Type":"ContainerStarted","Data":"3280dde6e6a9930d0b91f7046b6522f6f18fdaa3d3a4e11401cfe727cf50cb49"} Apr 23 14:58:04.135622 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:04.135190 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84865c7d88-mzj5h" 
event={"ID":"c323d479-2889-4b77-9580-23359b689f90","Type":"ContainerStarted","Data":"fd4c6af51f8850ee4a53b95b48b7a8ea76a3f50ba76bb1161c6ae3cca86fb564"} Apr 23 14:58:04.135622 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:04.135483 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84865c7d88-mzj5h" Apr 23 14:58:04.138642 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:04.138612 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84865c7d88-mzj5h" Apr 23 14:58:04.140182 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:04.139679 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7th2" event={"ID":"d47dcf67-0b9f-4953-b0ed-f4bbd07274f3","Type":"ContainerStarted","Data":"f7d2526dc69bc60c3388e672e22ebd293c7778a730dc8ba715a0ca2d6a4c961a"} Apr 23 14:58:04.140182 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:04.139706 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7th2" event={"ID":"d47dcf67-0b9f-4953-b0ed-f4bbd07274f3","Type":"ContainerStarted","Data":"c896f8020bc97409f3714769cc1dcc4f73bb625cd9710dac3c4b51eee97a1097"} Apr 23 14:58:04.146296 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:04.146271 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76c98fcdbd-xqd7r" event={"ID":"9bd29e85-7dd4-452d-b9c1-1fa81de42b0e","Type":"ContainerStarted","Data":"19ab121396d769141df0729f22cd3ad3db9112965771294ff11e63c01e22252f"} Apr 23 14:58:04.150309 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:04.150094 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-xxhr9" 
event={"ID":"1ab27f30-a65a-421c-a94f-8664b3f00bbc","Type":"ContainerStarted","Data":"31fd62cdee3c7a9a42092aa5c295fa2847cdaf18f26581d7bd59bc6bf4fc08f7"} Apr 23 14:58:04.155840 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:04.155802 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-94tsp" Apr 23 14:58:04.164997 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:04.164945 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-9cf77885-8q5d2" podStartSLOduration=2.675014096 podStartE2EDuration="10.164930812s" podCreationTimestamp="2026-04-23 14:57:54 +0000 UTC" firstStartedPulling="2026-04-23 14:57:55.42769029 +0000 UTC m=+63.221043440" lastFinishedPulling="2026-04-23 14:58:02.917606999 +0000 UTC m=+70.710960156" observedRunningTime="2026-04-23 14:58:04.14504554 +0000 UTC m=+71.938398712" watchObservedRunningTime="2026-04-23 14:58:04.164930812 +0000 UTC m=+71.958283977" Apr 23 14:58:04.166229 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:04.165207 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84865c7d88-mzj5h" podStartSLOduration=2.776012645 podStartE2EDuration="10.165199755s" podCreationTimestamp="2026-04-23 14:57:54 +0000 UTC" firstStartedPulling="2026-04-23 14:57:55.547741162 +0000 UTC m=+63.341094319" lastFinishedPulling="2026-04-23 14:58:02.936928279 +0000 UTC m=+70.730281429" observedRunningTime="2026-04-23 14:58:04.163296461 +0000 UTC m=+71.956649626" watchObservedRunningTime="2026-04-23 14:58:04.165199755 +0000 UTC m=+71.958552920" Apr 23 14:58:04.183318 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:04.183265 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qgmst" podStartSLOduration=1.993089801 
podStartE2EDuration="9.183249788s" podCreationTimestamp="2026-04-23 14:57:55 +0000 UTC" firstStartedPulling="2026-04-23 14:57:55.636653228 +0000 UTC m=+63.430006371" lastFinishedPulling="2026-04-23 14:58:02.826813211 +0000 UTC m=+70.620166358" observedRunningTime="2026-04-23 14:58:04.182072975 +0000 UTC m=+71.975426141" watchObservedRunningTime="2026-04-23 14:58:04.183249788 +0000 UTC m=+71.976602953" Apr 23 14:58:04.202636 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:04.202577 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-k7th2" podStartSLOduration=2.935481637 podStartE2EDuration="10.202558533s" podCreationTimestamp="2026-04-23 14:57:54 +0000 UTC" firstStartedPulling="2026-04-23 14:57:55.559795397 +0000 UTC m=+63.353148556" lastFinishedPulling="2026-04-23 14:58:02.826872301 +0000 UTC m=+70.620225452" observedRunningTime="2026-04-23 14:58:04.200588204 +0000 UTC m=+71.993941369" watchObservedRunningTime="2026-04-23 14:58:04.202558533 +0000 UTC m=+71.995911698" Apr 23 14:58:04.241809 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:04.241746 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-xxhr9" podStartSLOduration=2.927605095 podStartE2EDuration="10.241727293s" podCreationTimestamp="2026-04-23 14:57:54 +0000 UTC" firstStartedPulling="2026-04-23 14:57:55.601114142 +0000 UTC m=+63.394467288" lastFinishedPulling="2026-04-23 14:58:02.915236339 +0000 UTC m=+70.708589486" observedRunningTime="2026-04-23 14:58:04.241701477 +0000 UTC m=+72.035054644" watchObservedRunningTime="2026-04-23 14:58:04.241727293 +0000 UTC m=+72.035080459" Apr 23 14:58:05.487269 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:05.487239 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-stmr6_fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8/dns/0.log" Apr 23 14:58:05.665476 ip-10-0-143-199 
kubenswrapper[2569]: I0423 14:58:05.665450 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-stmr6_fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8/kube-rbac-proxy/0.log" Apr 23 14:58:05.983226 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:05.983199 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-6w94x" Apr 23 14:58:06.157623 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:06.157495 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-bdftp" event={"ID":"4ffefc2c-3199-4193-9958-9394681000af","Type":"ContainerStarted","Data":"aa9f783b4f0578434b79ceba59ed2d1bbeca09de11b81de2948a6f7c36b16d7c"} Apr 23 14:58:06.159384 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:06.159344 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76c98fcdbd-xqd7r" event={"ID":"9bd29e85-7dd4-452d-b9c1-1fa81de42b0e","Type":"ContainerStarted","Data":"e462c10fa710c1f52f920af098c3bc817c9d520349c00d03cefa2e3c7e83c3c6"} Apr 23 14:58:06.159520 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:06.159388 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76c98fcdbd-xqd7r" event={"ID":"9bd29e85-7dd4-452d-b9c1-1fa81de42b0e","Type":"ContainerStarted","Data":"6df3c6dfc4e785036ecabebb66a8eb29789be8782ddd06b1cac08715f76d1d27"} Apr 23 14:58:06.160857 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:06.160837 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-x9gsg" event={"ID":"36b18c47-1676-4fda-b4e6-a7a9acee20a9","Type":"ContainerStarted","Data":"a7f463f3db16b7ded859e4fd018a2e35d4b7b8836f7be2819226f160fcd55c2f"} Apr 23 14:58:06.160915 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:06.160862 2569 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-multus/network-metrics-daemon-x9gsg" event={"ID":"36b18c47-1676-4fda-b4e6-a7a9acee20a9","Type":"ContainerStarted","Data":"f05ad558a6c573cd8749dcf032f0de5ee0ddce0db51b952d190c358c3859299f"} Apr 23 14:58:06.175746 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:06.175702 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-bdftp" podStartSLOduration=36.971696337 podStartE2EDuration="39.175690068s" podCreationTimestamp="2026-04-23 14:57:27 +0000 UTC" firstStartedPulling="2026-04-23 14:58:03.11244289 +0000 UTC m=+70.905796037" lastFinishedPulling="2026-04-23 14:58:05.316436625 +0000 UTC m=+73.109789768" observedRunningTime="2026-04-23 14:58:06.174034256 +0000 UTC m=+73.967387422" watchObservedRunningTime="2026-04-23 14:58:06.175690068 +0000 UTC m=+73.969043267" Apr 23 14:58:06.196650 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:06.195364 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76c98fcdbd-xqd7r" podStartSLOduration=2.305066333 podStartE2EDuration="12.195347101s" podCreationTimestamp="2026-04-23 14:57:54 +0000 UTC" firstStartedPulling="2026-04-23 14:57:55.503085272 +0000 UTC m=+63.296438432" lastFinishedPulling="2026-04-23 14:58:05.393366043 +0000 UTC m=+73.186719200" observedRunningTime="2026-04-23 14:58:06.192708824 +0000 UTC m=+73.986061988" watchObservedRunningTime="2026-04-23 14:58:06.195347101 +0000 UTC m=+73.988700266" Apr 23 14:58:06.210010 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:06.209853 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-x9gsg" podStartSLOduration=70.981052034 podStartE2EDuration="1m13.209839022s" podCreationTimestamp="2026-04-23 14:56:53 +0000 UTC" firstStartedPulling="2026-04-23 14:58:03.089960975 +0000 UTC m=+70.883314133" lastFinishedPulling="2026-04-23 
14:58:05.318747978 +0000 UTC m=+73.112101121" observedRunningTime="2026-04-23 14:58:06.209523615 +0000 UTC m=+74.002876781" watchObservedRunningTime="2026-04-23 14:58:06.209839022 +0000 UTC m=+74.003192186" Apr 23 14:58:06.464062 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:06.464031 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4665j_f947ddb6-797b-4afb-a2cf-6c8c70291f6d/dns-node-resolver/0.log" Apr 23 14:58:06.864673 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:06.864647 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7b45597d46-j8vck_74963a76-1095-4633-aaed-e3687b7d7235/router/0.log" Apr 23 14:58:07.270203 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:07.270124 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-fjp4b_0b89e5a1-6854-4d53-971b-35ff7c61be50/serve-healthcheck-canary/0.log" Apr 23 14:58:07.667360 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:07.667336 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-qgmst_aaf6d463-ca39-434f-8c5f-8c0376af8d82/migrator/0.log" Apr 23 14:58:07.866585 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:07.866551 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-qgmst_aaf6d463-ca39-434f-8c5f-8c0376af8d82/graceful-termination/0.log" Apr 23 14:58:08.066535 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:08.066458 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-b7nsn_5ee82d4e-8f5a-4e80-bed4-26f905b2deef/kube-storage-version-migrator-operator/0.log" Apr 23 14:58:12.037408 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.037379 2569 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-insights/insights-runtime-extractor-bbztr"] Apr 23 14:58:12.042009 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.041988 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-bbztr" Apr 23 14:58:12.044585 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.044568 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 14:58:12.045921 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.045901 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-4ln87\"" Apr 23 14:58:12.046113 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.045966 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 14:58:12.053366 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.053342 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-bbztr"] Apr 23 14:58:12.151687 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.151648 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/95190234-70d8-4586-96fc-940b33508aa1-crio-socket\") pod \"insights-runtime-extractor-bbztr\" (UID: \"95190234-70d8-4586-96fc-940b33508aa1\") " pod="openshift-insights/insights-runtime-extractor-bbztr" Apr 23 14:58:12.151687 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.151698 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/95190234-70d8-4586-96fc-940b33508aa1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bbztr\" (UID: \"95190234-70d8-4586-96fc-940b33508aa1\") " 
pod="openshift-insights/insights-runtime-extractor-bbztr" Apr 23 14:58:12.151954 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.151736 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/95190234-70d8-4586-96fc-940b33508aa1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bbztr\" (UID: \"95190234-70d8-4586-96fc-940b33508aa1\") " pod="openshift-insights/insights-runtime-extractor-bbztr" Apr 23 14:58:12.151954 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.151765 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/95190234-70d8-4586-96fc-940b33508aa1-data-volume\") pod \"insights-runtime-extractor-bbztr\" (UID: \"95190234-70d8-4586-96fc-940b33508aa1\") " pod="openshift-insights/insights-runtime-extractor-bbztr" Apr 23 14:58:12.151954 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.151815 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkbxq\" (UniqueName: \"kubernetes.io/projected/95190234-70d8-4586-96fc-940b33508aa1-kube-api-access-bkbxq\") pod \"insights-runtime-extractor-bbztr\" (UID: \"95190234-70d8-4586-96fc-940b33508aa1\") " pod="openshift-insights/insights-runtime-extractor-bbztr" Apr 23 14:58:12.202788 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.202747 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-b8dvb"] Apr 23 14:58:12.206823 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.206800 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-b8dvb" Apr 23 14:58:12.210256 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.210237 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 23 14:58:12.210366 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.210277 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 23 14:58:12.210565 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.210544 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-x6ghm\"" Apr 23 14:58:12.210671 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.210633 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 23 14:58:12.210736 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.210610 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 23 14:58:12.252851 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.252817 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/95190234-70d8-4586-96fc-940b33508aa1-crio-socket\") pod \"insights-runtime-extractor-bbztr\" (UID: \"95190234-70d8-4586-96fc-940b33508aa1\") " pod="openshift-insights/insights-runtime-extractor-bbztr" Apr 23 14:58:12.252851 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.252863 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/95190234-70d8-4586-96fc-940b33508aa1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bbztr\" (UID: \"95190234-70d8-4586-96fc-940b33508aa1\") " 
pod="openshift-insights/insights-runtime-extractor-bbztr" Apr 23 14:58:12.253086 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.252900 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/95190234-70d8-4586-96fc-940b33508aa1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bbztr\" (UID: \"95190234-70d8-4586-96fc-940b33508aa1\") " pod="openshift-insights/insights-runtime-extractor-bbztr" Apr 23 14:58:12.253086 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.252930 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/95190234-70d8-4586-96fc-940b33508aa1-data-volume\") pod \"insights-runtime-extractor-bbztr\" (UID: \"95190234-70d8-4586-96fc-940b33508aa1\") " pod="openshift-insights/insights-runtime-extractor-bbztr" Apr 23 14:58:12.253086 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.252955 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bkbxq\" (UniqueName: \"kubernetes.io/projected/95190234-70d8-4586-96fc-940b33508aa1-kube-api-access-bkbxq\") pod \"insights-runtime-extractor-bbztr\" (UID: \"95190234-70d8-4586-96fc-940b33508aa1\") " pod="openshift-insights/insights-runtime-extractor-bbztr" Apr 23 14:58:12.253086 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.252954 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/95190234-70d8-4586-96fc-940b33508aa1-crio-socket\") pod \"insights-runtime-extractor-bbztr\" (UID: \"95190234-70d8-4586-96fc-940b33508aa1\") " pod="openshift-insights/insights-runtime-extractor-bbztr" Apr 23 14:58:12.253569 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.253544 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/95190234-70d8-4586-96fc-940b33508aa1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bbztr\" (UID: \"95190234-70d8-4586-96fc-940b33508aa1\") " pod="openshift-insights/insights-runtime-extractor-bbztr" Apr 23 14:58:12.253719 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.253599 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/95190234-70d8-4586-96fc-940b33508aa1-data-volume\") pod \"insights-runtime-extractor-bbztr\" (UID: \"95190234-70d8-4586-96fc-940b33508aa1\") " pod="openshift-insights/insights-runtime-extractor-bbztr" Apr 23 14:58:12.255340 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.255318 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/95190234-70d8-4586-96fc-940b33508aa1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bbztr\" (UID: \"95190234-70d8-4586-96fc-940b33508aa1\") " pod="openshift-insights/insights-runtime-extractor-bbztr" Apr 23 14:58:12.274756 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.274726 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkbxq\" (UniqueName: \"kubernetes.io/projected/95190234-70d8-4586-96fc-940b33508aa1-kube-api-access-bkbxq\") pod \"insights-runtime-extractor-bbztr\" (UID: \"95190234-70d8-4586-96fc-940b33508aa1\") " pod="openshift-insights/insights-runtime-extractor-bbztr" Apr 23 14:58:12.353825 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.353791 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/93f275de-2255-4067-b78d-3c2eba50e847-node-exporter-textfile\") pod \"node-exporter-b8dvb\" (UID: \"93f275de-2255-4067-b78d-3c2eba50e847\") " pod="openshift-monitoring/node-exporter-b8dvb" Apr 23 14:58:12.354007 ip-10-0-143-199 
kubenswrapper[2569]: I0423 14:58:12.353849 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/93f275de-2255-4067-b78d-3c2eba50e847-node-exporter-accelerators-collector-config\") pod \"node-exporter-b8dvb\" (UID: \"93f275de-2255-4067-b78d-3c2eba50e847\") " pod="openshift-monitoring/node-exporter-b8dvb" Apr 23 14:58:12.354007 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.353904 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/93f275de-2255-4067-b78d-3c2eba50e847-node-exporter-tls\") pod \"node-exporter-b8dvb\" (UID: \"93f275de-2255-4067-b78d-3c2eba50e847\") " pod="openshift-monitoring/node-exporter-b8dvb" Apr 23 14:58:12.354007 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.353928 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/93f275de-2255-4067-b78d-3c2eba50e847-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-b8dvb\" (UID: \"93f275de-2255-4067-b78d-3c2eba50e847\") " pod="openshift-monitoring/node-exporter-b8dvb" Apr 23 14:58:12.354177 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.354024 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/93f275de-2255-4067-b78d-3c2eba50e847-metrics-client-ca\") pod \"node-exporter-b8dvb\" (UID: \"93f275de-2255-4067-b78d-3c2eba50e847\") " pod="openshift-monitoring/node-exporter-b8dvb" Apr 23 14:58:12.354177 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.354069 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: 
\"kubernetes.io/host-path/93f275de-2255-4067-b78d-3c2eba50e847-root\") pod \"node-exporter-b8dvb\" (UID: \"93f275de-2255-4067-b78d-3c2eba50e847\") " pod="openshift-monitoring/node-exporter-b8dvb" Apr 23 14:58:12.354177 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.354111 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/93f275de-2255-4067-b78d-3c2eba50e847-node-exporter-wtmp\") pod \"node-exporter-b8dvb\" (UID: \"93f275de-2255-4067-b78d-3c2eba50e847\") " pod="openshift-monitoring/node-exporter-b8dvb" Apr 23 14:58:12.354177 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.354147 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hvzh\" (UniqueName: \"kubernetes.io/projected/93f275de-2255-4067-b78d-3c2eba50e847-kube-api-access-7hvzh\") pod \"node-exporter-b8dvb\" (UID: \"93f275de-2255-4067-b78d-3c2eba50e847\") " pod="openshift-monitoring/node-exporter-b8dvb" Apr 23 14:58:12.354347 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.354182 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/93f275de-2255-4067-b78d-3c2eba50e847-sys\") pod \"node-exporter-b8dvb\" (UID: \"93f275de-2255-4067-b78d-3c2eba50e847\") " pod="openshift-monitoring/node-exporter-b8dvb" Apr 23 14:58:12.357423 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.357400 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-bbztr" Apr 23 14:58:12.457172 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.455477 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/93f275de-2255-4067-b78d-3c2eba50e847-metrics-client-ca\") pod \"node-exporter-b8dvb\" (UID: \"93f275de-2255-4067-b78d-3c2eba50e847\") " pod="openshift-monitoring/node-exporter-b8dvb" Apr 23 14:58:12.457172 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.455546 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/93f275de-2255-4067-b78d-3c2eba50e847-root\") pod \"node-exporter-b8dvb\" (UID: \"93f275de-2255-4067-b78d-3c2eba50e847\") " pod="openshift-monitoring/node-exporter-b8dvb" Apr 23 14:58:12.457172 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.455577 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/93f275de-2255-4067-b78d-3c2eba50e847-node-exporter-wtmp\") pod \"node-exporter-b8dvb\" (UID: \"93f275de-2255-4067-b78d-3c2eba50e847\") " pod="openshift-monitoring/node-exporter-b8dvb" Apr 23 14:58:12.457172 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.455616 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7hvzh\" (UniqueName: \"kubernetes.io/projected/93f275de-2255-4067-b78d-3c2eba50e847-kube-api-access-7hvzh\") pod \"node-exporter-b8dvb\" (UID: \"93f275de-2255-4067-b78d-3c2eba50e847\") " pod="openshift-monitoring/node-exporter-b8dvb" Apr 23 14:58:12.457172 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.455655 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/93f275de-2255-4067-b78d-3c2eba50e847-sys\") pod \"node-exporter-b8dvb\" (UID: 
\"93f275de-2255-4067-b78d-3c2eba50e847\") " pod="openshift-monitoring/node-exporter-b8dvb" Apr 23 14:58:12.457172 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.455697 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/93f275de-2255-4067-b78d-3c2eba50e847-node-exporter-textfile\") pod \"node-exporter-b8dvb\" (UID: \"93f275de-2255-4067-b78d-3c2eba50e847\") " pod="openshift-monitoring/node-exporter-b8dvb" Apr 23 14:58:12.457172 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.455743 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/93f275de-2255-4067-b78d-3c2eba50e847-node-exporter-accelerators-collector-config\") pod \"node-exporter-b8dvb\" (UID: \"93f275de-2255-4067-b78d-3c2eba50e847\") " pod="openshift-monitoring/node-exporter-b8dvb" Apr 23 14:58:12.457172 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.455802 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/93f275de-2255-4067-b78d-3c2eba50e847-node-exporter-tls\") pod \"node-exporter-b8dvb\" (UID: \"93f275de-2255-4067-b78d-3c2eba50e847\") " pod="openshift-monitoring/node-exporter-b8dvb" Apr 23 14:58:12.457172 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.455831 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/93f275de-2255-4067-b78d-3c2eba50e847-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-b8dvb\" (UID: \"93f275de-2255-4067-b78d-3c2eba50e847\") " pod="openshift-monitoring/node-exporter-b8dvb" Apr 23 14:58:12.457172 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.457170 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" 
(UniqueName: \"kubernetes.io/configmap/93f275de-2255-4067-b78d-3c2eba50e847-metrics-client-ca\") pod \"node-exporter-b8dvb\" (UID: \"93f275de-2255-4067-b78d-3c2eba50e847\") " pod="openshift-monitoring/node-exporter-b8dvb" Apr 23 14:58:12.457748 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.457237 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/93f275de-2255-4067-b78d-3c2eba50e847-root\") pod \"node-exporter-b8dvb\" (UID: \"93f275de-2255-4067-b78d-3c2eba50e847\") " pod="openshift-monitoring/node-exporter-b8dvb" Apr 23 14:58:12.457748 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.457352 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/93f275de-2255-4067-b78d-3c2eba50e847-node-exporter-wtmp\") pod \"node-exporter-b8dvb\" (UID: \"93f275de-2255-4067-b78d-3c2eba50e847\") " pod="openshift-monitoring/node-exporter-b8dvb" Apr 23 14:58:12.457748 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.457574 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/93f275de-2255-4067-b78d-3c2eba50e847-node-exporter-textfile\") pod \"node-exporter-b8dvb\" (UID: \"93f275de-2255-4067-b78d-3c2eba50e847\") " pod="openshift-monitoring/node-exporter-b8dvb" Apr 23 14:58:12.457748 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.457633 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/93f275de-2255-4067-b78d-3c2eba50e847-sys\") pod \"node-exporter-b8dvb\" (UID: \"93f275de-2255-4067-b78d-3c2eba50e847\") " pod="openshift-monitoring/node-exporter-b8dvb" Apr 23 14:58:12.459686 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.458073 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/93f275de-2255-4067-b78d-3c2eba50e847-node-exporter-accelerators-collector-config\") pod \"node-exporter-b8dvb\" (UID: \"93f275de-2255-4067-b78d-3c2eba50e847\") " pod="openshift-monitoring/node-exporter-b8dvb" Apr 23 14:58:12.459686 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:58:12.458201 2569 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 23 14:58:12.459686 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:58:12.458261 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93f275de-2255-4067-b78d-3c2eba50e847-node-exporter-tls podName:93f275de-2255-4067-b78d-3c2eba50e847 nodeName:}" failed. No retries permitted until 2026-04-23 14:58:12.958242054 +0000 UTC m=+80.751595213 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/93f275de-2255-4067-b78d-3c2eba50e847-node-exporter-tls") pod "node-exporter-b8dvb" (UID: "93f275de-2255-4067-b78d-3c2eba50e847") : secret "node-exporter-tls" not found Apr 23 14:58:12.460924 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.460878 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/93f275de-2255-4067-b78d-3c2eba50e847-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-b8dvb\" (UID: \"93f275de-2255-4067-b78d-3c2eba50e847\") " pod="openshift-monitoring/node-exporter-b8dvb" Apr 23 14:58:12.481004 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.480945 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hvzh\" (UniqueName: \"kubernetes.io/projected/93f275de-2255-4067-b78d-3c2eba50e847-kube-api-access-7hvzh\") pod \"node-exporter-b8dvb\" (UID: \"93f275de-2255-4067-b78d-3c2eba50e847\") " pod="openshift-monitoring/node-exporter-b8dvb" Apr 23 14:58:12.567570 ip-10-0-143-199 kubenswrapper[2569]: 
I0423 14:58:12.567541 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-bbztr"] Apr 23 14:58:12.569738 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:58:12.569710 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95190234_70d8_4586_96fc_940b33508aa1.slice/crio-dcdc355321d00d00ba46c911fd72a989d0739bc6d1e2899a407b7551289b4f3d WatchSource:0}: Error finding container dcdc355321d00d00ba46c911fd72a989d0739bc6d1e2899a407b7551289b4f3d: Status 404 returned error can't find the container with id dcdc355321d00d00ba46c911fd72a989d0739bc6d1e2899a407b7551289b4f3d Apr 23 14:58:12.960015 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.959939 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/93f275de-2255-4067-b78d-3c2eba50e847-node-exporter-tls\") pod \"node-exporter-b8dvb\" (UID: \"93f275de-2255-4067-b78d-3c2eba50e847\") " pod="openshift-monitoring/node-exporter-b8dvb" Apr 23 14:58:12.962174 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:12.962151 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/93f275de-2255-4067-b78d-3c2eba50e847-node-exporter-tls\") pod \"node-exporter-b8dvb\" (UID: \"93f275de-2255-4067-b78d-3c2eba50e847\") " pod="openshift-monitoring/node-exporter-b8dvb" Apr 23 14:58:13.118538 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:13.118499 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-b8dvb" Apr 23 14:58:13.184141 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:13.184030 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bbztr" event={"ID":"95190234-70d8-4586-96fc-940b33508aa1","Type":"ContainerStarted","Data":"220cb21671bfdd8fb63415df4ca5731cc4fb22b43138428b0a1172fbd11b10e4"} Apr 23 14:58:13.184141 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:13.184074 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bbztr" event={"ID":"95190234-70d8-4586-96fc-940b33508aa1","Type":"ContainerStarted","Data":"dcdc355321d00d00ba46c911fd72a989d0739bc6d1e2899a407b7551289b4f3d"} Apr 23 14:58:13.185573 ip-10-0-143-199 kubenswrapper[2569]: W0423 14:58:13.185547 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93f275de_2255_4067_b78d_3c2eba50e847.slice/crio-04fc5116a3e31be5b0cfeec77f3eb7375826e7fd898b815f77b7651a18fdb145 WatchSource:0}: Error finding container 04fc5116a3e31be5b0cfeec77f3eb7375826e7fd898b815f77b7651a18fdb145: Status 404 returned error can't find the container with id 04fc5116a3e31be5b0cfeec77f3eb7375826e7fd898b815f77b7651a18fdb145 Apr 23 14:58:14.189193 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:14.189072 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bbztr" event={"ID":"95190234-70d8-4586-96fc-940b33508aa1","Type":"ContainerStarted","Data":"5a2b050d1538287ff6c8acedc6050ca8f487972c0c97a2a8b6897c1c8c33be6b"} Apr 23 14:58:14.190731 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:14.190700 2569 generic.go:358] "Generic (PLEG): container finished" podID="93f275de-2255-4067-b78d-3c2eba50e847" containerID="b0ad2d982b7b0c97bf2f1823a0ba88652f8e2fe6e847efeb5d09944d9af3e303" exitCode=0 Apr 23 14:58:14.190917 ip-10-0-143-199 
kubenswrapper[2569]: I0423 14:58:14.190766 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-b8dvb" event={"ID":"93f275de-2255-4067-b78d-3c2eba50e847","Type":"ContainerDied","Data":"b0ad2d982b7b0c97bf2f1823a0ba88652f8e2fe6e847efeb5d09944d9af3e303"} Apr 23 14:58:14.190917 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:14.190795 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-b8dvb" event={"ID":"93f275de-2255-4067-b78d-3c2eba50e847","Type":"ContainerStarted","Data":"04fc5116a3e31be5b0cfeec77f3eb7375826e7fd898b815f77b7651a18fdb145"} Apr 23 14:58:15.196343 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:15.196244 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bbztr" event={"ID":"95190234-70d8-4586-96fc-940b33508aa1","Type":"ContainerStarted","Data":"774ea736ce45296083ffbdf75ab88f45fa6f64372661f4c7b82311aca245836d"} Apr 23 14:58:15.198569 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:15.198544 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-b8dvb" event={"ID":"93f275de-2255-4067-b78d-3c2eba50e847","Type":"ContainerStarted","Data":"578002a2abf54dc8f4a89831e0577ac6121202173a37bb15053162f8ae11f0bb"} Apr 23 14:58:15.198696 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:15.198576 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-b8dvb" event={"ID":"93f275de-2255-4067-b78d-3c2eba50e847","Type":"ContainerStarted","Data":"756cf8ce13f362c02e315121ca80a1ada8c1867e77910438fce9fb9e1351fff5"} Apr 23 14:58:15.218736 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:15.218685 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-bbztr" podStartSLOduration=0.966471332 podStartE2EDuration="3.218639848s" podCreationTimestamp="2026-04-23 14:58:12 +0000 UTC" 
firstStartedPulling="2026-04-23 14:58:12.634874743 +0000 UTC m=+80.428227887" lastFinishedPulling="2026-04-23 14:58:14.887043257 +0000 UTC m=+82.680396403" observedRunningTime="2026-04-23 14:58:15.217851072 +0000 UTC m=+83.011204249" watchObservedRunningTime="2026-04-23 14:58:15.218639848 +0000 UTC m=+83.011993015" Apr 23 14:58:15.239707 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:15.239650 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-b8dvb" podStartSLOduration=2.503803666 podStartE2EDuration="3.239633257s" podCreationTimestamp="2026-04-23 14:58:12 +0000 UTC" firstStartedPulling="2026-04-23 14:58:13.187218924 +0000 UTC m=+80.980572068" lastFinishedPulling="2026-04-23 14:58:13.923048502 +0000 UTC m=+81.716401659" observedRunningTime="2026-04-23 14:58:15.237986727 +0000 UTC m=+83.031339894" watchObservedRunningTime="2026-04-23 14:58:15.239633257 +0000 UTC m=+83.032986423" Apr 23 14:58:22.519673 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:22.519636 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6cdd955d68-f2g47"] Apr 23 14:58:40.269704 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:40.269664 2569 generic.go:358] "Generic (PLEG): container finished" podID="5ee82d4e-8f5a-4e80-bed4-26f905b2deef" containerID="a7c6c5fa822a34c6ce62a5dfe4eeb27b16c55c6ab74c40bbbf4c48235f69bd7d" exitCode=0 Apr 23 14:58:40.270134 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:40.269736 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b7nsn" event={"ID":"5ee82d4e-8f5a-4e80-bed4-26f905b2deef","Type":"ContainerDied","Data":"a7c6c5fa822a34c6ce62a5dfe4eeb27b16c55c6ab74c40bbbf4c48235f69bd7d"} Apr 23 14:58:40.270134 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:40.270060 2569 scope.go:117] "RemoveContainer" 
containerID="a7c6c5fa822a34c6ce62a5dfe4eeb27b16c55c6ab74c40bbbf4c48235f69bd7d" Apr 23 14:58:41.274669 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:41.274633 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b7nsn" event={"ID":"5ee82d4e-8f5a-4e80-bed4-26f905b2deef","Type":"ContainerStarted","Data":"6c69efdb5ea64560ac847b3f10786547bccb6fbe22ec12d712cf279451b57012"} Apr 23 14:58:47.543389 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:47.543342 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6cdd955d68-f2g47" podUID="14dc1b3f-026b-4412-831d-49a1eaa9b470" containerName="registry" containerID="cri-o://88f58c1823c57ad2390f7a296ac4f48139d6dcfeea07c1e85b23c5bc90dca78a" gracePeriod=30 Apr 23 14:58:47.774084 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:47.774062 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6cdd955d68-f2g47" Apr 23 14:58:47.840277 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:47.840250 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14dc1b3f-026b-4412-831d-49a1eaa9b470-registry-tls\") pod \"14dc1b3f-026b-4412-831d-49a1eaa9b470\" (UID: \"14dc1b3f-026b-4412-831d-49a1eaa9b470\") " Apr 23 14:58:47.840419 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:47.840285 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14dc1b3f-026b-4412-831d-49a1eaa9b470-registry-certificates\") pod \"14dc1b3f-026b-4412-831d-49a1eaa9b470\" (UID: \"14dc1b3f-026b-4412-831d-49a1eaa9b470\") " Apr 23 14:58:47.840419 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:47.840313 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14dc1b3f-026b-4412-831d-49a1eaa9b470-bound-sa-token\") pod \"14dc1b3f-026b-4412-831d-49a1eaa9b470\" (UID: \"14dc1b3f-026b-4412-831d-49a1eaa9b470\") " Apr 23 14:58:47.840419 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:47.840384 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14dc1b3f-026b-4412-831d-49a1eaa9b470-installation-pull-secrets\") pod \"14dc1b3f-026b-4412-831d-49a1eaa9b470\" (UID: \"14dc1b3f-026b-4412-831d-49a1eaa9b470\") " Apr 23 14:58:47.840567 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:47.840427 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4fwg\" (UniqueName: \"kubernetes.io/projected/14dc1b3f-026b-4412-831d-49a1eaa9b470-kube-api-access-j4fwg\") pod \"14dc1b3f-026b-4412-831d-49a1eaa9b470\" (UID: \"14dc1b3f-026b-4412-831d-49a1eaa9b470\") " Apr 
23 14:58:47.840567 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:47.840463 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/14dc1b3f-026b-4412-831d-49a1eaa9b470-image-registry-private-configuration\") pod \"14dc1b3f-026b-4412-831d-49a1eaa9b470\" (UID: \"14dc1b3f-026b-4412-831d-49a1eaa9b470\") " Apr 23 14:58:47.840567 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:47.840520 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14dc1b3f-026b-4412-831d-49a1eaa9b470-ca-trust-extracted\") pod \"14dc1b3f-026b-4412-831d-49a1eaa9b470\" (UID: \"14dc1b3f-026b-4412-831d-49a1eaa9b470\") " Apr 23 14:58:47.840825 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:47.840800 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14dc1b3f-026b-4412-831d-49a1eaa9b470-trusted-ca\") pod \"14dc1b3f-026b-4412-831d-49a1eaa9b470\" (UID: \"14dc1b3f-026b-4412-831d-49a1eaa9b470\") " Apr 23 14:58:47.840925 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:47.840827 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14dc1b3f-026b-4412-831d-49a1eaa9b470-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "14dc1b3f-026b-4412-831d-49a1eaa9b470" (UID: "14dc1b3f-026b-4412-831d-49a1eaa9b470"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:58:47.841090 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:47.841068 2569 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14dc1b3f-026b-4412-831d-49a1eaa9b470-registry-certificates\") on node \"ip-10-0-143-199.ec2.internal\" DevicePath \"\"" Apr 23 14:58:47.841417 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:47.841392 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14dc1b3f-026b-4412-831d-49a1eaa9b470-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "14dc1b3f-026b-4412-831d-49a1eaa9b470" (UID: "14dc1b3f-026b-4412-831d-49a1eaa9b470"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:58:47.842873 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:47.842844 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14dc1b3f-026b-4412-831d-49a1eaa9b470-kube-api-access-j4fwg" (OuterVolumeSpecName: "kube-api-access-j4fwg") pod "14dc1b3f-026b-4412-831d-49a1eaa9b470" (UID: "14dc1b3f-026b-4412-831d-49a1eaa9b470"). InnerVolumeSpecName "kube-api-access-j4fwg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:58:47.842987 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:47.842918 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14dc1b3f-026b-4412-831d-49a1eaa9b470-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "14dc1b3f-026b-4412-831d-49a1eaa9b470" (UID: "14dc1b3f-026b-4412-831d-49a1eaa9b470"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:58:47.842987 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:47.842926 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14dc1b3f-026b-4412-831d-49a1eaa9b470-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "14dc1b3f-026b-4412-831d-49a1eaa9b470" (UID: "14dc1b3f-026b-4412-831d-49a1eaa9b470"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:58:47.843127 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:47.843026 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14dc1b3f-026b-4412-831d-49a1eaa9b470-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "14dc1b3f-026b-4412-831d-49a1eaa9b470" (UID: "14dc1b3f-026b-4412-831d-49a1eaa9b470"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:58:47.843202 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:47.843158 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14dc1b3f-026b-4412-831d-49a1eaa9b470-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "14dc1b3f-026b-4412-831d-49a1eaa9b470" (UID: "14dc1b3f-026b-4412-831d-49a1eaa9b470"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:58:47.848980 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:47.848954 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14dc1b3f-026b-4412-831d-49a1eaa9b470-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "14dc1b3f-026b-4412-831d-49a1eaa9b470" (UID: "14dc1b3f-026b-4412-831d-49a1eaa9b470"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:58:47.942304 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:47.942271 2569 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14dc1b3f-026b-4412-831d-49a1eaa9b470-ca-trust-extracted\") on node \"ip-10-0-143-199.ec2.internal\" DevicePath \"\"" Apr 23 14:58:47.942304 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:47.942297 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14dc1b3f-026b-4412-831d-49a1eaa9b470-trusted-ca\") on node \"ip-10-0-143-199.ec2.internal\" DevicePath \"\"" Apr 23 14:58:47.942304 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:47.942307 2569 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14dc1b3f-026b-4412-831d-49a1eaa9b470-registry-tls\") on node \"ip-10-0-143-199.ec2.internal\" DevicePath \"\"" Apr 23 14:58:47.942304 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:47.942315 2569 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14dc1b3f-026b-4412-831d-49a1eaa9b470-bound-sa-token\") on node \"ip-10-0-143-199.ec2.internal\" DevicePath \"\"" Apr 23 14:58:47.942559 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:47.942326 2569 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14dc1b3f-026b-4412-831d-49a1eaa9b470-installation-pull-secrets\") on node \"ip-10-0-143-199.ec2.internal\" DevicePath \"\"" Apr 23 14:58:47.942559 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:47.942334 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j4fwg\" (UniqueName: \"kubernetes.io/projected/14dc1b3f-026b-4412-831d-49a1eaa9b470-kube-api-access-j4fwg\") on node \"ip-10-0-143-199.ec2.internal\" DevicePath \"\"" Apr 23 14:58:47.942559 ip-10-0-143-199 
kubenswrapper[2569]: I0423 14:58:47.942343 2569 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/14dc1b3f-026b-4412-831d-49a1eaa9b470-image-registry-private-configuration\") on node \"ip-10-0-143-199.ec2.internal\" DevicePath \"\"" Apr 23 14:58:48.298499 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:48.298416 2569 generic.go:358] "Generic (PLEG): container finished" podID="14dc1b3f-026b-4412-831d-49a1eaa9b470" containerID="88f58c1823c57ad2390f7a296ac4f48139d6dcfeea07c1e85b23c5bc90dca78a" exitCode=0 Apr 23 14:58:48.298499 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:48.298464 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6cdd955d68-f2g47" event={"ID":"14dc1b3f-026b-4412-831d-49a1eaa9b470","Type":"ContainerDied","Data":"88f58c1823c57ad2390f7a296ac4f48139d6dcfeea07c1e85b23c5bc90dca78a"} Apr 23 14:58:48.298499 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:48.298486 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6cdd955d68-f2g47" Apr 23 14:58:48.298734 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:48.298507 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6cdd955d68-f2g47" event={"ID":"14dc1b3f-026b-4412-831d-49a1eaa9b470","Type":"ContainerDied","Data":"e98889a56edd66cfc11cf50823585555e02bc4e7052bbd73884a25b3caa1fb5f"} Apr 23 14:58:48.298734 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:48.298523 2569 scope.go:117] "RemoveContainer" containerID="88f58c1823c57ad2390f7a296ac4f48139d6dcfeea07c1e85b23c5bc90dca78a" Apr 23 14:58:48.306408 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:48.306387 2569 scope.go:117] "RemoveContainer" containerID="88f58c1823c57ad2390f7a296ac4f48139d6dcfeea07c1e85b23c5bc90dca78a" Apr 23 14:58:48.306671 ip-10-0-143-199 kubenswrapper[2569]: E0423 14:58:48.306653 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88f58c1823c57ad2390f7a296ac4f48139d6dcfeea07c1e85b23c5bc90dca78a\": container with ID starting with 88f58c1823c57ad2390f7a296ac4f48139d6dcfeea07c1e85b23c5bc90dca78a not found: ID does not exist" containerID="88f58c1823c57ad2390f7a296ac4f48139d6dcfeea07c1e85b23c5bc90dca78a" Apr 23 14:58:48.306737 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:48.306679 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88f58c1823c57ad2390f7a296ac4f48139d6dcfeea07c1e85b23c5bc90dca78a"} err="failed to get container status \"88f58c1823c57ad2390f7a296ac4f48139d6dcfeea07c1e85b23c5bc90dca78a\": rpc error: code = NotFound desc = could not find container \"88f58c1823c57ad2390f7a296ac4f48139d6dcfeea07c1e85b23c5bc90dca78a\": container with ID starting with 88f58c1823c57ad2390f7a296ac4f48139d6dcfeea07c1e85b23c5bc90dca78a not found: ID does not exist" Apr 23 14:58:48.325148 ip-10-0-143-199 kubenswrapper[2569]: I0423 
14:58:48.325121 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6cdd955d68-f2g47"] Apr 23 14:58:48.339989 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:48.339966 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6cdd955d68-f2g47"] Apr 23 14:58:48.771350 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:58:48.771310 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14dc1b3f-026b-4412-831d-49a1eaa9b470" path="/var/lib/kubelet/pods/14dc1b3f-026b-4412-831d-49a1eaa9b470/volumes" Apr 23 14:59:06.347544 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:59:06.347511 2569 generic.go:358] "Generic (PLEG): container finished" podID="b6fcac8f-1e41-4ff0-8acb-e3115431f3e9" containerID="dd927d18dce0852965c4a9aa2f8274d3273bb654678a866bd2361e4a5cee73a5" exitCode=0 Apr 23 14:59:06.347948 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:59:06.347587 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g2wkp" event={"ID":"b6fcac8f-1e41-4ff0-8acb-e3115431f3e9","Type":"ContainerDied","Data":"dd927d18dce0852965c4a9aa2f8274d3273bb654678a866bd2361e4a5cee73a5"} Apr 23 14:59:06.347948 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:59:06.347874 2569 scope.go:117] "RemoveContainer" containerID="dd927d18dce0852965c4a9aa2f8274d3273bb654678a866bd2361e4a5cee73a5" Apr 23 14:59:07.352283 ip-10-0-143-199 kubenswrapper[2569]: I0423 14:59:07.352250 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g2wkp" event={"ID":"b6fcac8f-1e41-4ff0-8acb-e3115431f3e9","Type":"ContainerStarted","Data":"6a1d96a8535c0db2c3e8cb310cb2c6300817b4f5a4b6597dd894e4d9bf56c6d7"} Apr 23 15:01:52.696468 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:01:52.696440 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94tsp_8b31c555-d556-43a6-ab35-99fdd111e5a5/console-operator/1.log" Apr 23 15:01:52.696975 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:01:52.696440 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94tsp_8b31c555-d556-43a6-ab35-99fdd111e5a5/console-operator/1.log" Apr 23 15:01:52.702259 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:01:52.702238 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wgqdj_a2c6ef5a-00fe-42a6-a297-8b67bf27ea78/ovn-acl-logging/0.log" Apr 23 15:01:52.702259 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:01:52.702251 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wgqdj_a2c6ef5a-00fe-42a6-a297-8b67bf27ea78/ovn-acl-logging/0.log" Apr 23 15:01:52.706802 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:01:52.706781 2569 kubelet.go:1628] "Image garbage collection succeeded" Apr 23 15:06:52.720229 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:06:52.720205 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94tsp_8b31c555-d556-43a6-ab35-99fdd111e5a5/console-operator/1.log" Apr 23 15:06:52.720759 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:06:52.720740 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94tsp_8b31c555-d556-43a6-ab35-99fdd111e5a5/console-operator/1.log" Apr 23 15:06:52.730546 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:06:52.730519 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wgqdj_a2c6ef5a-00fe-42a6-a297-8b67bf27ea78/ovn-acl-logging/0.log" Apr 23 15:06:52.730970 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:06:52.730955 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wgqdj_a2c6ef5a-00fe-42a6-a297-8b67bf27ea78/ovn-acl-logging/0.log" Apr 23 15:11:52.744468 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:11:52.744387 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94tsp_8b31c555-d556-43a6-ab35-99fdd111e5a5/console-operator/1.log" Apr 23 15:11:52.746861 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:11:52.746841 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94tsp_8b31c555-d556-43a6-ab35-99fdd111e5a5/console-operator/1.log" Apr 23 15:11:52.752319 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:11:52.752299 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wgqdj_a2c6ef5a-00fe-42a6-a297-8b67bf27ea78/ovn-acl-logging/0.log" Apr 23 15:11:52.760014 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:11:52.759989 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wgqdj_a2c6ef5a-00fe-42a6-a297-8b67bf27ea78/ovn-acl-logging/0.log" Apr 23 15:16:52.774147 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:16:52.774094 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94tsp_8b31c555-d556-43a6-ab35-99fdd111e5a5/console-operator/1.log" Apr 23 15:16:52.775959 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:16:52.775936 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94tsp_8b31c555-d556-43a6-ab35-99fdd111e5a5/console-operator/1.log" Apr 23 15:16:52.779050 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:16:52.779031 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wgqdj_a2c6ef5a-00fe-42a6-a297-8b67bf27ea78/ovn-acl-logging/0.log" Apr 23 15:16:52.780415 ip-10-0-143-199 
kubenswrapper[2569]: I0423 15:16:52.780397 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wgqdj_a2c6ef5a-00fe-42a6-a297-8b67bf27ea78/ovn-acl-logging/0.log" Apr 23 15:21:52.793427 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:21:52.793397 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94tsp_8b31c555-d556-43a6-ab35-99fdd111e5a5/console-operator/1.log" Apr 23 15:21:52.795833 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:21:52.795252 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94tsp_8b31c555-d556-43a6-ab35-99fdd111e5a5/console-operator/1.log" Apr 23 15:21:52.799034 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:21:52.798860 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wgqdj_a2c6ef5a-00fe-42a6-a297-8b67bf27ea78/ovn-acl-logging/0.log" Apr 23 15:21:52.805286 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:21:52.805259 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wgqdj_a2c6ef5a-00fe-42a6-a297-8b67bf27ea78/ovn-acl-logging/0.log" Apr 23 15:26:52.818672 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:26:52.818562 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94tsp_8b31c555-d556-43a6-ab35-99fdd111e5a5/console-operator/1.log" Apr 23 15:26:52.825389 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:26:52.820743 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94tsp_8b31c555-d556-43a6-ab35-99fdd111e5a5/console-operator/1.log" Apr 23 15:26:52.825389 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:26:52.823734 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wgqdj_a2c6ef5a-00fe-42a6-a297-8b67bf27ea78/ovn-acl-logging/0.log"
Apr 23 15:26:52.825668 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:26:52.825648 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wgqdj_a2c6ef5a-00fe-42a6-a297-8b67bf27ea78/ovn-acl-logging/0.log"
Apr 23 15:31:52.838581 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:31:52.838551 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94tsp_8b31c555-d556-43a6-ab35-99fdd111e5a5/console-operator/1.log"
Apr 23 15:31:52.841038 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:31:52.841022 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94tsp_8b31c555-d556-43a6-ab35-99fdd111e5a5/console-operator/1.log"
Apr 23 15:31:52.843875 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:31:52.843858 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wgqdj_a2c6ef5a-00fe-42a6-a297-8b67bf27ea78/ovn-acl-logging/0.log"
Apr 23 15:31:52.846252 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:31:52.846230 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wgqdj_a2c6ef5a-00fe-42a6-a297-8b67bf27ea78/ovn-acl-logging/0.log"
Apr 23 15:36:52.864174 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:36:52.864053 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94tsp_8b31c555-d556-43a6-ab35-99fdd111e5a5/console-operator/1.log"
Apr 23 15:36:52.868093 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:36:52.867219 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94tsp_8b31c555-d556-43a6-ab35-99fdd111e5a5/console-operator/1.log"
Apr 23 15:36:52.869022 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:36:52.869005 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wgqdj_a2c6ef5a-00fe-42a6-a297-8b67bf27ea78/ovn-acl-logging/0.log"
Apr 23 15:36:52.871929 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:36:52.871914 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wgqdj_a2c6ef5a-00fe-42a6-a297-8b67bf27ea78/ovn-acl-logging/0.log"
Apr 23 15:41:52.883496 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:41:52.883399 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94tsp_8b31c555-d556-43a6-ab35-99fdd111e5a5/console-operator/1.log"
Apr 23 15:41:52.887571 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:41:52.886972 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94tsp_8b31c555-d556-43a6-ab35-99fdd111e5a5/console-operator/1.log"
Apr 23 15:41:52.888358 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:41:52.888331 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wgqdj_a2c6ef5a-00fe-42a6-a297-8b67bf27ea78/ovn-acl-logging/0.log"
Apr 23 15:41:52.891673 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:41:52.891658 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wgqdj_a2c6ef5a-00fe-42a6-a297-8b67bf27ea78/ovn-acl-logging/0.log"
Apr 23 15:46:52.903227 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:46:52.903087 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94tsp_8b31c555-d556-43a6-ab35-99fdd111e5a5/console-operator/1.log"
Apr 23 15:46:52.907039 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:46:52.906825 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94tsp_8b31c555-d556-43a6-ab35-99fdd111e5a5/console-operator/1.log"
Apr 23 15:46:52.908420 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:46:52.908404 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wgqdj_a2c6ef5a-00fe-42a6-a297-8b67bf27ea78/ovn-acl-logging/0.log"
Apr 23 15:46:52.911996 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:46:52.911979 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wgqdj_a2c6ef5a-00fe-42a6-a297-8b67bf27ea78/ovn-acl-logging/0.log"
Apr 23 15:48:16.705771 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:16.705739 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-whs7j_dce2b511-080e-42a7-a345-5e616c565b84/global-pull-secret-syncer/0.log"
Apr 23 15:48:16.842510 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:16.842479 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-h6s52_838ed924-f431-4482-a37d-5836e1570c45/konnectivity-agent/0.log"
Apr 23 15:48:16.917869 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:16.917821 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-143-199.ec2.internal_bff84cbb0dcf4f847d21e0c35b203751/haproxy/0.log"
Apr 23 15:48:20.256351 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:20.256273 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-vfgk4_cfa4e428-dfaf-4150-9c9d-cbb0efd324e8/cluster-monitoring-operator/0.log"
Apr 23 15:48:20.426743 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:20.426694 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-b8dvb_93f275de-2255-4067-b78d-3c2eba50e847/node-exporter/0.log"
Apr 23 15:48:20.450684 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:20.450657 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-b8dvb_93f275de-2255-4067-b78d-3c2eba50e847/kube-rbac-proxy/0.log"
Apr 23 15:48:20.473242 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:20.473221 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-b8dvb_93f275de-2255-4067-b78d-3c2eba50e847/init-textfile/0.log"
Apr 23 15:48:22.387881 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:22.387853 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-bdftp_4ffefc2c-3199-4193-9958-9394681000af/networking-console-plugin/0.log"
Apr 23 15:48:22.802363 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:22.802272 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94tsp_8b31c555-d556-43a6-ab35-99fdd111e5a5/console-operator/1.log"
Apr 23 15:48:22.807903 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:22.807871 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-94tsp_8b31c555-d556-43a6-ab35-99fdd111e5a5/console-operator/2.log"
Apr 23 15:48:23.571904 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:23.571869 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-75czz/perf-node-gather-daemonset-kq2v9"]
Apr 23 15:48:23.572324 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:23.572229 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="14dc1b3f-026b-4412-831d-49a1eaa9b470" containerName="registry"
Apr 23 15:48:23.572324 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:23.572241 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="14dc1b3f-026b-4412-831d-49a1eaa9b470" containerName="registry"
Apr 23 15:48:23.572324 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:23.572293 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="14dc1b3f-026b-4412-831d-49a1eaa9b470" containerName="registry"
Apr 23 15:48:23.575234 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:23.575218 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-75czz/perf-node-gather-daemonset-kq2v9"
Apr 23 15:48:23.577822 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:23.577801 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-75czz\"/\"openshift-service-ca.crt\""
Apr 23 15:48:23.578739 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:23.578718 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-75czz\"/\"default-dockercfg-6s449\""
Apr 23 15:48:23.578834 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:23.578728 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-75czz\"/\"kube-root-ca.crt\""
Apr 23 15:48:23.584775 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:23.584751 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-75czz/perf-node-gather-daemonset-kq2v9"]
Apr 23 15:48:23.641456 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:23.641426 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-7885h_01df512f-cd3b-4a38-8f33-2a3002d931ad/volume-data-source-validator/0.log"
Apr 23 15:48:23.737648 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:23.737614 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a33649e4-620d-48dc-a92f-cea0ce738311-podres\") pod \"perf-node-gather-daemonset-kq2v9\" (UID: \"a33649e4-620d-48dc-a92f-cea0ce738311\") " pod="openshift-must-gather-75czz/perf-node-gather-daemonset-kq2v9"
Apr 23 15:48:23.737648 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:23.737655 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a33649e4-620d-48dc-a92f-cea0ce738311-sys\") pod \"perf-node-gather-daemonset-kq2v9\" (UID: \"a33649e4-620d-48dc-a92f-cea0ce738311\") " pod="openshift-must-gather-75czz/perf-node-gather-daemonset-kq2v9"
Apr 23 15:48:23.737857 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:23.737674 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a33649e4-620d-48dc-a92f-cea0ce738311-lib-modules\") pod \"perf-node-gather-daemonset-kq2v9\" (UID: \"a33649e4-620d-48dc-a92f-cea0ce738311\") " pod="openshift-must-gather-75czz/perf-node-gather-daemonset-kq2v9"
Apr 23 15:48:23.737857 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:23.737733 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a33649e4-620d-48dc-a92f-cea0ce738311-proc\") pod \"perf-node-gather-daemonset-kq2v9\" (UID: \"a33649e4-620d-48dc-a92f-cea0ce738311\") " pod="openshift-must-gather-75czz/perf-node-gather-daemonset-kq2v9"
Apr 23 15:48:23.737857 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:23.737791 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njgff\" (UniqueName: \"kubernetes.io/projected/a33649e4-620d-48dc-a92f-cea0ce738311-kube-api-access-njgff\") pod \"perf-node-gather-daemonset-kq2v9\" (UID: \"a33649e4-620d-48dc-a92f-cea0ce738311\") " pod="openshift-must-gather-75czz/perf-node-gather-daemonset-kq2v9"
Apr 23 15:48:23.838696 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:23.838670 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a33649e4-620d-48dc-a92f-cea0ce738311-podres\") pod \"perf-node-gather-daemonset-kq2v9\" (UID: \"a33649e4-620d-48dc-a92f-cea0ce738311\") " pod="openshift-must-gather-75czz/perf-node-gather-daemonset-kq2v9"
Apr 23 15:48:23.838880 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:23.838711 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a33649e4-620d-48dc-a92f-cea0ce738311-sys\") pod \"perf-node-gather-daemonset-kq2v9\" (UID: \"a33649e4-620d-48dc-a92f-cea0ce738311\") " pod="openshift-must-gather-75czz/perf-node-gather-daemonset-kq2v9"
Apr 23 15:48:23.838880 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:23.838730 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a33649e4-620d-48dc-a92f-cea0ce738311-lib-modules\") pod \"perf-node-gather-daemonset-kq2v9\" (UID: \"a33649e4-620d-48dc-a92f-cea0ce738311\") " pod="openshift-must-gather-75czz/perf-node-gather-daemonset-kq2v9"
Apr 23 15:48:23.838880 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:23.838789 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a33649e4-620d-48dc-a92f-cea0ce738311-sys\") pod \"perf-node-gather-daemonset-kq2v9\" (UID: \"a33649e4-620d-48dc-a92f-cea0ce738311\") " pod="openshift-must-gather-75czz/perf-node-gather-daemonset-kq2v9"
Apr 23 15:48:23.838880 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:23.838843 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a33649e4-620d-48dc-a92f-cea0ce738311-proc\") pod \"perf-node-gather-daemonset-kq2v9\" (UID: \"a33649e4-620d-48dc-a92f-cea0ce738311\") " pod="openshift-must-gather-75czz/perf-node-gather-daemonset-kq2v9"
Apr 23 15:48:23.838880 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:23.838864 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a33649e4-620d-48dc-a92f-cea0ce738311-podres\") pod \"perf-node-gather-daemonset-kq2v9\" (UID: \"a33649e4-620d-48dc-a92f-cea0ce738311\") " pod="openshift-must-gather-75czz/perf-node-gather-daemonset-kq2v9"
Apr 23 15:48:23.839085 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:23.838894 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a33649e4-620d-48dc-a92f-cea0ce738311-lib-modules\") pod \"perf-node-gather-daemonset-kq2v9\" (UID: \"a33649e4-620d-48dc-a92f-cea0ce738311\") " pod="openshift-must-gather-75czz/perf-node-gather-daemonset-kq2v9"
Apr 23 15:48:23.839085 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:23.838928 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-njgff\" (UniqueName: \"kubernetes.io/projected/a33649e4-620d-48dc-a92f-cea0ce738311-kube-api-access-njgff\") pod \"perf-node-gather-daemonset-kq2v9\" (UID: \"a33649e4-620d-48dc-a92f-cea0ce738311\") " pod="openshift-must-gather-75czz/perf-node-gather-daemonset-kq2v9"
Apr 23 15:48:23.839085 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:23.838931 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a33649e4-620d-48dc-a92f-cea0ce738311-proc\") pod \"perf-node-gather-daemonset-kq2v9\" (UID: \"a33649e4-620d-48dc-a92f-cea0ce738311\") " pod="openshift-must-gather-75czz/perf-node-gather-daemonset-kq2v9"
Apr 23 15:48:23.848401 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:23.848383 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-njgff\" (UniqueName: \"kubernetes.io/projected/a33649e4-620d-48dc-a92f-cea0ce738311-kube-api-access-njgff\") pod \"perf-node-gather-daemonset-kq2v9\" (UID: \"a33649e4-620d-48dc-a92f-cea0ce738311\") " pod="openshift-must-gather-75czz/perf-node-gather-daemonset-kq2v9"
Apr 23 15:48:23.886123 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:23.886067 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-75czz/perf-node-gather-daemonset-kq2v9"
Apr 23 15:48:24.006706 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:24.006679 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-75czz/perf-node-gather-daemonset-kq2v9"]
Apr 23 15:48:24.009384 ip-10-0-143-199 kubenswrapper[2569]: W0423 15:48:24.009356 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda33649e4_620d_48dc_a92f_cea0ce738311.slice/crio-a8713f8675e2951d37ff6dce501a2cc601fa5da18a6b697c460ff1e93808baeb WatchSource:0}: Error finding container a8713f8675e2951d37ff6dce501a2cc601fa5da18a6b697c460ff1e93808baeb: Status 404 returned error can't find the container with id a8713f8675e2951d37ff6dce501a2cc601fa5da18a6b697c460ff1e93808baeb
Apr 23 15:48:24.010933 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:24.010914 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 15:48:24.381406 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:24.381329 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-stmr6_fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8/dns/0.log"
Apr 23 15:48:24.402795 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:24.402770 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-stmr6_fb0ff8bf-abfa-447e-b6cb-c0cac33c7ea8/kube-rbac-proxy/0.log"
Apr 23 15:48:24.489903 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:24.489877 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4665j_f947ddb6-797b-4afb-a2cf-6c8c70291f6d/dns-node-resolver/0.log"
Apr 23 15:48:24.527616 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:24.527581 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-75czz/perf-node-gather-daemonset-kq2v9" event={"ID":"a33649e4-620d-48dc-a92f-cea0ce738311","Type":"ContainerStarted","Data":"d944c52d6ab99df5a2904d21692d7cd455f108e44ecf7c83a3cd1e8140f2195c"}
Apr 23 15:48:24.527616 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:24.527618 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-75czz/perf-node-gather-daemonset-kq2v9" event={"ID":"a33649e4-620d-48dc-a92f-cea0ce738311","Type":"ContainerStarted","Data":"a8713f8675e2951d37ff6dce501a2cc601fa5da18a6b697c460ff1e93808baeb"}
Apr 23 15:48:24.527826 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:24.527700 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-75czz/perf-node-gather-daemonset-kq2v9"
Apr 23 15:48:24.544268 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:24.544223 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-75czz/perf-node-gather-daemonset-kq2v9" podStartSLOduration=1.544205583 podStartE2EDuration="1.544205583s" podCreationTimestamp="2026-04-23 15:48:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 15:48:24.542532841 +0000 UTC m=+3092.335886019" watchObservedRunningTime="2026-04-23 15:48:24.544205583 +0000 UTC m=+3092.337558749"
Apr 23 15:48:25.009023 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:25.008995 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-bvjj4_2d6354ca-7ee1-46d6-b36a-0a9af17e4cc8/node-ca/0.log"
Apr 23 15:48:25.779708 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:25.779675 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7b45597d46-j8vck_74963a76-1095-4633-aaed-e3687b7d7235/router/0.log"
Apr 23 15:48:26.153473 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:26.153447 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-fjp4b_0b89e5a1-6854-4d53-971b-35ff7c61be50/serve-healthcheck-canary/0.log"
Apr 23 15:48:26.523247 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:26.523168 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-xxhr9_1ab27f30-a65a-421c-a94f-8664b3f00bbc/insights-operator/0.log"
Apr 23 15:48:26.683076 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:26.683043 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bbztr_95190234-70d8-4586-96fc-940b33508aa1/kube-rbac-proxy/0.log"
Apr 23 15:48:26.704331 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:26.704303 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bbztr_95190234-70d8-4586-96fc-940b33508aa1/exporter/0.log"
Apr 23 15:48:26.725756 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:26.725733 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bbztr_95190234-70d8-4586-96fc-940b33508aa1/extractor/0.log"
Apr 23 15:48:30.542671 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:30.542642 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-75czz/perf-node-gather-daemonset-kq2v9"
Apr 23 15:48:31.499764 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:31.499732 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-qgmst_aaf6d463-ca39-434f-8c5f-8c0376af8d82/migrator/0.log"
Apr 23 15:48:31.522609 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:31.522573 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-qgmst_aaf6d463-ca39-434f-8c5f-8c0376af8d82/graceful-termination/0.log"
Apr 23 15:48:31.818887 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:31.818814 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-b7nsn_5ee82d4e-8f5a-4e80-bed4-26f905b2deef/kube-storage-version-migrator-operator/1.log"
Apr 23 15:48:31.819823 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:31.819807 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-b7nsn_5ee82d4e-8f5a-4e80-bed4-26f905b2deef/kube-storage-version-migrator-operator/0.log"
Apr 23 15:48:33.056688 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:33.056660 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jnxqk_25a96c44-032a-41ac-8ed7-c051e3666b8c/kube-multus-additional-cni-plugins/0.log"
Apr 23 15:48:33.079349 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:33.079321 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jnxqk_25a96c44-032a-41ac-8ed7-c051e3666b8c/egress-router-binary-copy/0.log"
Apr 23 15:48:33.100953 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:33.100922 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jnxqk_25a96c44-032a-41ac-8ed7-c051e3666b8c/cni-plugins/0.log"
Apr 23 15:48:33.121782 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:33.121760 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jnxqk_25a96c44-032a-41ac-8ed7-c051e3666b8c/bond-cni-plugin/0.log"
Apr 23 15:48:33.146866 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:33.146846 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jnxqk_25a96c44-032a-41ac-8ed7-c051e3666b8c/routeoverride-cni/0.log"
Apr 23 15:48:33.167555 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:33.167533 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jnxqk_25a96c44-032a-41ac-8ed7-c051e3666b8c/whereabouts-cni-bincopy/0.log"
Apr 23 15:48:33.190001 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:33.189979 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jnxqk_25a96c44-032a-41ac-8ed7-c051e3666b8c/whereabouts-cni/0.log"
Apr 23 15:48:33.221474 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:33.221449 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s4q5x_b6751e3c-d6ad-40a0-9acc-87ab97b33923/kube-multus/0.log"
Apr 23 15:48:33.328854 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:33.328781 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-x9gsg_36b18c47-1676-4fda-b4e6-a7a9acee20a9/network-metrics-daemon/0.log"
Apr 23 15:48:33.352286 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:33.352259 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-x9gsg_36b18c47-1676-4fda-b4e6-a7a9acee20a9/kube-rbac-proxy/0.log"
Apr 23 15:48:34.965642 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:34.965616 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wgqdj_a2c6ef5a-00fe-42a6-a297-8b67bf27ea78/ovn-controller/0.log"
Apr 23 15:48:34.988458 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:34.988430 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wgqdj_a2c6ef5a-00fe-42a6-a297-8b67bf27ea78/ovn-acl-logging/0.log"
Apr 23 15:48:35.002332 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:35.002312 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wgqdj_a2c6ef5a-00fe-42a6-a297-8b67bf27ea78/ovn-acl-logging/1.log"
Apr 23 15:48:35.020564 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:35.020543 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wgqdj_a2c6ef5a-00fe-42a6-a297-8b67bf27ea78/kube-rbac-proxy-node/0.log"
Apr 23 15:48:35.047914 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:35.047892 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wgqdj_a2c6ef5a-00fe-42a6-a297-8b67bf27ea78/kube-rbac-proxy-ovn-metrics/0.log"
Apr 23 15:48:35.070218 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:35.070197 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wgqdj_a2c6ef5a-00fe-42a6-a297-8b67bf27ea78/northd/0.log"
Apr 23 15:48:35.093516 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:35.093496 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wgqdj_a2c6ef5a-00fe-42a6-a297-8b67bf27ea78/nbdb/0.log"
Apr 23 15:48:35.118543 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:35.118528 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wgqdj_a2c6ef5a-00fe-42a6-a297-8b67bf27ea78/sbdb/0.log"
Apr 23 15:48:35.214332 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:35.214307 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wgqdj_a2c6ef5a-00fe-42a6-a297-8b67bf27ea78/ovnkube-controller/0.log"
Apr 23 15:48:36.224718 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:36.224693 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-4mhjz_23156db7-0193-437f-a4c9-bc8b886b91e9/check-endpoints/0.log"
Apr 23 15:48:36.274559 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:36.274532 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-6w94x_56883bac-9d1b-41b2-a97d-66c4b6485777/network-check-target-container/0.log"
Apr 23 15:48:37.229658 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:37.229631 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-84b9q_8849fada-dac8-4194-8755-b18e59197a97/iptables-alerter/0.log"
Apr 23 15:48:38.035208 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:38.035183 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-7bjrs_5ec98e0e-ddcd-4e32-9712-870091ed1acb/tuned/0.log"
Apr 23 15:48:39.810638 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:39.810607 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-k7th2_d47dcf67-0b9f-4953-b0ed-f4bbd07274f3/cluster-samples-operator/0.log"
Apr 23 15:48:39.829572 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:39.829546 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-k7th2_d47dcf67-0b9f-4953-b0ed-f4bbd07274f3/cluster-samples-operator-watch/0.log"
Apr 23 15:48:40.791235 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:40.791205 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-g2wkp_b6fcac8f-1e41-4ff0-8acb-e3115431f3e9/service-ca-operator/1.log"
Apr 23 15:48:40.792162 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:40.792142 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-g2wkp_b6fcac8f-1e41-4ff0-8acb-e3115431f3e9/service-ca-operator/0.log"
Apr 23 15:48:41.141041 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:41.141013 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca_service-ca-865cb79987-xl547_8c0d3f6a-dc3f-468c-b4e4-20234fca5855/service-ca-controller/0.log"
Apr 23 15:48:41.580958 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:41.580890 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-9wnzd_2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f/csi-driver/0.log"
Apr 23 15:48:41.604768 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:41.604739 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-9wnzd_2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f/csi-node-driver-registrar/0.log"
Apr 23 15:48:41.626447 ip-10-0-143-199 kubenswrapper[2569]: I0423 15:48:41.626422 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-9wnzd_2e7dfccb-2c1b-4a01-a837-596ab4bc2e8f/csi-liveness-probe/0.log"