Apr 28 19:13:56.295050 ip-10-0-140-230 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 28 19:13:56.295062 ip-10-0-140-230 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 28 19:13:56.295070 ip-10-0-140-230 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 28 19:13:56.295316 ip-10-0-140-230 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 28 19:14:06.474449 ip-10-0-140-230 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 28 19:14:06.474471 ip-10-0-140-230 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 4f183a39702d48278d25e14a7a2689af --
Apr 28 19:16:20.446527 ip-10-0-140-230 systemd[1]: Starting Kubernetes Kubelet...
Apr 28 19:16:20.901195 ip-10-0-140-230 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 28 19:16:20.901195 ip-10-0-140-230 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 28 19:16:20.901195 ip-10-0-140-230 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 28 19:16:20.901195 ip-10-0-140-230 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 28 19:16:20.901195 ip-10-0-140-230 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 28 19:16:20.902714 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.902567 2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 28 19:16:20.905993 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.905975 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:16:20.906043 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.905998 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:16:20.906043 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906003 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:16:20.906043 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906006 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:16:20.906043 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906019 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:16:20.906043 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906022 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:16:20.906043 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906024 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:16:20.906043 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906027 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:16:20.906043 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906030 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:16:20.906043 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906032 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:16:20.906043 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906035 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:16:20.906043 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906038 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:16:20.906043 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906040 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:16:20.906043 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906043 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:16:20.906043 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906045 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:16:20.906043 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906048 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:16:20.906431 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906052 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:16:20.906431 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906055 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:16:20.906431 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906057 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:16:20.906431 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906060 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:16:20.906431 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906062 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:16:20.906431 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906065 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:16:20.906431 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906068 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:16:20.906431 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906070 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:16:20.906431 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906073 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:16:20.906431 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906075 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:16:20.906431 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906078 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:16:20.906431 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906081 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:16:20.906431 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906083 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:16:20.906431 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906086 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:16:20.906431 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906088 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:16:20.906431 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906091 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:16:20.906431 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906093 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:16:20.906431 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906096 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:16:20.906431 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906098 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:16:20.906977 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906101 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:16:20.906977 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906104 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:16:20.906977 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906112 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:16:20.906977 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906115 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:16:20.906977 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906118 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:16:20.906977 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906120 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:16:20.906977 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906124 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:16:20.906977 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906126 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:16:20.906977 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906129 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:16:20.906977 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906131 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:16:20.906977 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906133 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:16:20.906977 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906136 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:16:20.906977 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906138 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:16:20.906977 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906141 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:16:20.906977 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906144 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:16:20.906977 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906146 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:16:20.906977 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906149 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:16:20.906977 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906151 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:16:20.906977 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906154 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:16:20.906977 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906157 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:16:20.907510 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906159 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:16:20.907510 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906162 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:16:20.907510 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906164 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:16:20.907510 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906167 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:16:20.907510 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906170 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:16:20.907510 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906172 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:16:20.907510 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906175 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:16:20.907510 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906177 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:16:20.907510 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906179 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:16:20.907510 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906182 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:16:20.907510 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906184 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:16:20.907510 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906187 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:16:20.907510 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906189 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:16:20.907510 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906192 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:16:20.907510 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906195 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:16:20.907510 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906215 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:16:20.907510 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906219 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:16:20.907510 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906221 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:16:20.907510 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906224 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:16:20.908239 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906229 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:16:20.908239 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906233 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:16:20.908239 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906236 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:16:20.908239 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906239 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:16:20.908239 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906242 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:16:20.908239 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906245 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:16:20.908239 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906247 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:16:20.908239 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906252 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:16:20.908239 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906256 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:16:20.908239 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906259 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:16:20.908239 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906262 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:16:20.908239 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.906264 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:16:20.908780 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908763 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:16:20.908780 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908779 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:16:20.908833 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908784 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:16:20.908833 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908787 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:16:20.908833 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908790 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:16:20.908833 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908793 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:16:20.908833 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908796 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:16:20.908833 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908798 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:16:20.908833 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908801 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:16:20.908833 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908804 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:16:20.908833 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908807 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:16:20.908833 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908810 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:16:20.908833 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908813 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:16:20.908833 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908815 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:16:20.908833 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908818 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:16:20.908833 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908821 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:16:20.908833 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908823 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:16:20.908833 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908826 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:16:20.908833 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908829 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:16:20.908833 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908832 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:16:20.908833 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908834 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:16:20.908833 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908837 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:16:20.909368 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908840 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:16:20.909368 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908843 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:16:20.909368 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908847 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:16:20.909368 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908850 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:16:20.909368 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908852 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:16:20.909368 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908855 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:16:20.909368 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908858 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:16:20.909368 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908860 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:16:20.909368 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908864 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:16:20.909368 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908867 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:16:20.909368 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908870 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:16:20.909368 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908872 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:16:20.909368 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908875 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:16:20.909368 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908877 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:16:20.909368 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908880 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:16:20.909368 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908882 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:16:20.909368 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908885 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:16:20.909368 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908887 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:16:20.909368 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908890 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:16:20.909368 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908892 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:16:20.909877 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908895 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:16:20.909877 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908898 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:16:20.909877 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908901 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:16:20.909877 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908903 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:16:20.909877 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908906 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:16:20.909877 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908908 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:16:20.909877 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908911 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:16:20.909877 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908913 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:16:20.909877 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908916 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:16:20.909877 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908918 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:16:20.909877 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908921 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:16:20.909877 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908923 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:16:20.909877 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908926 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:16:20.909877 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908930 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:16:20.909877 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908932 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:16:20.909877 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908935 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:16:20.909877 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908937 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:16:20.909877 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908941 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:16:20.909877 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908944 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:16:20.909877 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908946 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:16:20.910389 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908949 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:16:20.910389 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908951 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:16:20.910389 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908954 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:16:20.910389 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908957 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:16:20.910389 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908959 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:16:20.910389 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908962 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:16:20.910389 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908964 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:16:20.910389 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908967 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:16:20.910389 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908969 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:16:20.910389 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908974 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:16:20.910389 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908978 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:16:20.910389 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908981 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:16:20.910389 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908984 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:16:20.910389 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908988 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:16:20.910389 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908991 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:16:20.910389 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.908994 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:16:20.910389 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909005 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:16:20.910389 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909008 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:16:20.910389 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909011 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:16:20.910857 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909014 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:16:20.910857 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909016 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:16:20.910857 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909019 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:16:20.910857 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909022 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:16:20.910857 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909024 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:16:20.910857 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909104 2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 28 19:16:20.910857 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909113 2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 28 19:16:20.910857 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909120 2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 28 19:16:20.910857 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909132 2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 28 19:16:20.910857 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909137 2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 28 19:16:20.910857 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909141 2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 28 19:16:20.910857 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909145 2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 28 19:16:20.910857 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909150 2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 28 19:16:20.910857 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909153 2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 28 19:16:20.910857 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909156 2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 28 19:16:20.910857 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909160 2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 28 19:16:20.910857 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909163 2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 28 19:16:20.910857 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909166 2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 28 19:16:20.910857 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909169 2572 flags.go:64] FLAG: --cgroup-root=""
Apr 28 19:16:20.910857 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909172 2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 28 19:16:20.910857 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909175 2572 flags.go:64] FLAG: --client-ca-file=""
Apr 28 19:16:20.910857 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909178 2572 flags.go:64] FLAG: --cloud-config=""
Apr 28 19:16:20.910857 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909181 2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 28 19:16:20.911442 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909184 2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 28 19:16:20.911442 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909188 2572 flags.go:64] FLAG: --cluster-domain=""
Apr 28 19:16:20.911442 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909191 2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 28 19:16:20.911442 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909195 2572 flags.go:64] FLAG: --config-dir=""
Apr 28 19:16:20.911442 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909212 2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 28 19:16:20.911442 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909216 2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 28 19:16:20.911442 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909224 2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 28 19:16:20.911442 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909236 2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 28 19:16:20.911442 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909239 2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 28 19:16:20.911442 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909242 2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 28 19:16:20.911442 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909245 2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 28 19:16:20.911442 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909249 2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 28 19:16:20.911442 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909252 2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 28 19:16:20.911442 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909255 2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 28 19:16:20.911442 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909259 2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 28 19:16:20.911442 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909264 2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 28 19:16:20.911442 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909268 2572 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 28 19:16:20.911442 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909271 2572 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 28 19:16:20.911442 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909274 2572 flags.go:64] FLAG: --enable-load-reader="false"
Apr 28 19:16:20.911442 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909276 2572 flags.go:64] FLAG: --enable-server="true"
Apr 28 19:16:20.911442 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909279 2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 28 19:16:20.911442 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909295 2572 flags.go:64] FLAG: --event-burst="100"
Apr 28 19:16:20.911442 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909299 2572 flags.go:64] FLAG: --event-qps="50"
Apr 28 19:16:20.911442 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909302 2572 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 28 19:16:20.911442 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909305 2572 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 28 19:16:20.912059 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909308 2572 flags.go:64] FLAG: --eviction-hard=""
Apr 28 19:16:20.912059 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909315 2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 28 19:16:20.912059 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909318 2572 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 28 19:16:20.912059 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909321 2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 28 19:16:20.912059 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909324 2572 flags.go:64] FLAG: --eviction-soft=""
Apr 28 19:16:20.912059 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909327 2572 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 28 19:16:20.912059 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909330 2572 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 28 19:16:20.912059 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909333 2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 28 19:16:20.912059 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909336 2572 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 28 19:16:20.912059 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909339 2572 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 28 19:16:20.912059 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909342 2572 flags.go:64] FLAG: --fail-swap-on="true"
Apr 28 19:16:20.912059 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909345
2572 flags.go:64] FLAG: --feature-gates="" Apr 28 19:16:20.912059 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909349 2572 flags.go:64] FLAG: --file-check-frequency="20s" Apr 28 19:16:20.912059 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909352 2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 28 19:16:20.912059 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909355 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 28 19:16:20.912059 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909364 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 28 19:16:20.912059 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909367 2572 flags.go:64] FLAG: --healthz-port="10248" Apr 28 19:16:20.912059 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909371 2572 flags.go:64] FLAG: --help="false" Apr 28 19:16:20.912059 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909374 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-140-230.ec2.internal" Apr 28 19:16:20.912059 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909377 2572 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 28 19:16:20.912059 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909381 2572 flags.go:64] FLAG: --http-check-frequency="20s" Apr 28 19:16:20.912059 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909384 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 28 19:16:20.912059 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909387 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 28 19:16:20.912059 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909391 2572 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 28 19:16:20.912672 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909394 2572 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 28 19:16:20.912672 
ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909396 2572 flags.go:64] FLAG: --image-service-endpoint="" Apr 28 19:16:20.912672 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909399 2572 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 28 19:16:20.912672 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909402 2572 flags.go:64] FLAG: --kube-api-burst="100" Apr 28 19:16:20.912672 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909405 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 28 19:16:20.912672 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909409 2572 flags.go:64] FLAG: --kube-api-qps="50" Apr 28 19:16:20.912672 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909412 2572 flags.go:64] FLAG: --kube-reserved="" Apr 28 19:16:20.912672 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909415 2572 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 28 19:16:20.912672 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909417 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 28 19:16:20.912672 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909420 2572 flags.go:64] FLAG: --kubelet-cgroups="" Apr 28 19:16:20.912672 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909423 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 28 19:16:20.912672 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909426 2572 flags.go:64] FLAG: --lock-file="" Apr 28 19:16:20.912672 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909429 2572 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 28 19:16:20.912672 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909432 2572 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 28 19:16:20.912672 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909435 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 28 19:16:20.912672 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909440 2572 flags.go:64] FLAG: 
--log-json-split-stream="false" Apr 28 19:16:20.912672 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909443 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 28 19:16:20.912672 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909446 2572 flags.go:64] FLAG: --log-text-split-stream="false" Apr 28 19:16:20.912672 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909449 2572 flags.go:64] FLAG: --logging-format="text" Apr 28 19:16:20.912672 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909452 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 28 19:16:20.912672 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909455 2572 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 28 19:16:20.912672 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909458 2572 flags.go:64] FLAG: --manifest-url="" Apr 28 19:16:20.912672 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909461 2572 flags.go:64] FLAG: --manifest-url-header="" Apr 28 19:16:20.912672 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909466 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 28 19:16:20.912672 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909475 2572 flags.go:64] FLAG: --max-open-files="1000000" Apr 28 19:16:20.913343 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909480 2572 flags.go:64] FLAG: --max-pods="110" Apr 28 19:16:20.913343 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909483 2572 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 28 19:16:20.913343 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909486 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 28 19:16:20.913343 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909489 2572 flags.go:64] FLAG: --memory-manager-policy="None" Apr 28 19:16:20.913343 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909492 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 28 19:16:20.913343 
ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909495 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 28 19:16:20.913343 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909498 2572 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 28 19:16:20.913343 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909501 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 28 19:16:20.913343 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909509 2572 flags.go:64] FLAG: --node-status-max-images="50" Apr 28 19:16:20.913343 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909513 2572 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 28 19:16:20.913343 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909516 2572 flags.go:64] FLAG: --oom-score-adj="-999" Apr 28 19:16:20.913343 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909519 2572 flags.go:64] FLAG: --pod-cidr="" Apr 28 19:16:20.913343 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909521 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 28 19:16:20.913343 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909527 2572 flags.go:64] FLAG: --pod-manifest-path="" Apr 28 19:16:20.913343 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909530 2572 flags.go:64] FLAG: --pod-max-pids="-1" Apr 28 19:16:20.913343 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909533 2572 flags.go:64] FLAG: --pods-per-core="0" Apr 28 19:16:20.913343 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909536 2572 flags.go:64] FLAG: --port="10250" Apr 28 19:16:20.913343 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909539 2572 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 28 19:16:20.913343 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909542 2572 flags.go:64] FLAG: 
--provider-id="aws:///us-east-1a/i-058dadf30012d998e" Apr 28 19:16:20.913343 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909545 2572 flags.go:64] FLAG: --qos-reserved="" Apr 28 19:16:20.913343 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909548 2572 flags.go:64] FLAG: --read-only-port="10255" Apr 28 19:16:20.913343 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909550 2572 flags.go:64] FLAG: --register-node="true" Apr 28 19:16:20.913343 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909554 2572 flags.go:64] FLAG: --register-schedulable="true" Apr 28 19:16:20.913343 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909557 2572 flags.go:64] FLAG: --register-with-taints="" Apr 28 19:16:20.913913 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909560 2572 flags.go:64] FLAG: --registry-burst="10" Apr 28 19:16:20.913913 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909563 2572 flags.go:64] FLAG: --registry-qps="5" Apr 28 19:16:20.913913 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909566 2572 flags.go:64] FLAG: --reserved-cpus="" Apr 28 19:16:20.913913 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909569 2572 flags.go:64] FLAG: --reserved-memory="" Apr 28 19:16:20.913913 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909573 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 28 19:16:20.913913 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909576 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 28 19:16:20.913913 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909579 2572 flags.go:64] FLAG: --rotate-certificates="false" Apr 28 19:16:20.913913 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909582 2572 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 28 19:16:20.913913 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909590 2572 flags.go:64] FLAG: --runonce="false" Apr 28 19:16:20.913913 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909594 2572 flags.go:64] FLAG: 
--runtime-cgroups="/system.slice/crio.service" Apr 28 19:16:20.913913 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909597 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 28 19:16:20.913913 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909600 2572 flags.go:64] FLAG: --seccomp-default="false" Apr 28 19:16:20.913913 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909603 2572 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 28 19:16:20.913913 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909606 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 28 19:16:20.913913 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909609 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 28 19:16:20.913913 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909612 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 28 19:16:20.913913 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909615 2572 flags.go:64] FLAG: --storage-driver-password="root" Apr 28 19:16:20.913913 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909618 2572 flags.go:64] FLAG: --storage-driver-secure="false" Apr 28 19:16:20.913913 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909621 2572 flags.go:64] FLAG: --storage-driver-table="stats" Apr 28 19:16:20.913913 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909624 2572 flags.go:64] FLAG: --storage-driver-user="root" Apr 28 19:16:20.913913 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909626 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 28 19:16:20.913913 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909629 2572 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 28 19:16:20.913913 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909632 2572 flags.go:64] FLAG: --system-cgroups="" Apr 28 19:16:20.913913 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909635 2572 flags.go:64] FLAG: 
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 28 19:16:20.913913 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909641 2572 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 28 19:16:20.914532 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909644 2572 flags.go:64] FLAG: --tls-cert-file="" Apr 28 19:16:20.914532 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909646 2572 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 28 19:16:20.914532 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909651 2572 flags.go:64] FLAG: --tls-min-version="" Apr 28 19:16:20.914532 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909654 2572 flags.go:64] FLAG: --tls-private-key-file="" Apr 28 19:16:20.914532 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909656 2572 flags.go:64] FLAG: --topology-manager-policy="none" Apr 28 19:16:20.914532 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909662 2572 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 28 19:16:20.914532 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909665 2572 flags.go:64] FLAG: --topology-manager-scope="container" Apr 28 19:16:20.914532 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909668 2572 flags.go:64] FLAG: --v="2" Apr 28 19:16:20.914532 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909673 2572 flags.go:64] FLAG: --version="false" Apr 28 19:16:20.914532 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909677 2572 flags.go:64] FLAG: --vmodule="" Apr 28 19:16:20.914532 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909682 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 28 19:16:20.914532 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.909685 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 28 19:16:20.914532 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909782 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 28 19:16:20.914532 
ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909786 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 28 19:16:20.914532 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909789 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 28 19:16:20.914532 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909792 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 28 19:16:20.914532 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909803 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 28 19:16:20.914532 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909807 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 28 19:16:20.914532 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909810 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 28 19:16:20.914532 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909813 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 28 19:16:20.914532 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909816 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 28 19:16:20.914532 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909823 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 28 19:16:20.915098 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909825 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 28 19:16:20.915098 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909828 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 28 19:16:20.915098 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909831 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 28 19:16:20.915098 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909833 2572 
feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 28 19:16:20.915098 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909836 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 28 19:16:20.915098 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909839 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 28 19:16:20.915098 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909841 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 28 19:16:20.915098 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909844 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 28 19:16:20.915098 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909847 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 28 19:16:20.915098 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909849 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 28 19:16:20.915098 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909852 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 28 19:16:20.915098 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909854 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 28 19:16:20.915098 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909857 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 28 19:16:20.915098 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909859 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 28 19:16:20.915098 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909862 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 28 19:16:20.915098 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909870 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 28 19:16:20.915098 ip-10-0-140-230 kubenswrapper[2572]: 
W0428 19:16:20.909873 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 28 19:16:20.915098 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909876 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 28 19:16:20.915098 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909878 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 28 19:16:20.915098 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909881 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 28 19:16:20.915098 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909884 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 28 19:16:20.915658 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909887 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 28 19:16:20.915658 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909890 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 28 19:16:20.915658 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909892 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 28 19:16:20.915658 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909895 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 28 19:16:20.915658 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909897 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 28 19:16:20.915658 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909901 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 28 19:16:20.915658 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909911 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 28 19:16:20.915658 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909913 2572 feature_gate.go:328] unrecognized feature 
gate: GCPCustomAPIEndpointsInstall Apr 28 19:16:20.915658 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909916 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 28 19:16:20.915658 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909919 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 28 19:16:20.915658 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909921 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 28 19:16:20.915658 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909924 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 28 19:16:20.915658 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909926 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 28 19:16:20.915658 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909929 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 28 19:16:20.915658 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909931 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 28 19:16:20.915658 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909933 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 28 19:16:20.915658 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909936 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 28 19:16:20.915658 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909938 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 28 19:16:20.915658 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909941 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 28 19:16:20.916145 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909943 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 28 19:16:20.916145 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909946 2572 
feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 28 19:16:20.916145 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909949 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 28 19:16:20.916145 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909952 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 28 19:16:20.916145 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909954 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 28 19:16:20.916145 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909957 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 28 19:16:20.916145 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909961 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 28 19:16:20.916145 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909965 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 28 19:16:20.916145 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909968 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 28 19:16:20.916145 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909971 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 28 19:16:20.916145 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909973 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 28 19:16:20.916145 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909976 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 28 19:16:20.916145 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909978 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 28 19:16:20.916145 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909981 2572 feature_gate.go:328] unrecognized feature gate: 
InsightsOnDemandDataGather Apr 28 19:16:20.916145 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909983 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 28 19:16:20.916145 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909986 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 28 19:16:20.916145 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909988 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 28 19:16:20.916145 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909991 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 28 19:16:20.916145 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909994 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 28 19:16:20.916706 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.909997 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 28 19:16:20.916706 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.910005 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 28 19:16:20.916706 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.910007 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 28 19:16:20.916706 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.910010 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 28 19:16:20.916706 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.910013 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 28 19:16:20.916706 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.910015 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 28 19:16:20.916706 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.910018 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 28 19:16:20.916706 ip-10-0-140-230 kubenswrapper[2572]: 
W0428 19:16:20.910020 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 28 19:16:20.916706 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.910023 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 28 19:16:20.916706 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.910025 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 28 19:16:20.916706 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.910028 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 28 19:16:20.916706 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.910030 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 28 19:16:20.916706 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.910033 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 28 19:16:20.916706 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.910035 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 28 19:16:20.916706 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.910038 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 28 19:16:20.916706 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.910040 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 28 19:16:20.916706 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.910043 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 28 19:16:20.917161 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.910706 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true 
TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 28 19:16:20.919257 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.919236 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 28 19:16:20.919257 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.919257 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 28 19:16:20.919327 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919307 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:16:20.919327 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919312 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:16:20.919327 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919320 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:16:20.919327 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919324 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:16:20.919327 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919327 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:16:20.919458 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919330 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:16:20.919458 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919334 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:16:20.919458 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919337 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:16:20.919458 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919340 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:16:20.919458 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919343 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:16:20.919458 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919345 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:16:20.919458 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919348 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:16:20.919458 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919351 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:16:20.919458 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919354 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:16:20.919458 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919357 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:16:20.919458 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919360 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:16:20.919458 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919362 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:16:20.919458 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919365 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:16:20.919458 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919367 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:16:20.919458 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919370 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:16:20.919458 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919373 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:16:20.919458 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919375 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:16:20.919458 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919378 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:16:20.919458 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919380 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:16:20.919458 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919383 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:16:20.919971 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919386 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:16:20.919971 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919388 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:16:20.919971 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919391 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:16:20.919971 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919393 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:16:20.919971 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919396 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:16:20.919971 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919398 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:16:20.919971 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919401 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:16:20.919971 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919403 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:16:20.919971 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919406 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:16:20.919971 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919409 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:16:20.919971 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919411 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:16:20.919971 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919414 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:16:20.919971 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919417 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:16:20.919971 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919420 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:16:20.919971 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919422 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:16:20.919971 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919425 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:16:20.919971 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919428 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:16:20.919971 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919430 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:16:20.919971 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919433 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:16:20.919971 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919435 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:16:20.920557 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919438 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:16:20.920557 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919442 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:16:20.920557 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919444 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:16:20.920557 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919447 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:16:20.920557 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919450 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:16:20.920557 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919452 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:16:20.920557 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919454 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:16:20.920557 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919457 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:16:20.920557 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919459 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:16:20.920557 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919462 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:16:20.920557 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919465 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:16:20.920557 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919468 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:16:20.920557 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919470 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:16:20.920557 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919473 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:16:20.920557 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919475 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:16:20.920557 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919478 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:16:20.920557 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919480 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:16:20.920557 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919483 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:16:20.920557 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919485 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:16:20.920557 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919488 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:16:20.921057 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919492 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:16:20.921057 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919496 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:16:20.921057 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919499 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:16:20.921057 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919502 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:16:20.921057 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919505 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:16:20.921057 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919508 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:16:20.921057 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919511 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:16:20.921057 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919513 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:16:20.921057 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919516 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:16:20.921057 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919520 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:16:20.921057 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919524 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:16:20.921057 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919527 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:16:20.921057 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919529 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:16:20.921057 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919532 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:16:20.921057 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919535 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:16:20.921057 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919538 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:16:20.921057 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919540 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:16:20.921057 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919543 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:16:20.921547 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919545 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:16:20.921547 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919548 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:16:20.921547 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919550 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:16:20.921547 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.919556 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 28 19:16:20.921547 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919660 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:16:20.921547 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919665 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:16:20.921547 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919668 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:16:20.921547 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919671 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:16:20.921547 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919673 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:16:20.921547 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919676 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:16:20.921547 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919678 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:16:20.921547 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919681 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:16:20.921547 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919683 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:16:20.921547 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919686 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:16:20.921547 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919689 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:16:20.921930 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919692 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:16:20.921930 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919695 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:16:20.921930 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919698 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:16:20.921930 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919700 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:16:20.921930 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919703 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:16:20.921930 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919706 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:16:20.921930 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919708 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:16:20.921930 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919711 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:16:20.921930 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919714 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:16:20.921930 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919716 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:16:20.921930 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919718 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:16:20.921930 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919721 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:16:20.921930 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919724 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:16:20.921930 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919728 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:16:20.921930 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919732 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:16:20.921930 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919735 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:16:20.921930 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919737 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:16:20.921930 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919740 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:16:20.921930 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919743 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:16:20.921930 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919745 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:16:20.922485 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919748 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:16:20.922485 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919750 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:16:20.922485 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919753 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:16:20.922485 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919755 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:16:20.922485 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919758 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:16:20.922485 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919761 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:16:20.922485 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919763 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:16:20.922485 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919766 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:16:20.922485 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919768 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:16:20.922485 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919770 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:16:20.922485 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919773 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:16:20.922485 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919775 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:16:20.922485 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919778 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:16:20.922485 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919781 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:16:20.922485 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919783 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:16:20.922485 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919786 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:16:20.922485 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919788 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:16:20.922485 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919791 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:16:20.922485 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919794 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:16:20.922485 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919796 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:16:20.922983 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919799 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:16:20.922983 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919801 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:16:20.922983 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919804 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:16:20.922983 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919806 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:16:20.922983 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919809 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:16:20.922983 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919811 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:16:20.922983 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919814 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:16:20.922983 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919818 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:16:20.922983 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919821 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:16:20.922983 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919824 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:16:20.922983 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919826 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:16:20.922983 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919829 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:16:20.922983 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919832 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:16:20.922983 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919835 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:16:20.922983 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919837 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:16:20.922983 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919840 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:16:20.922983 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919843 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:16:20.922983 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919846 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:16:20.922983 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919848 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:16:20.923477 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919851 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:16:20.923477 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919853 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:16:20.923477 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919856 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:16:20.923477 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919858 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:16:20.923477 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919861 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:16:20.923477 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919864 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:16:20.923477 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919866 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:16:20.923477 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919869 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:16:20.923477 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919872 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:16:20.923477 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919874 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:16:20.923477 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919877 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:16:20.923477 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919879 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:16:20.923477 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919882 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:16:20.923477 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919884 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:16:20.923477 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919887 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:16:20.923477 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:20.919889 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:16:20.923870 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.919894 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 28 19:16:20.923870 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.920619 2572 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 28 19:16:20.923994 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.923979 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 28 19:16:20.925102 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.925090 2572 server.go:1019] "Starting client certificate rotation"
Apr 28 19:16:20.925227 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.925184 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 28 19:16:20.925305 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.925263 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 28 19:16:20.952648 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.952627 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 28 19:16:20.956901 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.956879 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 28 19:16:20.974611 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.974586 2572 log.go:25] "Validated CRI v1 runtime API"
Apr 28 19:16:20.983254 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.983231 2572 log.go:25] "Validated CRI v1 image API"
Apr 28 19:16:20.984969 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.984948 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 28 19:16:20.986297 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.986279 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 28 19:16:20.989619 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.989594 2572 fs.go:135] Filesystem UUIDs: map[284c8921-8df5-4cc2-9f58-8214040f9f3a:/dev/nvme0n1p3 2c87f9b5-0ce0-44cb-9675-08e308fae4a7:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 28 19:16:20.989708 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.989617 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 28 19:16:20.995576 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.995460 2572 manager.go:217] Machine: {Timestamp:2026-04-28 19:16:20.99341444 +0000 UTC m=+0.420915049 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3103913 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec22009f4275257e58288ea31ae3d696 SystemUUID:ec22009f-4275-257e-5828-8ea31ae3d696 BootID:4f183a39-702d-4827-8d25-e14a7a2689af Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:6b:96:22:f8:dd Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:6b:96:22:f8:dd Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ee:20:dc:3f:9a:37 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 28 19:16:20.995576 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.995570 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 28 19:16:20.995722 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.995709 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 28 19:16:20.996813 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.996780 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 28 19:16:20.996965 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.996815 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-140-230.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 28 19:16:20.997014 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.996975 2572 topology_manager.go:138] "Creating topology manager with none policy"
Apr 28 19:16:20.997014 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.996983 2572 container_manager_linux.go:306] "Creating device plugin manager"
Apr 28 19:16:20.997014 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.996996 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 28 19:16:20.998470 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.998457 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 28 19:16:20.999356 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.999344 2572 state_mem.go:36] "Initialized new in-memory state store"
Apr 28 19:16:20.999652 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:20.999640 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 28 19:16:21.003224 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.003208 2572 kubelet.go:491] "Attempting to sync node with API server"
Apr 28 19:16:21.003271 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.003234 2572 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 28 19:16:21.003271 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.003257 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 28 19:16:21.003271 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.003271 2572 kubelet.go:397] "Adding apiserver pod source"
Apr 28 19:16:21.003356 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.003283 2572 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 28 19:16:21.004434 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.004420 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 28 19:16:21.004488 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.004441 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 28 19:16:21.007726 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.007708 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 28 19:16:21.009170 ip-10-0-140-230
kubenswrapper[2572]: I0428 19:16:21.009145 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 28 19:16:21.011409 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.011392 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 28 19:16:21.011487 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.011415 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 28 19:16:21.011487 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.011424 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 28 19:16:21.011487 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.011432 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 28 19:16:21.011487 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.011441 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 28 19:16:21.011487 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.011450 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 28 19:16:21.011487 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.011476 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 28 19:16:21.011487 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.011485 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 28 19:16:21.011736 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.011496 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 28 19:16:21.011736 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.011505 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 28 19:16:21.011736 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.011532 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 28 
19:16:21.011736 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.011556 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 28 19:16:21.013579 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.013560 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-79gzg" Apr 28 19:16:21.013712 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.013700 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 28 19:16:21.013760 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.013719 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 28 19:16:21.014782 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:21.014759 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-140-230.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 28 19:16:21.014782 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:21.014759 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 28 19:16:21.018218 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.018190 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 28 19:16:21.018323 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.018251 2572 server.go:1295] "Started kubelet" Apr 28 19:16:21.018411 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.018359 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 28 19:16:21.018454 ip-10-0-140-230 kubenswrapper[2572]: I0428 
19:16:21.018382 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 28 19:16:21.018454 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.018436 2572 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 28 19:16:21.018655 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.018636 2572 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-230.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 28 19:16:21.019246 ip-10-0-140-230 systemd[1]: Started Kubernetes Kubelet. Apr 28 19:16:21.019698 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.019503 2572 server.go:317] "Adding debug handlers to kubelet server" Apr 28 19:16:21.019822 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.019585 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 28 19:16:21.024246 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.024219 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-79gzg" Apr 28 19:16:21.026787 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.026762 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 28 19:16:21.026787 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.026767 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 28 19:16:21.027585 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.027572 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 28 19:16:21.027832 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.027820 2572 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 28 19:16:21.027934 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.027911 2572 factory.go:55] Registering systemd 
factory Apr 28 19:16:21.028063 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.028042 2572 factory.go:223] Registration of the systemd container factory successfully Apr 28 19:16:21.028147 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.027922 2572 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 28 19:16:21.028374 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.028360 2572 reconstruct.go:97] "Volume reconstruction finished" Apr 28 19:16:21.028374 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.028374 2572 reconciler.go:26] "Reconciler: start to sync state" Apr 28 19:16:21.028809 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:21.025142 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-230.ec2.internal.18aa9b525e06e25f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-230.ec2.internal,UID:ip-10-0-140-230.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-140-230.ec2.internal,},FirstTimestamp:2026-04-28 19:16:21.018215007 +0000 UTC m=+0.445715618,LastTimestamp:2026-04-28 19:16:21.018215007 +0000 UTC m=+0.445715618,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-230.ec2.internal,}" Apr 28 19:16:21.028915 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.028895 2572 factory.go:153] Registering CRI-O factory Apr 28 19:16:21.028968 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.028923 2572 factory.go:223] Registration of the crio container factory successfully Apr 28 19:16:21.029035 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.029022 2572 factory.go:221] Registration of the 
containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 28 19:16:21.029100 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.029056 2572 factory.go:103] Registering Raw factory Apr 28 19:16:21.029100 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.029073 2572 manager.go:1196] Started watching for new ooms in manager Apr 28 19:16:21.029706 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.029677 2572 manager.go:319] Starting recovery of all containers Apr 28 19:16:21.029907 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:21.029888 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-230.ec2.internal\" not found" Apr 28 19:16:21.031756 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:21.031732 2572 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 28 19:16:21.032647 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.032627 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 28 19:16:21.038596 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:21.038362 2572 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-140-230.ec2.internal\" not found" node="ip-10-0-140-230.ec2.internal" Apr 28 19:16:21.040639 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.040620 2572 manager.go:324] Recovery completed Apr 28 19:16:21.045011 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.044997 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:16:21.047489 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.047473 2572 kubelet_node_status.go:736] "Recording event message 
for node" node="ip-10-0-140-230.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:16:21.047554 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.047503 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-230.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:16:21.047554 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.047514 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-230.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:16:21.048012 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.047995 2572 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 28 19:16:21.048012 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.048010 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 28 19:16:21.048090 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.048029 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 28 19:16:21.050484 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.050470 2572 policy_none.go:49] "None policy: Start" Apr 28 19:16:21.050535 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.050488 2572 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 28 19:16:21.050535 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.050500 2572 state_mem.go:35] "Initializing new in-memory state store" Apr 28 19:16:21.090956 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.090941 2572 manager.go:341] "Starting Device Plugin manager" Apr 28 19:16:21.114186 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:21.091029 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 28 19:16:21.114186 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.091044 2572 server.go:85] "Starting device plugin registration server" Apr 28 19:16:21.114186 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.091308 2572 eviction_manager.go:189] "Eviction manager: starting 
control loop" Apr 28 19:16:21.114186 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.091319 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 28 19:16:21.114186 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.091417 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 28 19:16:21.114186 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.091497 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 28 19:16:21.114186 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.091505 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 28 19:16:21.114186 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:21.092035 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 28 19:16:21.114186 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:21.092069 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-140-230.ec2.internal\" not found" Apr 28 19:16:21.161304 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.161226 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 28 19:16:21.162473 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.162456 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 28 19:16:21.162540 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.162484 2572 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 28 19:16:21.162540 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.162503 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 28 19:16:21.162540 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.162509 2572 kubelet.go:2451] "Starting kubelet main sync loop" Apr 28 19:16:21.162668 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:21.162547 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 28 19:16:21.165147 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.165126 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 28 19:16:21.191910 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.191887 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:16:21.193174 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.193158 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-230.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:16:21.193278 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.193191 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-230.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:16:21.193278 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.193222 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-230.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:16:21.193278 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.193256 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-230.ec2.internal" Apr 28 19:16:21.201844 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.201829 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-140-230.ec2.internal" Apr 28 19:16:21.201907 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:21.201853 2572 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-140-230.ec2.internal\": node \"ip-10-0-140-230.ec2.internal\" not found" Apr 28 
19:16:21.218814 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:21.218787 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-230.ec2.internal\" not found" Apr 28 19:16:21.262939 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.262905 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-230.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-140-230.ec2.internal"] Apr 28 19:16:21.263080 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.262987 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:16:21.263930 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.263914 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-230.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:16:21.264030 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.263959 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-230.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:16:21.264030 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.263976 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-230.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:16:21.265245 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.265230 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:16:21.265370 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.265354 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-230.ec2.internal" Apr 28 19:16:21.265416 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.265386 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:16:21.265969 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.265955 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-230.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:16:21.266046 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.265967 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-230.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:16:21.266046 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.265981 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-230.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:16:21.266046 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.265991 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-230.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:16:21.266046 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.265995 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-230.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:16:21.266046 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.266005 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-230.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:16:21.267774 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.267751 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-230.ec2.internal" Apr 28 19:16:21.267866 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.267797 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:16:21.269007 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.268991 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-230.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:16:21.269098 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.269017 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-230.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:16:21.269098 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.269029 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-230.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:16:21.296876 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:21.296851 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-230.ec2.internal\" not found" node="ip-10-0-140-230.ec2.internal" Apr 28 19:16:21.301238 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:21.301222 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-230.ec2.internal\" not found" node="ip-10-0-140-230.ec2.internal" Apr 28 19:16:21.319666 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:21.319646 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-230.ec2.internal\" not found" Apr 28 19:16:21.329612 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.329590 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/12ae3e9fbde7a26e5bd78fbf1749835b-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-140-230.ec2.internal\" (UID: \"12ae3e9fbde7a26e5bd78fbf1749835b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-230.ec2.internal" Apr 28 19:16:21.419930 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:21.419845 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-230.ec2.internal\" not found" Apr 28 19:16:21.430296 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.430261 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/12ae3e9fbde7a26e5bd78fbf1749835b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-230.ec2.internal\" (UID: \"12ae3e9fbde7a26e5bd78fbf1749835b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-230.ec2.internal" Apr 28 19:16:21.430296 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.430284 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/12ae3e9fbde7a26e5bd78fbf1749835b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-230.ec2.internal\" (UID: \"12ae3e9fbde7a26e5bd78fbf1749835b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-230.ec2.internal" Apr 28 19:16:21.430448 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.430330 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/12ae3e9fbde7a26e5bd78fbf1749835b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-230.ec2.internal\" (UID: \"12ae3e9fbde7a26e5bd78fbf1749835b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-230.ec2.internal" Apr 28 19:16:21.430448 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.430349 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/77fe4b4b44e63783dd24b8c6f6bea437-config\") pod \"kube-apiserver-proxy-ip-10-0-140-230.ec2.internal\" (UID: \"77fe4b4b44e63783dd24b8c6f6bea437\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-230.ec2.internal" Apr 28 19:16:21.520715 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:21.520685 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-230.ec2.internal\" not found" Apr 28 19:16:21.531025 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.531004 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/12ae3e9fbde7a26e5bd78fbf1749835b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-230.ec2.internal\" (UID: \"12ae3e9fbde7a26e5bd78fbf1749835b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-230.ec2.internal" Apr 28 19:16:21.531112 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.531065 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/77fe4b4b44e63783dd24b8c6f6bea437-config\") pod \"kube-apiserver-proxy-ip-10-0-140-230.ec2.internal\" (UID: \"77fe4b4b44e63783dd24b8c6f6bea437\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-230.ec2.internal" Apr 28 19:16:21.531112 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.531088 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/77fe4b4b44e63783dd24b8c6f6bea437-config\") pod \"kube-apiserver-proxy-ip-10-0-140-230.ec2.internal\" (UID: \"77fe4b4b44e63783dd24b8c6f6bea437\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-230.ec2.internal" Apr 28 19:16:21.531182 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.531111 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/12ae3e9fbde7a26e5bd78fbf1749835b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-230.ec2.internal\" (UID: \"12ae3e9fbde7a26e5bd78fbf1749835b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-230.ec2.internal"
Apr 28 19:16:21.599265 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.599235 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-230.ec2.internal"
Apr 28 19:16:21.603825 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.603804 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-230.ec2.internal"
Apr 28 19:16:21.621596 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:21.621564 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-230.ec2.internal\" not found"
Apr 28 19:16:21.722279 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:21.722159 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-230.ec2.internal\" not found"
Apr 28 19:16:21.822724 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:21.822693 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-230.ec2.internal\" not found"
Apr 28 19:16:21.923248 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:21.923194 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-230.ec2.internal\" not found"
Apr 28 19:16:21.925346 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.925324 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 28 19:16:21.925472 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.925454 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 28 19:16:21.925508 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:21.925485 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 28 19:16:22.024076 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:22.023984 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-230.ec2.internal\" not found"
Apr 28 19:16:22.026158 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:22.026129 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-27 19:11:21 +0000 UTC" deadline="2027-12-08 06:12:43.999414495 +0000 UTC"
Apr 28 19:16:22.026247 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:22.026159 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14122h56m21.97325947s"
Apr 28 19:16:22.027258 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:22.027243 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 28 19:16:22.054656 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:22.054632 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 28 19:16:22.055179 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:22.055161 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 28 19:16:22.083530 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:22.083502 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-gk4gs"
Apr 28 19:16:22.089497 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:22.089474 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-gk4gs"
Apr 28 19:16:22.127196 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:22.127169 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-230.ec2.internal"
Apr 28 19:16:22.140522 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:22.140498 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 28 19:16:22.142798 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:22.142776 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-230.ec2.internal"
Apr 28 19:16:22.151926 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:22.151899 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 28 19:16:22.161039 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:22.160995 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77fe4b4b44e63783dd24b8c6f6bea437.slice/crio-d2f2c390377a4b4f83234ae2cf5f050f2d04328093b4e4c91585acf7680b4d94 WatchSource:0}: Error finding container d2f2c390377a4b4f83234ae2cf5f050f2d04328093b4e4c91585acf7680b4d94: Status 404 returned error can't find the container with id d2f2c390377a4b4f83234ae2cf5f050f2d04328093b4e4c91585acf7680b4d94
Apr 28 19:16:22.161503 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:22.161484 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12ae3e9fbde7a26e5bd78fbf1749835b.slice/crio-6fe613dae8c763a17adc3a80f1cfb26e5030eb46863ae0b0e72c0397b2959a5c WatchSource:0}: Error finding container 6fe613dae8c763a17adc3a80f1cfb26e5030eb46863ae0b0e72c0397b2959a5c: Status 404 returned error can't find the container with id 6fe613dae8c763a17adc3a80f1cfb26e5030eb46863ae0b0e72c0397b2959a5c
Apr 28 19:16:22.165699 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:22.165656 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-230.ec2.internal" event={"ID":"12ae3e9fbde7a26e5bd78fbf1749835b","Type":"ContainerStarted","Data":"6fe613dae8c763a17adc3a80f1cfb26e5030eb46863ae0b0e72c0397b2959a5c"}
Apr 28 19:16:22.166109 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:22.166093 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 28 19:16:22.166797 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:22.166726 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-230.ec2.internal" event={"ID":"77fe4b4b44e63783dd24b8c6f6bea437","Type":"ContainerStarted","Data":"d2f2c390377a4b4f83234ae2cf5f050f2d04328093b4e4c91585acf7680b4d94"}
Apr 28 19:16:22.539536 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:22.539506 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 28 19:16:22.913638 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:22.913548 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 28 19:16:22.997008 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:22.996975 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 28 19:16:23.004057 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.004037 2572 apiserver.go:52] "Watching apiserver"
Apr 28 19:16:23.011065 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.011039 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 28 19:16:23.012114 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.012087 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-pkjxs","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cqhfh","openshift-image-registry/node-ca-4clws","openshift-multus/multus-additional-cni-plugins-sbv79","openshift-multus/multus-h56jw","openshift-network-diagnostics/network-check-target-t8qc5","openshift-network-operator/iptables-alerter-vlrjx","openshift-ovn-kubernetes/ovnkube-node-wv7qh","kube-system/kube-apiserver-proxy-ip-10-0-140-230.ec2.internal","openshift-cluster-node-tuning-operator/tuned-v498x","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-230.ec2.internal","openshift-multus/network-metrics-daemon-ggtvc"]
Apr 28 19:16:23.014663 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.014637 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-pkjxs"
Apr 28 19:16:23.016176 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.016154 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cqhfh"
Apr 28 19:16:23.016285 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.016184 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4clws"
Apr 28 19:16:23.017625 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.017603 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-sbv79"
Apr 28 19:16:23.019129 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.018921 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-h56jw"
Apr 28 19:16:23.020022 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.020001 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t8qc5"
Apr 28 19:16:23.020118 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:23.020087 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t8qc5" podUID="d7e741f6-720c-42da-b861-0d9702bff94d"
Apr 28 19:16:23.021240 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.021221 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-vlrjx"
Apr 28 19:16:23.022702 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.022340 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-wgwbw\""
Apr 28 19:16:23.022986 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.022966 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-plvh4\""
Apr 28 19:16:23.023919 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.023269 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 28 19:16:23.023919 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.023498 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 28 19:16:23.023919 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.023551 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 28 19:16:23.025079 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.025059 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-v498x"
Apr 28 19:16:23.025529 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.025473 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.026119 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.026087 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 28 19:16:23.026243 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.026135 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 28 19:16:23.026243 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.026092 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 28 19:16:23.026243 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.026191 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 28 19:16:23.026427 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.026286 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 28 19:16:23.026427 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.026308 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-m9bsx\""
Apr 28 19:16:23.026427 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.026388 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 28 19:16:23.026607 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.026447 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 28 19:16:23.026674 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.026626 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 28 19:16:23.026934 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.026914 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-pjj6k\""
Apr 28 19:16:23.027009 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.026963 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 28 19:16:23.027009 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.026926 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 28 19:16:23.027146 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.026987 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 28 19:16:23.027146 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.027097 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ggtvc"
Apr 28 19:16:23.027146 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.027121 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 28 19:16:23.027146 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.027124 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-lqs9h\""
Apr 28 19:16:23.027146 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.027146 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 28 19:16:23.027437 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.027164 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 28 19:16:23.027437 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:23.027176 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ggtvc" podUID="0a2d2c94-3731-45d6-be85-14a1a081468a"
Apr 28 19:16:23.028637 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.028618 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 28 19:16:23.031599 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.031569 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-5nmfb\""
Apr 28 19:16:23.032096 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.032078 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 28 19:16:23.032190 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.032153 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 28 19:16:23.032190 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.032161 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 28 19:16:23.032417 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.032394 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 28 19:16:23.032495 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.032417 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-jc7mx\""
Apr 28 19:16:23.032495 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.032401 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-4xdc4\""
Apr 28 19:16:23.032596 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.032504 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 28 19:16:23.032722 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.032695 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 28 19:16:23.032796 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.032699 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 28 19:16:23.035766 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.035747 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 28 19:16:23.038934 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.038875 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-node-log\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.038934 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.038908 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-ovnkube-script-lib\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.039084 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.038937 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/931659cd-b67a-416b-93b2-fc1c18e4a16e-etc-modprobe-d\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x"
Apr 28 19:16:23.039084 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.038963 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/931659cd-b67a-416b-93b2-fc1c18e4a16e-etc-sysctl-d\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x"
Apr 28 19:16:23.039084 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.038980 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9136f542-e29a-475e-9ef4-a5653b964224-system-cni-dir\") pod \"multus-additional-cni-plugins-sbv79\" (UID: \"9136f542-e29a-475e-9ef4-a5653b964224\") " pod="openshift-multus/multus-additional-cni-plugins-sbv79"
Apr 28 19:16:23.039084 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.039001 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f8cecbc4-1ccf-49b8-bd5f-a126bd910b04-agent-certs\") pod \"konnectivity-agent-pkjxs\" (UID: \"f8cecbc4-1ccf-49b8-bd5f-a126bd910b04\") " pod="kube-system/konnectivity-agent-pkjxs"
Apr 28 19:16:23.039084 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.039023 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/931659cd-b67a-416b-93b2-fc1c18e4a16e-etc-sysctl-conf\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x"
Apr 28 19:16:23.039084 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.039057 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5dbdaee-eba8-485f-88bf-e781ddd5de4d-host\") pod \"node-ca-4clws\" (UID: \"f5dbdaee-eba8-485f-88bf-e781ddd5de4d\") " pod="openshift-image-registry/node-ca-4clws"
Apr 28 19:16:23.039408 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.039128 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/375f0483-eb31-462b-859e-b59ffc509ba2-cni-binary-copy\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw"
Apr 28 19:16:23.039408 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.039153 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-run-ovn\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.039408 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.039170 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-log-socket\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.039408 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.039186 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9136f542-e29a-475e-9ef4-a5653b964224-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sbv79\" (UID: \"9136f542-e29a-475e-9ef4-a5653b964224\") " pod="openshift-multus/multus-additional-cni-plugins-sbv79"
Apr 28 19:16:23.039408 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.039231 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-system-cni-dir\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw"
Apr 28 19:16:23.039408 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.039247 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-etc-kubernetes\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw"
Apr 28 19:16:23.039408 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.039261 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f8cecbc4-1ccf-49b8-bd5f-a126bd910b04-konnectivity-ca\") pod \"konnectivity-agent-pkjxs\" (UID: \"f8cecbc4-1ccf-49b8-bd5f-a126bd910b04\") " pod="kube-system/konnectivity-agent-pkjxs"
Apr 28 19:16:23.039408 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.039275 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-host-kubelet\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.039408 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.039320 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/931659cd-b67a-416b-93b2-fc1c18e4a16e-lib-modules\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x"
Apr 28 19:16:23.039408 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.039335 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/931659cd-b67a-416b-93b2-fc1c18e4a16e-tmp\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x"
Apr 28 19:16:23.039408 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.039352 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrgw4\" (UniqueName: \"kubernetes.io/projected/9136f542-e29a-475e-9ef4-a5653b964224-kube-api-access-xrgw4\") pod \"multus-additional-cni-plugins-sbv79\" (UID: \"9136f542-e29a-475e-9ef4-a5653b964224\") " pod="openshift-multus/multus-additional-cni-plugins-sbv79"
Apr 28 19:16:23.039408 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.039396 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/931659cd-b67a-416b-93b2-fc1c18e4a16e-sys\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x"
Apr 28 19:16:23.039949 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.039427 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f5dbdaee-eba8-485f-88bf-e781ddd5de4d-serviceca\") pod \"node-ca-4clws\" (UID: \"f5dbdaee-eba8-485f-88bf-e781ddd5de4d\") " pod="openshift-image-registry/node-ca-4clws"
Apr 28 19:16:23.039949 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.039454 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-var-lib-openvswitch\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.039949 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.039473 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9136f542-e29a-475e-9ef4-a5653b964224-cnibin\") pod \"multus-additional-cni-plugins-sbv79\" (UID: \"9136f542-e29a-475e-9ef4-a5653b964224\") " pod="openshift-multus/multus-additional-cni-plugins-sbv79"
Apr 28 19:16:23.039949 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.039487 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-hostroot\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw"
Apr 28 19:16:23.039949 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.039545 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/37f93754-2c70-4bb8-bd31-698cb86a7f0d-host-slash\") pod \"iptables-alerter-vlrjx\" (UID: \"37f93754-2c70-4bb8-bd31-698cb86a7f0d\") " pod="openshift-network-operator/iptables-alerter-vlrjx"
Apr 28 19:16:23.039949 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.039591 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-run-systemd\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.039949 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.039614 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-env-overrides\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.039949 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.039749 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b85a0f23-a9f0-408e-bac2-1d9cb2494ca3-etc-selinux\") pod \"aws-ebs-csi-driver-node-cqhfh\" (UID: \"b85a0f23-a9f0-408e-bac2-1d9cb2494ca3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cqhfh"
Apr 28 19:16:23.039949 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.039781 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-os-release\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw"
Apr 28 19:16:23.039949 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.039808 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mbtg\" (UniqueName: \"kubernetes.io/projected/0a2d2c94-3731-45d6-be85-14a1a081468a-kube-api-access-2mbtg\") pod \"network-metrics-daemon-ggtvc\" (UID: \"0a2d2c94-3731-45d6-be85-14a1a081468a\") " pod="openshift-multus/network-metrics-daemon-ggtvc"
Apr 28 19:16:23.039949 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.039833 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-systemd-units\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.039949 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.039875 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-host-slash\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.039949 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.039909 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-etc-openvswitch\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.039949 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.039935 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-host-cni-bin\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.040551 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.039957 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-host-cni-netd\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.040551 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.039984 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/931659cd-b67a-416b-93b2-fc1c18e4a16e-etc-kubernetes\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x"
Apr 28 19:16:23.040551 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.040009 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-multus-conf-dir\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw"
Apr 28 19:16:23.040551 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.040050 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/931659cd-b67a-416b-93b2-fc1c18e4a16e-etc-systemd\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x"
Apr 28 19:16:23.040551 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.040073 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/931659cd-b67a-416b-93b2-fc1c18e4a16e-var-lib-kubelet\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x"
Apr 28 19:16:23.040551 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.040102 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqxcn\" (UniqueName: \"kubernetes.io/projected/931659cd-b67a-416b-93b2-fc1c18e4a16e-kube-api-access-nqxcn\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x"
Apr 28 19:16:23.040551 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.040125 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9136f542-e29a-475e-9ef4-a5653b964224-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sbv79\" (UID: \"9136f542-e29a-475e-9ef4-a5653b964224\") " pod="openshift-multus/multus-additional-cni-plugins-sbv79"
Apr 28 19:16:23.040551 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.040151 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-host-var-lib-cni-multus\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw"
Apr 28 19:16:23.040551 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.040192 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-host-run-multus-certs\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw"
Apr 28 19:16:23.040551 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.040251 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpl62\" (UniqueName: \"kubernetes.io/projected/d7e741f6-720c-42da-b861-0d9702bff94d-kube-api-access-cpl62\") pod \"network-check-target-t8qc5\" (UID: \"d7e741f6-720c-42da-b861-0d9702bff94d\") " pod="openshift-network-diagnostics/network-check-target-t8qc5"
Apr 28 19:16:23.040551 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.040274 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-ovnkube-config\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.040551 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.040291 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/931659cd-b67a-416b-93b2-fc1c18e4a16e-etc-tuned\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x"
Apr 28 19:16:23.040551 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.040313 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-host-run-k8s-cni-cncf-io\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw"
Apr 28 19:16:23.040551 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.040339 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a2d2c94-3731-45d6-be85-14a1a081468a-metrics-certs\") pod \"network-metrics-daemon-ggtvc\" (UID: \"0a2d2c94-3731-45d6-be85-14a1a081468a\") " pod="openshift-multus/network-metrics-daemon-ggtvc"
Apr 28 19:16:23.040551 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.040376 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-ovn-node-metrics-cert\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.040551 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.040404 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-host-run-netns\") pod \"multus-h56jw\" (UID:
\"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw" Apr 28 19:16:23.041182 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.040429 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7fzx\" (UniqueName: \"kubernetes.io/projected/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-kube-api-access-c7fzx\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh" Apr 28 19:16:23.041182 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.040454 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-host-var-lib-kubelet\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw" Apr 28 19:16:23.041182 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.040477 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhktr\" (UniqueName: \"kubernetes.io/projected/375f0483-eb31-462b-859e-b59ffc509ba2-kube-api-access-mhktr\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw" Apr 28 19:16:23.041182 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.040501 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-425fz\" (UniqueName: \"kubernetes.io/projected/f5dbdaee-eba8-485f-88bf-e781ddd5de4d-kube-api-access-425fz\") pod \"node-ca-4clws\" (UID: \"f5dbdaee-eba8-485f-88bf-e781ddd5de4d\") " pod="openshift-image-registry/node-ca-4clws" Apr 28 19:16:23.041182 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.040540 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/9136f542-e29a-475e-9ef4-a5653b964224-cni-binary-copy\") pod \"multus-additional-cni-plugins-sbv79\" (UID: \"9136f542-e29a-475e-9ef4-a5653b964224\") " pod="openshift-multus/multus-additional-cni-plugins-sbv79" Apr 28 19:16:23.041182 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.040591 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-cnibin\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw" Apr 28 19:16:23.041182 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.040628 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-multus-socket-dir-parent\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw" Apr 28 19:16:23.041182 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.040669 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl2sm\" (UniqueName: \"kubernetes.io/projected/37f93754-2c70-4bb8-bd31-698cb86a7f0d-kube-api-access-bl2sm\") pod \"iptables-alerter-vlrjx\" (UID: \"37f93754-2c70-4bb8-bd31-698cb86a7f0d\") " pod="openshift-network-operator/iptables-alerter-vlrjx" Apr 28 19:16:23.041182 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.040691 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-run-openvswitch\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh" Apr 28 19:16:23.041182 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.040712 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b85a0f23-a9f0-408e-bac2-1d9cb2494ca3-socket-dir\") pod \"aws-ebs-csi-driver-node-cqhfh\" (UID: \"b85a0f23-a9f0-408e-bac2-1d9cb2494ca3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cqhfh" Apr 28 19:16:23.041182 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.040736 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9136f542-e29a-475e-9ef4-a5653b964224-os-release\") pod \"multus-additional-cni-plugins-sbv79\" (UID: \"9136f542-e29a-475e-9ef4-a5653b964224\") " pod="openshift-multus/multus-additional-cni-plugins-sbv79" Apr 28 19:16:23.041182 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.040754 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-multus-cni-dir\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw" Apr 28 19:16:23.041182 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.040773 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-host-var-lib-cni-bin\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw" Apr 28 19:16:23.041182 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.040797 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/375f0483-eb31-462b-859e-b59ffc509ba2-multus-daemon-config\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " 
pod="openshift-multus/multus-h56jw" Apr 28 19:16:23.041182 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.040823 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-host-run-ovn-kubernetes\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh" Apr 28 19:16:23.041182 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.040844 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh" Apr 28 19:16:23.041933 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.040871 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/931659cd-b67a-416b-93b2-fc1c18e4a16e-etc-sysconfig\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x" Apr 28 19:16:23.041933 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.040891 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b85a0f23-a9f0-408e-bac2-1d9cb2494ca3-sys-fs\") pod \"aws-ebs-csi-driver-node-cqhfh\" (UID: \"b85a0f23-a9f0-408e-bac2-1d9cb2494ca3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cqhfh" Apr 28 19:16:23.041933 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.040922 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-gnz4z\" (UniqueName: \"kubernetes.io/projected/b85a0f23-a9f0-408e-bac2-1d9cb2494ca3-kube-api-access-gnz4z\") pod \"aws-ebs-csi-driver-node-cqhfh\" (UID: \"b85a0f23-a9f0-408e-bac2-1d9cb2494ca3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cqhfh" Apr 28 19:16:23.041933 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.040942 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/37f93754-2c70-4bb8-bd31-698cb86a7f0d-iptables-alerter-script\") pod \"iptables-alerter-vlrjx\" (UID: \"37f93754-2c70-4bb8-bd31-698cb86a7f0d\") " pod="openshift-network-operator/iptables-alerter-vlrjx" Apr 28 19:16:23.041933 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.040965 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-host-run-netns\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh" Apr 28 19:16:23.041933 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.040988 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/931659cd-b67a-416b-93b2-fc1c18e4a16e-run\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x" Apr 28 19:16:23.041933 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.041012 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/931659cd-b67a-416b-93b2-fc1c18e4a16e-host\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x" Apr 28 19:16:23.041933 ip-10-0-140-230 
kubenswrapper[2572]: I0428 19:16:23.041034 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b85a0f23-a9f0-408e-bac2-1d9cb2494ca3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cqhfh\" (UID: \"b85a0f23-a9f0-408e-bac2-1d9cb2494ca3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cqhfh" Apr 28 19:16:23.041933 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.041066 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b85a0f23-a9f0-408e-bac2-1d9cb2494ca3-registration-dir\") pod \"aws-ebs-csi-driver-node-cqhfh\" (UID: \"b85a0f23-a9f0-408e-bac2-1d9cb2494ca3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cqhfh" Apr 28 19:16:23.041933 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.041110 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b85a0f23-a9f0-408e-bac2-1d9cb2494ca3-device-dir\") pod \"aws-ebs-csi-driver-node-cqhfh\" (UID: \"b85a0f23-a9f0-408e-bac2-1d9cb2494ca3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cqhfh" Apr 28 19:16:23.041933 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.041141 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9136f542-e29a-475e-9ef4-a5653b964224-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-sbv79\" (UID: \"9136f542-e29a-475e-9ef4-a5653b964224\") " pod="openshift-multus/multus-additional-cni-plugins-sbv79" Apr 28 19:16:23.090422 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.090381 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-27 
19:11:22 +0000 UTC" deadline="2027-12-11 04:28:37.149131291 +0000 UTC" Apr 28 19:16:23.090422 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.090413 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14193h12m14.058721656s" Apr 28 19:16:23.141487 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.141458 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-host-slash\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh" Apr 28 19:16:23.141487 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.141488 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-etc-openvswitch\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh" Apr 28 19:16:23.141792 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.141512 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-host-cni-bin\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh" Apr 28 19:16:23.141792 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.141532 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-host-cni-netd\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh" Apr 28 19:16:23.141792 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.141554 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/931659cd-b67a-416b-93b2-fc1c18e4a16e-etc-kubernetes\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x" Apr 28 19:16:23.141792 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.141579 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-multus-conf-dir\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw" Apr 28 19:16:23.141792 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.141588 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-host-slash\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh" Apr 28 19:16:23.141792 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.141605 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/931659cd-b67a-416b-93b2-fc1c18e4a16e-etc-systemd\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x" Apr 28 19:16:23.141792 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.141612 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-host-cni-netd\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh" Apr 28 19:16:23.141792 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.141629 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/931659cd-b67a-416b-93b2-fc1c18e4a16e-var-lib-kubelet\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x" Apr 28 19:16:23.141792 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.141648 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-multus-conf-dir\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw" Apr 28 19:16:23.141792 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.141648 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/931659cd-b67a-416b-93b2-fc1c18e4a16e-etc-kubernetes\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x" Apr 28 19:16:23.141792 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.141671 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nqxcn\" (UniqueName: \"kubernetes.io/projected/931659cd-b67a-416b-93b2-fc1c18e4a16e-kube-api-access-nqxcn\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x" Apr 28 19:16:23.141792 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.141687 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/931659cd-b67a-416b-93b2-fc1c18e4a16e-etc-systemd\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x" Apr 28 19:16:23.141792 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.141696 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-etc-openvswitch\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh" Apr 28 19:16:23.141792 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.141713 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-host-cni-bin\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh" Apr 28 19:16:23.141792 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.141703 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9136f542-e29a-475e-9ef4-a5653b964224-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sbv79\" (UID: \"9136f542-e29a-475e-9ef4-a5653b964224\") " pod="openshift-multus/multus-additional-cni-plugins-sbv79" Apr 28 19:16:23.141792 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.141773 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/931659cd-b67a-416b-93b2-fc1c18e4a16e-var-lib-kubelet\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x" Apr 28 19:16:23.142479 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.141775 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-host-var-lib-cni-multus\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw" Apr 28 19:16:23.142602 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.142490 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-host-run-multus-certs\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw" Apr 28 19:16:23.142602 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.142519 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cpl62\" (UniqueName: \"kubernetes.io/projected/d7e741f6-720c-42da-b861-0d9702bff94d-kube-api-access-cpl62\") pod \"network-check-target-t8qc5\" (UID: \"d7e741f6-720c-42da-b861-0d9702bff94d\") " pod="openshift-network-diagnostics/network-check-target-t8qc5" Apr 28 19:16:23.142602 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.142422 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9136f542-e29a-475e-9ef4-a5653b964224-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sbv79\" (UID: \"9136f542-e29a-475e-9ef4-a5653b964224\") " pod="openshift-multus/multus-additional-cni-plugins-sbv79" Apr 28 19:16:23.142602 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.142543 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-ovnkube-config\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh" Apr 28 19:16:23.142602 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.141807 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-host-var-lib-cni-multus\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw" Apr 28 19:16:23.142602 
ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.142562 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-host-run-multus-certs\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw" Apr 28 19:16:23.142602 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.142569 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/931659cd-b67a-416b-93b2-fc1c18e4a16e-etc-tuned\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x" Apr 28 19:16:23.142914 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.142603 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-host-run-k8s-cni-cncf-io\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw" Apr 28 19:16:23.142914 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.142659 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a2d2c94-3731-45d6-be85-14a1a081468a-metrics-certs\") pod \"network-metrics-daemon-ggtvc\" (UID: \"0a2d2c94-3731-45d6-be85-14a1a081468a\") " pod="openshift-multus/network-metrics-daemon-ggtvc" Apr 28 19:16:23.142914 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.142685 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-ovn-node-metrics-cert\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh" Apr 28 
19:16:23.142914 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.142694 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-host-run-k8s-cni-cncf-io\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw" Apr 28 19:16:23.142914 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.142711 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-host-run-netns\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw" Apr 28 19:16:23.142914 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.142741 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7fzx\" (UniqueName: \"kubernetes.io/projected/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-kube-api-access-c7fzx\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh" Apr 28 19:16:23.142914 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.142765 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-host-var-lib-kubelet\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw" Apr 28 19:16:23.142914 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:23.142785 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:23.142914 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.142789 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhktr\" 
(UniqueName: \"kubernetes.io/projected/375f0483-eb31-462b-859e-b59ffc509ba2-kube-api-access-mhktr\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw" Apr 28 19:16:23.142914 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.142814 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-host-run-netns\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw" Apr 28 19:16:23.142914 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.142819 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 28 19:16:23.142914 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:23.142843 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a2d2c94-3731-45d6-be85-14a1a081468a-metrics-certs podName:0a2d2c94-3731-45d6-be85-14a1a081468a nodeName:}" failed. No retries permitted until 2026-04-28 19:16:23.642823258 +0000 UTC m=+3.070323870 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0a2d2c94-3731-45d6-be85-14a1a081468a-metrics-certs") pod "network-metrics-daemon-ggtvc" (UID: "0a2d2c94-3731-45d6-be85-14a1a081468a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:16:23.143296 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.143035 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-host-var-lib-kubelet\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw"
Apr 28 19:16:23.143296 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.142817 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-425fz\" (UniqueName: \"kubernetes.io/projected/f5dbdaee-eba8-485f-88bf-e781ddd5de4d-kube-api-access-425fz\") pod \"node-ca-4clws\" (UID: \"f5dbdaee-eba8-485f-88bf-e781ddd5de4d\") " pod="openshift-image-registry/node-ca-4clws"
Apr 28 19:16:23.143394 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.143334 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9136f542-e29a-475e-9ef4-a5653b964224-cni-binary-copy\") pod \"multus-additional-cni-plugins-sbv79\" (UID: \"9136f542-e29a-475e-9ef4-a5653b964224\") " pod="openshift-multus/multus-additional-cni-plugins-sbv79"
Apr 28 19:16:23.143394 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.143375 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-cnibin\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw"
Apr 28 19:16:23.143495 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.143402 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-multus-socket-dir-parent\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw"
Apr 28 19:16:23.143495 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.143430 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bl2sm\" (UniqueName: \"kubernetes.io/projected/37f93754-2c70-4bb8-bd31-698cb86a7f0d-kube-api-access-bl2sm\") pod \"iptables-alerter-vlrjx\" (UID: \"37f93754-2c70-4bb8-bd31-698cb86a7f0d\") " pod="openshift-network-operator/iptables-alerter-vlrjx"
Apr 28 19:16:23.143495 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.143450 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-ovnkube-config\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.143633 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.143508 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-cnibin\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw"
Apr 28 19:16:23.143633 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.143508 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-run-openvswitch\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.143633 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.143455 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-run-openvswitch\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.143633 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.143553 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b85a0f23-a9f0-408e-bac2-1d9cb2494ca3-socket-dir\") pod \"aws-ebs-csi-driver-node-cqhfh\" (UID: \"b85a0f23-a9f0-408e-bac2-1d9cb2494ca3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cqhfh"
Apr 28 19:16:23.143633 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.143582 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9136f542-e29a-475e-9ef4-a5653b964224-os-release\") pod \"multus-additional-cni-plugins-sbv79\" (UID: \"9136f542-e29a-475e-9ef4-a5653b964224\") " pod="openshift-multus/multus-additional-cni-plugins-sbv79"
Apr 28 19:16:23.143633 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.143606 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-multus-cni-dir\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw"
Apr 28 19:16:23.143633 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.143632 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-host-var-lib-cni-bin\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw"
Apr 28 19:16:23.143949 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.143658 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/375f0483-eb31-462b-859e-b59ffc509ba2-multus-daemon-config\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw"
Apr 28 19:16:23.143949 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.143684 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-host-run-ovn-kubernetes\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.143949 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.143711 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.143949 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.143737 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/931659cd-b67a-416b-93b2-fc1c18e4a16e-etc-sysconfig\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x"
Apr 28 19:16:23.143949 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.143792 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/931659cd-b67a-416b-93b2-fc1c18e4a16e-etc-sysconfig\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x"
Apr 28 19:16:23.143949 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.143807 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-multus-cni-dir\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw"
Apr 28 19:16:23.143949 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.143819 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9136f542-e29a-475e-9ef4-a5653b964224-cni-binary-copy\") pod \"multus-additional-cni-plugins-sbv79\" (UID: \"9136f542-e29a-475e-9ef4-a5653b964224\") " pod="openshift-multus/multus-additional-cni-plugins-sbv79"
Apr 28 19:16:23.143949 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.143835 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b85a0f23-a9f0-408e-bac2-1d9cb2494ca3-sys-fs\") pod \"aws-ebs-csi-driver-node-cqhfh\" (UID: \"b85a0f23-a9f0-408e-bac2-1d9cb2494ca3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cqhfh"
Apr 28 19:16:23.143949 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.143841 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-host-var-lib-cni-bin\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw"
Apr 28 19:16:23.143949 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.143876 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-host-run-ovn-kubernetes\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.143949 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.143885 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.143949 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.143915 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gnz4z\" (UniqueName: \"kubernetes.io/projected/b85a0f23-a9f0-408e-bac2-1d9cb2494ca3-kube-api-access-gnz4z\") pod \"aws-ebs-csi-driver-node-cqhfh\" (UID: \"b85a0f23-a9f0-408e-bac2-1d9cb2494ca3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cqhfh"
Apr 28 19:16:23.143949 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.143924 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b85a0f23-a9f0-408e-bac2-1d9cb2494ca3-sys-fs\") pod \"aws-ebs-csi-driver-node-cqhfh\" (UID: \"b85a0f23-a9f0-408e-bac2-1d9cb2494ca3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cqhfh"
Apr 28 19:16:23.143949 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.143927 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b85a0f23-a9f0-408e-bac2-1d9cb2494ca3-socket-dir\") pod \"aws-ebs-csi-driver-node-cqhfh\" (UID: \"b85a0f23-a9f0-408e-bac2-1d9cb2494ca3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cqhfh"
Apr 28 19:16:23.143949 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.143554 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-multus-socket-dir-parent\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw"
Apr 28 19:16:23.143949 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.143946 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/37f93754-2c70-4bb8-bd31-698cb86a7f0d-iptables-alerter-script\") pod \"iptables-alerter-vlrjx\" (UID: \"37f93754-2c70-4bb8-bd31-698cb86a7f0d\") " pod="openshift-network-operator/iptables-alerter-vlrjx"
Apr 28 19:16:23.144632 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.143970 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-host-run-netns\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.144632 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.143984 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9136f542-e29a-475e-9ef4-a5653b964224-os-release\") pod \"multus-additional-cni-plugins-sbv79\" (UID: \"9136f542-e29a-475e-9ef4-a5653b964224\") " pod="openshift-multus/multus-additional-cni-plugins-sbv79"
Apr 28 19:16:23.144632 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.143993 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/931659cd-b67a-416b-93b2-fc1c18e4a16e-run\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x"
Apr 28 19:16:23.144632 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144017 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/931659cd-b67a-416b-93b2-fc1c18e4a16e-host\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x"
Apr 28 19:16:23.144632 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144041 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b85a0f23-a9f0-408e-bac2-1d9cb2494ca3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cqhfh\" (UID: \"b85a0f23-a9f0-408e-bac2-1d9cb2494ca3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cqhfh"
Apr 28 19:16:23.144632 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144066 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b85a0f23-a9f0-408e-bac2-1d9cb2494ca3-registration-dir\") pod \"aws-ebs-csi-driver-node-cqhfh\" (UID: \"b85a0f23-a9f0-408e-bac2-1d9cb2494ca3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cqhfh"
Apr 28 19:16:23.144632 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144095 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b85a0f23-a9f0-408e-bac2-1d9cb2494ca3-device-dir\") pod \"aws-ebs-csi-driver-node-cqhfh\" (UID: \"b85a0f23-a9f0-408e-bac2-1d9cb2494ca3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cqhfh"
Apr 28 19:16:23.144632 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144121 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9136f542-e29a-475e-9ef4-a5653b964224-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-sbv79\" (UID: \"9136f542-e29a-475e-9ef4-a5653b964224\") " pod="openshift-multus/multus-additional-cni-plugins-sbv79"
Apr 28 19:16:23.144632 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144145 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-node-log\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.144632 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144169 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-ovnkube-script-lib\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.144632 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144192 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/931659cd-b67a-416b-93b2-fc1c18e4a16e-etc-modprobe-d\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x"
Apr 28 19:16:23.144632 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144230 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/931659cd-b67a-416b-93b2-fc1c18e4a16e-etc-sysctl-d\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x"
Apr 28 19:16:23.144632 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144254 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9136f542-e29a-475e-9ef4-a5653b964224-system-cni-dir\") pod \"multus-additional-cni-plugins-sbv79\" (UID: \"9136f542-e29a-475e-9ef4-a5653b964224\") " pod="openshift-multus/multus-additional-cni-plugins-sbv79"
Apr 28 19:16:23.144632 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144280 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f8cecbc4-1ccf-49b8-bd5f-a126bd910b04-agent-certs\") pod \"konnectivity-agent-pkjxs\" (UID: \"f8cecbc4-1ccf-49b8-bd5f-a126bd910b04\") " pod="kube-system/konnectivity-agent-pkjxs"
Apr 28 19:16:23.144632 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144306 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/931659cd-b67a-416b-93b2-fc1c18e4a16e-etc-sysctl-conf\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x"
Apr 28 19:16:23.144632 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144332 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5dbdaee-eba8-485f-88bf-e781ddd5de4d-host\") pod \"node-ca-4clws\" (UID: \"f5dbdaee-eba8-485f-88bf-e781ddd5de4d\") " pod="openshift-image-registry/node-ca-4clws"
Apr 28 19:16:23.144632 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144355 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/375f0483-eb31-462b-859e-b59ffc509ba2-cni-binary-copy\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw"
Apr 28 19:16:23.145413 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144378 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-run-ovn\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.145413 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144401 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-log-socket\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.145413 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144425 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9136f542-e29a-475e-9ef4-a5653b964224-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sbv79\" (UID: \"9136f542-e29a-475e-9ef4-a5653b964224\") " pod="openshift-multus/multus-additional-cni-plugins-sbv79"
Apr 28 19:16:23.145413 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144455 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-system-cni-dir\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw"
Apr 28 19:16:23.145413 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144459 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/375f0483-eb31-462b-859e-b59ffc509ba2-multus-daemon-config\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw"
Apr 28 19:16:23.145413 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144459 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/37f93754-2c70-4bb8-bd31-698cb86a7f0d-iptables-alerter-script\") pod \"iptables-alerter-vlrjx\" (UID: \"37f93754-2c70-4bb8-bd31-698cb86a7f0d\") " pod="openshift-network-operator/iptables-alerter-vlrjx"
Apr 28 19:16:23.145413 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144482 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-etc-kubernetes\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw"
Apr 28 19:16:23.145413 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144511 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f8cecbc4-1ccf-49b8-bd5f-a126bd910b04-konnectivity-ca\") pod \"konnectivity-agent-pkjxs\" (UID: \"f8cecbc4-1ccf-49b8-bd5f-a126bd910b04\") " pod="kube-system/konnectivity-agent-pkjxs"
Apr 28 19:16:23.145413 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144536 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-host-kubelet\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.145413 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144561 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/931659cd-b67a-416b-93b2-fc1c18e4a16e-lib-modules\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x"
Apr 28 19:16:23.145413 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144582 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/931659cd-b67a-416b-93b2-fc1c18e4a16e-etc-modprobe-d\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x"
Apr 28 19:16:23.145413 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144586 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/931659cd-b67a-416b-93b2-fc1c18e4a16e-tmp\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x"
Apr 28 19:16:23.145413 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144635 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrgw4\" (UniqueName: \"kubernetes.io/projected/9136f542-e29a-475e-9ef4-a5653b964224-kube-api-access-xrgw4\") pod \"multus-additional-cni-plugins-sbv79\" (UID: \"9136f542-e29a-475e-9ef4-a5653b964224\") " pod="openshift-multus/multus-additional-cni-plugins-sbv79"
Apr 28 19:16:23.145413 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144665 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-run-ovn\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.145413 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144704 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-host-run-netns\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.145413 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144720 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/931659cd-b67a-416b-93b2-fc1c18e4a16e-run\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x"
Apr 28 19:16:23.145413 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144761 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/931659cd-b67a-416b-93b2-fc1c18e4a16e-host\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x"
Apr 28 19:16:23.145413 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144804 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b85a0f23-a9f0-408e-bac2-1d9cb2494ca3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cqhfh\" (UID: \"b85a0f23-a9f0-408e-bac2-1d9cb2494ca3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cqhfh"
Apr 28 19:16:23.146269 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144805 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/931659cd-b67a-416b-93b2-fc1c18e4a16e-etc-sysctl-d\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x"
Apr 28 19:16:23.146269 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144839 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9136f542-e29a-475e-9ef4-a5653b964224-system-cni-dir\") pod \"multus-additional-cni-plugins-sbv79\" (UID: \"9136f542-e29a-475e-9ef4-a5653b964224\") " pod="openshift-multus/multus-additional-cni-plugins-sbv79"
Apr 28 19:16:23.146269 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144859 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b85a0f23-a9f0-408e-bac2-1d9cb2494ca3-registration-dir\") pod \"aws-ebs-csi-driver-node-cqhfh\" (UID: \"b85a0f23-a9f0-408e-bac2-1d9cb2494ca3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cqhfh"
Apr 28 19:16:23.146269 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144902 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b85a0f23-a9f0-408e-bac2-1d9cb2494ca3-device-dir\") pod \"aws-ebs-csi-driver-node-cqhfh\" (UID: \"b85a0f23-a9f0-408e-bac2-1d9cb2494ca3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cqhfh"
Apr 28 19:16:23.146269 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144958 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/931659cd-b67a-416b-93b2-fc1c18e4a16e-etc-sysctl-conf\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x"
Apr 28 19:16:23.146269 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.145411 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-log-socket\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.146269 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.145523 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-node-log\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.146269 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.144669 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/931659cd-b67a-416b-93b2-fc1c18e4a16e-sys\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x"
Apr 28 19:16:23.146269 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.145568 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9136f542-e29a-475e-9ef4-a5653b964224-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sbv79\" (UID: \"9136f542-e29a-475e-9ef4-a5653b964224\") " pod="openshift-multus/multus-additional-cni-plugins-sbv79"
Apr 28 19:16:23.146269 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.145618 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5dbdaee-eba8-485f-88bf-e781ddd5de4d-host\") pod \"node-ca-4clws\" (UID: \"f5dbdaee-eba8-485f-88bf-e781ddd5de4d\") " pod="openshift-image-registry/node-ca-4clws"
Apr 28 19:16:23.146269 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.145766 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-ovnkube-script-lib\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.146269 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.145826 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/931659cd-b67a-416b-93b2-fc1c18e4a16e-sys\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x"
Apr 28 19:16:23.146269 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.145912 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/931659cd-b67a-416b-93b2-fc1c18e4a16e-lib-modules\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x"
Apr 28 19:16:23.146269 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.145920 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f5dbdaee-eba8-485f-88bf-e781ddd5de4d-serviceca\") pod \"node-ca-4clws\" (UID: \"f5dbdaee-eba8-485f-88bf-e781ddd5de4d\") " pod="openshift-image-registry/node-ca-4clws"
Apr 28 19:16:23.146269 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.145570 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f5dbdaee-eba8-485f-88bf-e781ddd5de4d-serviceca\") pod \"node-ca-4clws\" (UID: \"f5dbdaee-eba8-485f-88bf-e781ddd5de4d\") " pod="openshift-image-registry/node-ca-4clws"
Apr 28 19:16:23.146269 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.145955 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-host-kubelet\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.146269 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.145977 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-var-lib-openvswitch\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.146269 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.146004 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9136f542-e29a-475e-9ef4-a5653b964224-cnibin\") pod \"multus-additional-cni-plugins-sbv79\" (UID: \"9136f542-e29a-475e-9ef4-a5653b964224\") " pod="openshift-multus/multus-additional-cni-plugins-sbv79"
Apr 28 19:16:23.147086 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.146015 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-system-cni-dir\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw"
Apr 28 19:16:23.147086 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.146032 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-hostroot\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw"
Apr 28 19:16:23.147086 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.146058 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/37f93754-2c70-4bb8-bd31-698cb86a7f0d-host-slash\") pod \"iptables-alerter-vlrjx\" (UID: \"37f93754-2c70-4bb8-bd31-698cb86a7f0d\") " pod="openshift-network-operator/iptables-alerter-vlrjx"
Apr 28 19:16:23.147086 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.146084 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-run-systemd\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.147086 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.146092 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9136f542-e29a-475e-9ef4-a5653b964224-cnibin\") pod \"multus-additional-cni-plugins-sbv79\" (UID: \"9136f542-e29a-475e-9ef4-a5653b964224\") " pod="openshift-multus/multus-additional-cni-plugins-sbv79"
Apr 28 19:16:23.147086 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.146111 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-env-overrides\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.147086 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.146124 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f8cecbc4-1ccf-49b8-bd5f-a126bd910b04-konnectivity-ca\") pod \"konnectivity-agent-pkjxs\" (UID: \"f8cecbc4-1ccf-49b8-bd5f-a126bd910b04\") " pod="kube-system/konnectivity-agent-pkjxs"
Apr 28 19:16:23.147086 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.146136 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b85a0f23-a9f0-408e-bac2-1d9cb2494ca3-etc-selinux\") pod \"aws-ebs-csi-driver-node-cqhfh\" (UID: \"b85a0f23-a9f0-408e-bac2-1d9cb2494ca3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cqhfh"
Apr 28 19:16:23.147086 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.146059 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-etc-kubernetes\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw"
Apr 28 19:16:23.147086 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.146176 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-hostroot\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw"
Apr 28 19:16:23.147086 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.146182 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-os-release\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw"
Apr 28 19:16:23.147086 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.146189 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-run-systemd\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.147086 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.146126 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-var-lib-openvswitch\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:23.147086 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.146224 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mbtg\" (UniqueName: \"kubernetes.io/projected/0a2d2c94-3731-45d6-be85-14a1a081468a-kube-api-access-2mbtg\") pod \"network-metrics-daemon-ggtvc\" (UID: \"0a2d2c94-3731-45d6-be85-14a1a081468a\") " pod="openshift-multus/network-metrics-daemon-ggtvc"
Apr 28 19:16:23.147086 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.146244 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/37f93754-2c70-4bb8-bd31-698cb86a7f0d-host-slash\") pod \"iptables-alerter-vlrjx\" (UID: \"37f93754-2c70-4bb8-bd31-698cb86a7f0d\") " pod="openshift-network-operator/iptables-alerter-vlrjx"
Apr 28 19:16:23.147086 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.146253 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for
volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-systemd-units\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh" Apr 28 19:16:23.147086 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.146296 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b85a0f23-a9f0-408e-bac2-1d9cb2494ca3-etc-selinux\") pod \"aws-ebs-csi-driver-node-cqhfh\" (UID: \"b85a0f23-a9f0-408e-bac2-1d9cb2494ca3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cqhfh" Apr 28 19:16:23.147086 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.146336 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-systemd-units\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh" Apr 28 19:16:23.147797 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.146358 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/931659cd-b67a-416b-93b2-fc1c18e4a16e-etc-tuned\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x" Apr 28 19:16:23.147797 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.146403 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/375f0483-eb31-462b-859e-b59ffc509ba2-os-release\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw" Apr 28 19:16:23.147797 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.146617 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/375f0483-eb31-462b-859e-b59ffc509ba2-cni-binary-copy\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw" Apr 28 19:16:23.147797 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.146634 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-env-overrides\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh" Apr 28 19:16:23.147797 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.146678 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-ovn-node-metrics-cert\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh" Apr 28 19:16:23.147797 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.146792 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9136f542-e29a-475e-9ef4-a5653b964224-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-sbv79\" (UID: \"9136f542-e29a-475e-9ef4-a5653b964224\") " pod="openshift-multus/multus-additional-cni-plugins-sbv79" Apr 28 19:16:23.148362 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.148340 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/931659cd-b67a-416b-93b2-fc1c18e4a16e-tmp\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x" Apr 28 19:16:23.148726 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.148706 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/f8cecbc4-1ccf-49b8-bd5f-a126bd910b04-agent-certs\") pod \"konnectivity-agent-pkjxs\" (UID: \"f8cecbc4-1ccf-49b8-bd5f-a126bd910b04\") " pod="kube-system/konnectivity-agent-pkjxs" Apr 28 19:16:23.152940 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.152918 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqxcn\" (UniqueName: \"kubernetes.io/projected/931659cd-b67a-416b-93b2-fc1c18e4a16e-kube-api-access-nqxcn\") pod \"tuned-v498x\" (UID: \"931659cd-b67a-416b-93b2-fc1c18e4a16e\") " pod="openshift-cluster-node-tuning-operator/tuned-v498x" Apr 28 19:16:23.160978 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.160957 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7fzx\" (UniqueName: \"kubernetes.io/projected/4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11-kube-api-access-c7fzx\") pod \"ovnkube-node-wv7qh\" (UID: \"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11\") " pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh" Apr 28 19:16:23.162007 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:23.161799 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:16:23.163093 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:23.163024 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:16:23.163093 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.162927 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-425fz\" (UniqueName: \"kubernetes.io/projected/f5dbdaee-eba8-485f-88bf-e781ddd5de4d-kube-api-access-425fz\") pod \"node-ca-4clws\" (UID: \"f5dbdaee-eba8-485f-88bf-e781ddd5de4d\") " pod="openshift-image-registry/node-ca-4clws" Apr 28 19:16:23.163093 ip-10-0-140-230 kubenswrapper[2572]: I0428 
19:16:23.163061 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl2sm\" (UniqueName: \"kubernetes.io/projected/37f93754-2c70-4bb8-bd31-698cb86a7f0d-kube-api-access-bl2sm\") pod \"iptables-alerter-vlrjx\" (UID: \"37f93754-2c70-4bb8-bd31-698cb86a7f0d\") " pod="openshift-network-operator/iptables-alerter-vlrjx" Apr 28 19:16:23.163268 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:23.163050 2572 projected.go:194] Error preparing data for projected volume kube-api-access-cpl62 for pod openshift-network-diagnostics/network-check-target-t8qc5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:23.163268 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:23.163181 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d7e741f6-720c-42da-b861-0d9702bff94d-kube-api-access-cpl62 podName:d7e741f6-720c-42da-b861-0d9702bff94d nodeName:}" failed. No retries permitted until 2026-04-28 19:16:23.663161732 +0000 UTC m=+3.090662326 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cpl62" (UniqueName: "kubernetes.io/projected/d7e741f6-720c-42da-b861-0d9702bff94d-kube-api-access-cpl62") pod "network-check-target-t8qc5" (UID: "d7e741f6-720c-42da-b861-0d9702bff94d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:23.163679 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.163641 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrgw4\" (UniqueName: \"kubernetes.io/projected/9136f542-e29a-475e-9ef4-a5653b964224-kube-api-access-xrgw4\") pod \"multus-additional-cni-plugins-sbv79\" (UID: \"9136f542-e29a-475e-9ef4-a5653b964224\") " pod="openshift-multus/multus-additional-cni-plugins-sbv79" Apr 28 19:16:23.164076 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.164048 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnz4z\" (UniqueName: \"kubernetes.io/projected/b85a0f23-a9f0-408e-bac2-1d9cb2494ca3-kube-api-access-gnz4z\") pod \"aws-ebs-csi-driver-node-cqhfh\" (UID: \"b85a0f23-a9f0-408e-bac2-1d9cb2494ca3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cqhfh" Apr 28 19:16:23.164745 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.164727 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhktr\" (UniqueName: \"kubernetes.io/projected/375f0483-eb31-462b-859e-b59ffc509ba2-kube-api-access-mhktr\") pod \"multus-h56jw\" (UID: \"375f0483-eb31-462b-859e-b59ffc509ba2\") " pod="openshift-multus/multus-h56jw" Apr 28 19:16:23.165108 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.165093 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mbtg\" (UniqueName: \"kubernetes.io/projected/0a2d2c94-3731-45d6-be85-14a1a081468a-kube-api-access-2mbtg\") pod \"network-metrics-daemon-ggtvc\" (UID: 
\"0a2d2c94-3731-45d6-be85-14a1a081468a\") " pod="openshift-multus/network-metrics-daemon-ggtvc" Apr 28 19:16:23.326358 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.326316 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-pkjxs" Apr 28 19:16:23.333286 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.333261 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cqhfh" Apr 28 19:16:23.341968 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.341947 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4clws" Apr 28 19:16:23.348299 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.348277 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-sbv79" Apr 28 19:16:23.358914 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.358889 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-h56jw" Apr 28 19:16:23.365588 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.365562 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-vlrjx" Apr 28 19:16:23.372135 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.372118 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-v498x" Apr 28 19:16:23.377695 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.377675 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh" Apr 28 19:16:23.651231 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.651185 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a2d2c94-3731-45d6-be85-14a1a081468a-metrics-certs\") pod \"network-metrics-daemon-ggtvc\" (UID: \"0a2d2c94-3731-45d6-be85-14a1a081468a\") " pod="openshift-multus/network-metrics-daemon-ggtvc" Apr 28 19:16:23.651375 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:23.651318 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:23.651448 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:23.651391 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a2d2c94-3731-45d6-be85-14a1a081468a-metrics-certs podName:0a2d2c94-3731-45d6-be85-14a1a081468a nodeName:}" failed. No retries permitted until 2026-04-28 19:16:24.65137207 +0000 UTC m=+4.078872687 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0a2d2c94-3731-45d6-be85-14a1a081468a-metrics-certs") pod "network-metrics-daemon-ggtvc" (UID: "0a2d2c94-3731-45d6-be85-14a1a081468a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:23.719792 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:23.719612 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37f93754_2c70_4bb8_bd31_698cb86a7f0d.slice/crio-9eb0ab62f4cc40a68b1b4cc27c67a8818d5cc4f63da8699bd3eb46128cc5b1c6 WatchSource:0}: Error finding container 9eb0ab62f4cc40a68b1b4cc27c67a8818d5cc4f63da8699bd3eb46128cc5b1c6: Status 404 returned error can't find the container with id 9eb0ab62f4cc40a68b1b4cc27c67a8818d5cc4f63da8699bd3eb46128cc5b1c6 Apr 28 19:16:23.720588 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:23.720567 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9136f542_e29a_475e_9ef4_a5653b964224.slice/crio-c48e58f612a957be518509efba4fe5572e2a818588110719ce987ee17a3f1d68 WatchSource:0}: Error finding container c48e58f612a957be518509efba4fe5572e2a818588110719ce987ee17a3f1d68: Status 404 returned error can't find the container with id c48e58f612a957be518509efba4fe5572e2a818588110719ce987ee17a3f1d68 Apr 28 19:16:23.722766 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:23.721840 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8cecbc4_1ccf_49b8_bd5f_a126bd910b04.slice/crio-53b8cf3e1d4d7c52115f89890e0ac59fcae894c1a75b20325d662473c7bcfc12 WatchSource:0}: Error finding container 53b8cf3e1d4d7c52115f89890e0ac59fcae894c1a75b20325d662473c7bcfc12: Status 404 returned error can't find the container with id 53b8cf3e1d4d7c52115f89890e0ac59fcae894c1a75b20325d662473c7bcfc12 Apr 28 19:16:23.724505 
ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:23.724486 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod931659cd_b67a_416b_93b2_fc1c18e4a16e.slice/crio-fd51ff613f063dd525a94832834428df051374a7bd5eff4b826e1c8a5f2ba9b0 WatchSource:0}: Error finding container fd51ff613f063dd525a94832834428df051374a7bd5eff4b826e1c8a5f2ba9b0: Status 404 returned error can't find the container with id fd51ff613f063dd525a94832834428df051374a7bd5eff4b826e1c8a5f2ba9b0 Apr 28 19:16:23.725924 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:23.725835 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb85a0f23_a9f0_408e_bac2_1d9cb2494ca3.slice/crio-38eccad0930f6ac6b32f2f162819ce0410e64ffaab11942d05e23093eebc3728 WatchSource:0}: Error finding container 38eccad0930f6ac6b32f2f162819ce0410e64ffaab11942d05e23093eebc3728: Status 404 returned error can't find the container with id 38eccad0930f6ac6b32f2f162819ce0410e64ffaab11942d05e23093eebc3728 Apr 28 19:16:23.726955 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:23.726905 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5dbdaee_eba8_485f_88bf_e781ddd5de4d.slice/crio-a98d267169362f2888418bfc79c257dff3097081fe525bc81d53ef2c31018686 WatchSource:0}: Error finding container a98d267169362f2888418bfc79c257dff3097081fe525bc81d53ef2c31018686: Status 404 returned error can't find the container with id a98d267169362f2888418bfc79c257dff3097081fe525bc81d53ef2c31018686 Apr 28 19:16:23.728512 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:23.728483 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b96f8c8_4344_49c6_8d8e_fdd7b4cf9a11.slice/crio-270edc2a10173f5c3a71b7a1a2d94ad6ab88c9b4b797fd7a614c208fdce60582 WatchSource:0}: 
Error finding container 270edc2a10173f5c3a71b7a1a2d94ad6ab88c9b4b797fd7a614c208fdce60582: Status 404 returned error can't find the container with id 270edc2a10173f5c3a71b7a1a2d94ad6ab88c9b4b797fd7a614c208fdce60582 Apr 28 19:16:23.729482 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:23.729458 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod375f0483_eb31_462b_859e_b59ffc509ba2.slice/crio-425934737b7de29334049a7919a45f3c7d6c56c8b03c249e84cff6c36afd198f WatchSource:0}: Error finding container 425934737b7de29334049a7919a45f3c7d6c56c8b03c249e84cff6c36afd198f: Status 404 returned error can't find the container with id 425934737b7de29334049a7919a45f3c7d6c56c8b03c249e84cff6c36afd198f Apr 28 19:16:23.752567 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:23.752533 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cpl62\" (UniqueName: \"kubernetes.io/projected/d7e741f6-720c-42da-b861-0d9702bff94d-kube-api-access-cpl62\") pod \"network-check-target-t8qc5\" (UID: \"d7e741f6-720c-42da-b861-0d9702bff94d\") " pod="openshift-network-diagnostics/network-check-target-t8qc5" Apr 28 19:16:23.752676 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:23.752659 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:16:23.752729 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:23.752680 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:16:23.752729 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:23.752690 2572 projected.go:194] Error preparing data for projected volume kube-api-access-cpl62 for pod openshift-network-diagnostics/network-check-target-t8qc5: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:23.752790 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:23.752735 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d7e741f6-720c-42da-b861-0d9702bff94d-kube-api-access-cpl62 podName:d7e741f6-720c-42da-b861-0d9702bff94d nodeName:}" failed. No retries permitted until 2026-04-28 19:16:24.752720489 +0000 UTC m=+4.180221084 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cpl62" (UniqueName: "kubernetes.io/projected/d7e741f6-720c-42da-b861-0d9702bff94d-kube-api-access-cpl62") pod "network-check-target-t8qc5" (UID: "d7e741f6-720c-42da-b861-0d9702bff94d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:24.091443 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:24.091294 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-27 19:11:22 +0000 UTC" deadline="2028-01-12 12:49:35.294998586 +0000 UTC" Apr 28 19:16:24.091443 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:24.091335 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14969h33m11.203667114s" Apr 28 19:16:24.175453 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:24.175414 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-230.ec2.internal" event={"ID":"77fe4b4b44e63783dd24b8c6f6bea437","Type":"ContainerStarted","Data":"f544f724030ee17912ed6a76ffd112a33e7b79e6d00cb3e962f7e84fd7ee3d30"} Apr 28 19:16:24.184057 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:24.184019 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h56jw" 
event={"ID":"375f0483-eb31-462b-859e-b59ffc509ba2","Type":"ContainerStarted","Data":"425934737b7de29334049a7919a45f3c7d6c56c8b03c249e84cff6c36afd198f"} Apr 28 19:16:24.191633 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:24.191573 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4clws" event={"ID":"f5dbdaee-eba8-485f-88bf-e781ddd5de4d","Type":"ContainerStarted","Data":"a98d267169362f2888418bfc79c257dff3097081fe525bc81d53ef2c31018686"} Apr 28 19:16:24.198338 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:24.198306 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cqhfh" event={"ID":"b85a0f23-a9f0-408e-bac2-1d9cb2494ca3","Type":"ContainerStarted","Data":"38eccad0930f6ac6b32f2f162819ce0410e64ffaab11942d05e23093eebc3728"} Apr 28 19:16:24.199638 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:24.199613 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-vlrjx" event={"ID":"37f93754-2c70-4bb8-bd31-698cb86a7f0d","Type":"ContainerStarted","Data":"9eb0ab62f4cc40a68b1b4cc27c67a8818d5cc4f63da8699bd3eb46128cc5b1c6"} Apr 28 19:16:24.203177 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:24.203151 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh" event={"ID":"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11","Type":"ContainerStarted","Data":"270edc2a10173f5c3a71b7a1a2d94ad6ab88c9b4b797fd7a614c208fdce60582"} Apr 28 19:16:24.207964 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:24.207935 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-v498x" event={"ID":"931659cd-b67a-416b-93b2-fc1c18e4a16e","Type":"ContainerStarted","Data":"fd51ff613f063dd525a94832834428df051374a7bd5eff4b826e1c8a5f2ba9b0"} Apr 28 19:16:24.212724 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:24.212696 2572 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kube-system/konnectivity-agent-pkjxs" event={"ID":"f8cecbc4-1ccf-49b8-bd5f-a126bd910b04","Type":"ContainerStarted","Data":"53b8cf3e1d4d7c52115f89890e0ac59fcae894c1a75b20325d662473c7bcfc12"} Apr 28 19:16:24.216811 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:24.216787 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sbv79" event={"ID":"9136f542-e29a-475e-9ef4-a5653b964224","Type":"ContainerStarted","Data":"c48e58f612a957be518509efba4fe5572e2a818588110719ce987ee17a3f1d68"} Apr 28 19:16:24.660651 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:24.660614 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a2d2c94-3731-45d6-be85-14a1a081468a-metrics-certs\") pod \"network-metrics-daemon-ggtvc\" (UID: \"0a2d2c94-3731-45d6-be85-14a1a081468a\") " pod="openshift-multus/network-metrics-daemon-ggtvc" Apr 28 19:16:24.660844 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:24.660781 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:24.660907 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:24.660846 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a2d2c94-3731-45d6-be85-14a1a081468a-metrics-certs podName:0a2d2c94-3731-45d6-be85-14a1a081468a nodeName:}" failed. No retries permitted until 2026-04-28 19:16:26.660827179 +0000 UTC m=+6.088327774 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0a2d2c94-3731-45d6-be85-14a1a081468a-metrics-certs") pod "network-metrics-daemon-ggtvc" (UID: "0a2d2c94-3731-45d6-be85-14a1a081468a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:24.762519 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:24.761840 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cpl62\" (UniqueName: \"kubernetes.io/projected/d7e741f6-720c-42da-b861-0d9702bff94d-kube-api-access-cpl62\") pod \"network-check-target-t8qc5\" (UID: \"d7e741f6-720c-42da-b861-0d9702bff94d\") " pod="openshift-network-diagnostics/network-check-target-t8qc5" Apr 28 19:16:24.762519 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:24.762010 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:16:24.762519 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:24.762032 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:16:24.762519 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:24.762045 2572 projected.go:194] Error preparing data for projected volume kube-api-access-cpl62 for pod openshift-network-diagnostics/network-check-target-t8qc5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:24.762519 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:24.762102 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d7e741f6-720c-42da-b861-0d9702bff94d-kube-api-access-cpl62 podName:d7e741f6-720c-42da-b861-0d9702bff94d nodeName:}" failed. 
No retries permitted until 2026-04-28 19:16:26.762084047 +0000 UTC m=+6.189584655 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cpl62" (UniqueName: "kubernetes.io/projected/d7e741f6-720c-42da-b861-0d9702bff94d-kube-api-access-cpl62") pod "network-check-target-t8qc5" (UID: "d7e741f6-720c-42da-b861-0d9702bff94d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:16:24.792156 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:24.791899 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-230.ec2.internal" podStartSLOduration=2.791878527 podStartE2EDuration="2.791878527s" podCreationTimestamp="2026-04-28 19:16:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:16:24.194637968 +0000 UTC m=+3.622138586" watchObservedRunningTime="2026-04-28 19:16:24.791878527 +0000 UTC m=+4.219379149"
Apr 28 19:16:24.792345 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:24.792276 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-fvgjn"]
Apr 28 19:16:24.795396 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:24.795244 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fvgjn"
Apr 28 19:16:24.795396 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:24.795320 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fvgjn" podUID="f4fd0bdf-e317-4621-b4cf-c41c8e666b62"
Apr 28 19:16:24.862929 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:24.862892 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f4fd0bdf-e317-4621-b4cf-c41c8e666b62-dbus\") pod \"global-pull-secret-syncer-fvgjn\" (UID: \"f4fd0bdf-e317-4621-b4cf-c41c8e666b62\") " pod="kube-system/global-pull-secret-syncer-fvgjn"
Apr 28 19:16:24.863114 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:24.862948 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f4fd0bdf-e317-4621-b4cf-c41c8e666b62-kubelet-config\") pod \"global-pull-secret-syncer-fvgjn\" (UID: \"f4fd0bdf-e317-4621-b4cf-c41c8e666b62\") " pod="kube-system/global-pull-secret-syncer-fvgjn"
Apr 28 19:16:24.863114 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:24.863052 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f4fd0bdf-e317-4621-b4cf-c41c8e666b62-original-pull-secret\") pod \"global-pull-secret-syncer-fvgjn\" (UID: \"f4fd0bdf-e317-4621-b4cf-c41c8e666b62\") " pod="kube-system/global-pull-secret-syncer-fvgjn"
Apr 28 19:16:24.964184 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:24.964142 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f4fd0bdf-e317-4621-b4cf-c41c8e666b62-dbus\") pod \"global-pull-secret-syncer-fvgjn\" (UID: \"f4fd0bdf-e317-4621-b4cf-c41c8e666b62\") " pod="kube-system/global-pull-secret-syncer-fvgjn"
Apr 28 19:16:24.964391 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:24.964217 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f4fd0bdf-e317-4621-b4cf-c41c8e666b62-kubelet-config\") pod \"global-pull-secret-syncer-fvgjn\" (UID: \"f4fd0bdf-e317-4621-b4cf-c41c8e666b62\") " pod="kube-system/global-pull-secret-syncer-fvgjn"
Apr 28 19:16:24.964391 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:24.964298 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f4fd0bdf-e317-4621-b4cf-c41c8e666b62-original-pull-secret\") pod \"global-pull-secret-syncer-fvgjn\" (UID: \"f4fd0bdf-e317-4621-b4cf-c41c8e666b62\") " pod="kube-system/global-pull-secret-syncer-fvgjn"
Apr 28 19:16:24.964508 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:24.964425 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 28 19:16:24.964508 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:24.964487 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4fd0bdf-e317-4621-b4cf-c41c8e666b62-original-pull-secret podName:f4fd0bdf-e317-4621-b4cf-c41c8e666b62 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:25.464469423 +0000 UTC m=+4.891970020 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f4fd0bdf-e317-4621-b4cf-c41c8e666b62-original-pull-secret") pod "global-pull-secret-syncer-fvgjn" (UID: "f4fd0bdf-e317-4621-b4cf-c41c8e666b62") : object "kube-system"/"original-pull-secret" not registered
Apr 28 19:16:24.964866 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:24.964843 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f4fd0bdf-e317-4621-b4cf-c41c8e666b62-dbus\") pod \"global-pull-secret-syncer-fvgjn\" (UID: \"f4fd0bdf-e317-4621-b4cf-c41c8e666b62\") " pod="kube-system/global-pull-secret-syncer-fvgjn"
Apr 28 19:16:24.964953 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:24.964919 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f4fd0bdf-e317-4621-b4cf-c41c8e666b62-kubelet-config\") pod \"global-pull-secret-syncer-fvgjn\" (UID: \"f4fd0bdf-e317-4621-b4cf-c41c8e666b62\") " pod="kube-system/global-pull-secret-syncer-fvgjn"
Apr 28 19:16:25.169600 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:25.169522 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t8qc5"
Apr 28 19:16:25.170130 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:25.169651 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t8qc5" podUID="d7e741f6-720c-42da-b861-0d9702bff94d"
Apr 28 19:16:25.170130 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:25.169735 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ggtvc"
Apr 28 19:16:25.170130 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:25.169854 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ggtvc" podUID="0a2d2c94-3731-45d6-be85-14a1a081468a"
Apr 28 19:16:25.228235 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:25.228165 2572 generic.go:358] "Generic (PLEG): container finished" podID="12ae3e9fbde7a26e5bd78fbf1749835b" containerID="c09c59419c53858a5fcd8615ed56274cedcf78e1054acf93d3d6fe5460666a68" exitCode=0
Apr 28 19:16:25.228414 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:25.228332 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-230.ec2.internal" event={"ID":"12ae3e9fbde7a26e5bd78fbf1749835b","Type":"ContainerDied","Data":"c09c59419c53858a5fcd8615ed56274cedcf78e1054acf93d3d6fe5460666a68"}
Apr 28 19:16:25.469328 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:25.469159 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f4fd0bdf-e317-4621-b4cf-c41c8e666b62-original-pull-secret\") pod \"global-pull-secret-syncer-fvgjn\" (UID: \"f4fd0bdf-e317-4621-b4cf-c41c8e666b62\") " pod="kube-system/global-pull-secret-syncer-fvgjn"
Apr 28 19:16:25.469476 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:25.469346 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 28 19:16:25.469476 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:25.469411 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4fd0bdf-e317-4621-b4cf-c41c8e666b62-original-pull-secret podName:f4fd0bdf-e317-4621-b4cf-c41c8e666b62 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:26.469393538 +0000 UTC m=+5.896894156 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f4fd0bdf-e317-4621-b4cf-c41c8e666b62-original-pull-secret") pod "global-pull-secret-syncer-fvgjn" (UID: "f4fd0bdf-e317-4621-b4cf-c41c8e666b62") : object "kube-system"/"original-pull-secret" not registered
Apr 28 19:16:26.163481 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:26.163444 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fvgjn"
Apr 28 19:16:26.163673 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:26.163580 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fvgjn" podUID="f4fd0bdf-e317-4621-b4cf-c41c8e666b62"
Apr 28 19:16:26.234530 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:26.233868 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-230.ec2.internal" event={"ID":"12ae3e9fbde7a26e5bd78fbf1749835b","Type":"ContainerStarted","Data":"696d47b7d43863f5c672e9bc9f13099feb789106beb2ebd5a31de06d9db4514f"}
Apr 28 19:16:26.247475 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:26.247269 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-230.ec2.internal" podStartSLOduration=4.24725255 podStartE2EDuration="4.24725255s" podCreationTimestamp="2026-04-28 19:16:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:16:26.24713011 +0000 UTC m=+5.674630729" watchObservedRunningTime="2026-04-28 19:16:26.24725255 +0000 UTC m=+5.674753169"
Apr 28 19:16:26.478012 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:26.477919 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f4fd0bdf-e317-4621-b4cf-c41c8e666b62-original-pull-secret\") pod \"global-pull-secret-syncer-fvgjn\" (UID: \"f4fd0bdf-e317-4621-b4cf-c41c8e666b62\") " pod="kube-system/global-pull-secret-syncer-fvgjn"
Apr 28 19:16:26.478163 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:26.478110 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 28 19:16:26.478235 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:26.478172 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4fd0bdf-e317-4621-b4cf-c41c8e666b62-original-pull-secret podName:f4fd0bdf-e317-4621-b4cf-c41c8e666b62 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:28.478153653 +0000 UTC m=+7.905654247 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f4fd0bdf-e317-4621-b4cf-c41c8e666b62-original-pull-secret") pod "global-pull-secret-syncer-fvgjn" (UID: "f4fd0bdf-e317-4621-b4cf-c41c8e666b62") : object "kube-system"/"original-pull-secret" not registered
Apr 28 19:16:26.679577 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:26.679513 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a2d2c94-3731-45d6-be85-14a1a081468a-metrics-certs\") pod \"network-metrics-daemon-ggtvc\" (UID: \"0a2d2c94-3731-45d6-be85-14a1a081468a\") " pod="openshift-multus/network-metrics-daemon-ggtvc"
Apr 28 19:16:26.679760 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:26.679706 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:16:26.679841 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:26.679772 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a2d2c94-3731-45d6-be85-14a1a081468a-metrics-certs podName:0a2d2c94-3731-45d6-be85-14a1a081468a nodeName:}" failed. No retries permitted until 2026-04-28 19:16:30.67975267 +0000 UTC m=+10.107253281 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0a2d2c94-3731-45d6-be85-14a1a081468a-metrics-certs") pod "network-metrics-daemon-ggtvc" (UID: "0a2d2c94-3731-45d6-be85-14a1a081468a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:16:26.779996 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:26.779903 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cpl62\" (UniqueName: \"kubernetes.io/projected/d7e741f6-720c-42da-b861-0d9702bff94d-kube-api-access-cpl62\") pod \"network-check-target-t8qc5\" (UID: \"d7e741f6-720c-42da-b861-0d9702bff94d\") " pod="openshift-network-diagnostics/network-check-target-t8qc5"
Apr 28 19:16:26.780177 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:26.780161 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 28 19:16:26.780275 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:26.780184 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 28 19:16:26.780275 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:26.780213 2572 projected.go:194] Error preparing data for projected volume kube-api-access-cpl62 for pod openshift-network-diagnostics/network-check-target-t8qc5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:16:26.780391 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:26.780280 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d7e741f6-720c-42da-b861-0d9702bff94d-kube-api-access-cpl62 podName:d7e741f6-720c-42da-b861-0d9702bff94d nodeName:}" failed. No retries permitted until 2026-04-28 19:16:30.780261128 +0000 UTC m=+10.207761729 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cpl62" (UniqueName: "kubernetes.io/projected/d7e741f6-720c-42da-b861-0d9702bff94d-kube-api-access-cpl62") pod "network-check-target-t8qc5" (UID: "d7e741f6-720c-42da-b861-0d9702bff94d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:16:27.167008 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:27.166973 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t8qc5"
Apr 28 19:16:27.167217 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:27.167087 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t8qc5" podUID="d7e741f6-720c-42da-b861-0d9702bff94d"
Apr 28 19:16:27.167397 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:27.167372 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ggtvc"
Apr 28 19:16:27.167763 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:27.167493 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ggtvc" podUID="0a2d2c94-3731-45d6-be85-14a1a081468a"
Apr 28 19:16:28.163456 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:28.163418 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fvgjn"
Apr 28 19:16:28.163931 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:28.163564 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fvgjn" podUID="f4fd0bdf-e317-4621-b4cf-c41c8e666b62"
Apr 28 19:16:28.496758 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:28.496628 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f4fd0bdf-e317-4621-b4cf-c41c8e666b62-original-pull-secret\") pod \"global-pull-secret-syncer-fvgjn\" (UID: \"f4fd0bdf-e317-4621-b4cf-c41c8e666b62\") " pod="kube-system/global-pull-secret-syncer-fvgjn"
Apr 28 19:16:28.496921 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:28.496791 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 28 19:16:28.496921 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:28.496876 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4fd0bdf-e317-4621-b4cf-c41c8e666b62-original-pull-secret podName:f4fd0bdf-e317-4621-b4cf-c41c8e666b62 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:32.496854925 +0000 UTC m=+11.924355523 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f4fd0bdf-e317-4621-b4cf-c41c8e666b62-original-pull-secret") pod "global-pull-secret-syncer-fvgjn" (UID: "f4fd0bdf-e317-4621-b4cf-c41c8e666b62") : object "kube-system"/"original-pull-secret" not registered
Apr 28 19:16:29.163055 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:29.163016 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ggtvc"
Apr 28 19:16:29.163277 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:29.163016 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t8qc5"
Apr 28 19:16:29.163277 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:29.163165 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ggtvc" podUID="0a2d2c94-3731-45d6-be85-14a1a081468a"
Apr 28 19:16:29.163277 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:29.163212 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t8qc5" podUID="d7e741f6-720c-42da-b861-0d9702bff94d"
Apr 28 19:16:30.163503 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:30.163458 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fvgjn"
Apr 28 19:16:30.163999 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:30.163596 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fvgjn" podUID="f4fd0bdf-e317-4621-b4cf-c41c8e666b62"
Apr 28 19:16:30.717140 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:30.717100 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a2d2c94-3731-45d6-be85-14a1a081468a-metrics-certs\") pod \"network-metrics-daemon-ggtvc\" (UID: \"0a2d2c94-3731-45d6-be85-14a1a081468a\") " pod="openshift-multus/network-metrics-daemon-ggtvc"
Apr 28 19:16:30.717330 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:30.717259 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:16:30.717391 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:30.717343 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a2d2c94-3731-45d6-be85-14a1a081468a-metrics-certs podName:0a2d2c94-3731-45d6-be85-14a1a081468a nodeName:}" failed. No retries permitted until 2026-04-28 19:16:38.717320449 +0000 UTC m=+18.144821052 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0a2d2c94-3731-45d6-be85-14a1a081468a-metrics-certs") pod "network-metrics-daemon-ggtvc" (UID: "0a2d2c94-3731-45d6-be85-14a1a081468a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:16:30.819991 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:30.819937 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cpl62\" (UniqueName: \"kubernetes.io/projected/d7e741f6-720c-42da-b861-0d9702bff94d-kube-api-access-cpl62\") pod \"network-check-target-t8qc5\" (UID: \"d7e741f6-720c-42da-b861-0d9702bff94d\") " pod="openshift-network-diagnostics/network-check-target-t8qc5"
Apr 28 19:16:30.820255 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:30.820231 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 28 19:16:30.820370 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:30.820259 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 28 19:16:30.820370 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:30.820274 2572 projected.go:194] Error preparing data for projected volume kube-api-access-cpl62 for pod openshift-network-diagnostics/network-check-target-t8qc5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:16:30.820370 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:30.820346 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d7e741f6-720c-42da-b861-0d9702bff94d-kube-api-access-cpl62 podName:d7e741f6-720c-42da-b861-0d9702bff94d nodeName:}" failed. No retries permitted until 2026-04-28 19:16:38.820325614 +0000 UTC m=+18.247826216 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cpl62" (UniqueName: "kubernetes.io/projected/d7e741f6-720c-42da-b861-0d9702bff94d-kube-api-access-cpl62") pod "network-check-target-t8qc5" (UID: "d7e741f6-720c-42da-b861-0d9702bff94d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:16:31.164080 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:31.164041 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ggtvc"
Apr 28 19:16:31.164556 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:31.164171 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ggtvc" podUID="0a2d2c94-3731-45d6-be85-14a1a081468a"
Apr 28 19:16:31.164556 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:31.164232 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t8qc5"
Apr 28 19:16:31.164556 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:31.164345 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t8qc5" podUID="d7e741f6-720c-42da-b861-0d9702bff94d"
Apr 28 19:16:32.163738 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:32.163697 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fvgjn"
Apr 28 19:16:32.163906 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:32.163825 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fvgjn" podUID="f4fd0bdf-e317-4621-b4cf-c41c8e666b62"
Apr 28 19:16:32.534486 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:32.534389 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f4fd0bdf-e317-4621-b4cf-c41c8e666b62-original-pull-secret\") pod \"global-pull-secret-syncer-fvgjn\" (UID: \"f4fd0bdf-e317-4621-b4cf-c41c8e666b62\") " pod="kube-system/global-pull-secret-syncer-fvgjn"
Apr 28 19:16:32.534919 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:32.534560 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 28 19:16:32.534919 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:32.534643 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4fd0bdf-e317-4621-b4cf-c41c8e666b62-original-pull-secret podName:f4fd0bdf-e317-4621-b4cf-c41c8e666b62 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:40.534620324 +0000 UTC m=+19.962120922 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f4fd0bdf-e317-4621-b4cf-c41c8e666b62-original-pull-secret") pod "global-pull-secret-syncer-fvgjn" (UID: "f4fd0bdf-e317-4621-b4cf-c41c8e666b62") : object "kube-system"/"original-pull-secret" not registered
Apr 28 19:16:33.163722 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:33.163679 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t8qc5"
Apr 28 19:16:33.163890 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:33.163811 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t8qc5" podUID="d7e741f6-720c-42da-b861-0d9702bff94d"
Apr 28 19:16:33.163890 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:33.163868 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ggtvc"
Apr 28 19:16:33.164012 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:33.163989 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ggtvc" podUID="0a2d2c94-3731-45d6-be85-14a1a081468a"
Apr 28 19:16:34.163023 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:34.162984 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fvgjn"
Apr 28 19:16:34.163490 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:34.163120 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fvgjn" podUID="f4fd0bdf-e317-4621-b4cf-c41c8e666b62"
Apr 28 19:16:35.162886 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:35.162840 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ggtvc"
Apr 28 19:16:35.163070 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:35.162840 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t8qc5"
Apr 28 19:16:35.163070 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:35.162976 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ggtvc" podUID="0a2d2c94-3731-45d6-be85-14a1a081468a"
Apr 28 19:16:35.163070 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:35.163016 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t8qc5" podUID="d7e741f6-720c-42da-b861-0d9702bff94d"
Apr 28 19:16:36.162754 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:36.162713 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fvgjn"
Apr 28 19:16:36.162943 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:36.162846 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fvgjn" podUID="f4fd0bdf-e317-4621-b4cf-c41c8e666b62"
Apr 28 19:16:37.032018 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:37.031987 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-9rm7p"]
Apr 28 19:16:37.060798 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:37.060765 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9rm7p"
Apr 28 19:16:37.065251 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:37.065229 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 28 19:16:37.065591 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:37.065568 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 28 19:16:37.065715 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:37.065618 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-pvc6r\""
Apr 28 19:16:37.162946 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:37.162878 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ggtvc"
Apr 28 19:16:37.163134 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:37.163031 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ggtvc" podUID="0a2d2c94-3731-45d6-be85-14a1a081468a"
Apr 28 19:16:37.163278 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:37.163256 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t8qc5"
Apr 28 19:16:37.163390 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:37.163368 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t8qc5" podUID="d7e741f6-720c-42da-b861-0d9702bff94d"
Apr 28 19:16:37.168701 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:37.168677 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eccda02a-118f-4d0d-858f-6d03050f92de-tmp-dir\") pod \"node-resolver-9rm7p\" (UID: \"eccda02a-118f-4d0d-858f-6d03050f92de\") " pod="openshift-dns/node-resolver-9rm7p"
Apr 28 19:16:37.168850 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:37.168737 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/eccda02a-118f-4d0d-858f-6d03050f92de-hosts-file\") pod \"node-resolver-9rm7p\" (UID: \"eccda02a-118f-4d0d-858f-6d03050f92de\") " pod="openshift-dns/node-resolver-9rm7p"
Apr 28 19:16:37.168850 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:37.168775 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54lr4\" (UniqueName: \"kubernetes.io/projected/eccda02a-118f-4d0d-858f-6d03050f92de-kube-api-access-54lr4\") pod \"node-resolver-9rm7p\" (UID: \"eccda02a-118f-4d0d-858f-6d03050f92de\") " pod="openshift-dns/node-resolver-9rm7p"
Apr 28 19:16:37.269706 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:37.269663 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-54lr4\" (UniqueName: \"kubernetes.io/projected/eccda02a-118f-4d0d-858f-6d03050f92de-kube-api-access-54lr4\") pod \"node-resolver-9rm7p\" (UID: \"eccda02a-118f-4d0d-858f-6d03050f92de\") " pod="openshift-dns/node-resolver-9rm7p"
Apr 28 19:16:37.269887 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:37.269764 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eccda02a-118f-4d0d-858f-6d03050f92de-tmp-dir\") pod \"node-resolver-9rm7p\" (UID: \"eccda02a-118f-4d0d-858f-6d03050f92de\") " pod="openshift-dns/node-resolver-9rm7p"
Apr 28 19:16:37.269887 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:37.269803 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/eccda02a-118f-4d0d-858f-6d03050f92de-hosts-file\") pod \"node-resolver-9rm7p\" (UID: \"eccda02a-118f-4d0d-858f-6d03050f92de\") " pod="openshift-dns/node-resolver-9rm7p"
Apr 28 19:16:37.270005 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:37.269888 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/eccda02a-118f-4d0d-858f-6d03050f92de-hosts-file\") pod \"node-resolver-9rm7p\" (UID: \"eccda02a-118f-4d0d-858f-6d03050f92de\") " pod="openshift-dns/node-resolver-9rm7p"
Apr 28 19:16:37.270219 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:37.270177 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eccda02a-118f-4d0d-858f-6d03050f92de-tmp-dir\") pod \"node-resolver-9rm7p\" (UID: \"eccda02a-118f-4d0d-858f-6d03050f92de\") " pod="openshift-dns/node-resolver-9rm7p"
Apr 28 19:16:37.281570 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:37.281538 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-54lr4\" (UniqueName: \"kubernetes.io/projected/eccda02a-118f-4d0d-858f-6d03050f92de-kube-api-access-54lr4\") pod \"node-resolver-9rm7p\" (UID: \"eccda02a-118f-4d0d-858f-6d03050f92de\") " pod="openshift-dns/node-resolver-9rm7p"
Apr 28 19:16:37.370590 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:37.370547 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9rm7p"
Apr 28 19:16:38.162997 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:38.162963 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fvgjn"
Apr 28 19:16:38.163441 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:38.163074 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fvgjn" podUID="f4fd0bdf-e317-4621-b4cf-c41c8e666b62"
Apr 28 19:16:38.782304 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:38.782258 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a2d2c94-3731-45d6-be85-14a1a081468a-metrics-certs\") pod \"network-metrics-daemon-ggtvc\" (UID: \"0a2d2c94-3731-45d6-be85-14a1a081468a\") " pod="openshift-multus/network-metrics-daemon-ggtvc"
Apr 28 19:16:38.782556 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:38.782403 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:16:38.782556 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:38.782473 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a2d2c94-3731-45d6-be85-14a1a081468a-metrics-certs podName:0a2d2c94-3731-45d6-be85-14a1a081468a nodeName:}" failed. No retries permitted until 2026-04-28 19:16:54.782453457 +0000 UTC m=+34.209954057 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0a2d2c94-3731-45d6-be85-14a1a081468a-metrics-certs") pod "network-metrics-daemon-ggtvc" (UID: "0a2d2c94-3731-45d6-be85-14a1a081468a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:38.883218 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:38.883165 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cpl62\" (UniqueName: \"kubernetes.io/projected/d7e741f6-720c-42da-b861-0d9702bff94d-kube-api-access-cpl62\") pod \"network-check-target-t8qc5\" (UID: \"d7e741f6-720c-42da-b861-0d9702bff94d\") " pod="openshift-network-diagnostics/network-check-target-t8qc5" Apr 28 19:16:38.883411 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:38.883333 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:16:38.883411 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:38.883350 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:16:38.883411 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:38.883359 2572 projected.go:194] Error preparing data for projected volume kube-api-access-cpl62 for pod openshift-network-diagnostics/network-check-target-t8qc5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:38.883563 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:38.883414 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d7e741f6-720c-42da-b861-0d9702bff94d-kube-api-access-cpl62 podName:d7e741f6-720c-42da-b861-0d9702bff94d nodeName:}" failed. 
No retries permitted until 2026-04-28 19:16:54.883395937 +0000 UTC m=+34.310896541 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cpl62" (UniqueName: "kubernetes.io/projected/d7e741f6-720c-42da-b861-0d9702bff94d-kube-api-access-cpl62") pod "network-check-target-t8qc5" (UID: "d7e741f6-720c-42da-b861-0d9702bff94d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:39.163713 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:39.163670 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ggtvc" Apr 28 19:16:39.164182 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:39.163670 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t8qc5" Apr 28 19:16:39.164182 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:39.163842 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ggtvc" podUID="0a2d2c94-3731-45d6-be85-14a1a081468a" Apr 28 19:16:39.164182 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:39.163979 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-t8qc5" podUID="d7e741f6-720c-42da-b861-0d9702bff94d" Apr 28 19:16:40.163144 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:40.163113 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fvgjn" Apr 28 19:16:40.163383 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:40.163232 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fvgjn" podUID="f4fd0bdf-e317-4621-b4cf-c41c8e666b62" Apr 28 19:16:40.386475 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:40.386447 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeccda02a_118f_4d0d_858f_6d03050f92de.slice/crio-60a2df0aebf75d9628e623d87648eb7a3e0b2b23450854ffdc2489751c05c33f WatchSource:0}: Error finding container 60a2df0aebf75d9628e623d87648eb7a3e0b2b23450854ffdc2489751c05c33f: Status 404 returned error can't find the container with id 60a2df0aebf75d9628e623d87648eb7a3e0b2b23450854ffdc2489751c05c33f Apr 28 19:16:40.595683 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:40.595647 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f4fd0bdf-e317-4621-b4cf-c41c8e666b62-original-pull-secret\") pod \"global-pull-secret-syncer-fvgjn\" (UID: \"f4fd0bdf-e317-4621-b4cf-c41c8e666b62\") " pod="kube-system/global-pull-secret-syncer-fvgjn" Apr 28 19:16:40.595795 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:40.595779 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 28 
19:16:40.595862 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:40.595837 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4fd0bdf-e317-4621-b4cf-c41c8e666b62-original-pull-secret podName:f4fd0bdf-e317-4621-b4cf-c41c8e666b62 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:56.595819143 +0000 UTC m=+36.023319756 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f4fd0bdf-e317-4621-b4cf-c41c8e666b62-original-pull-secret") pod "global-pull-secret-syncer-fvgjn" (UID: "f4fd0bdf-e317-4621-b4cf-c41c8e666b62") : object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:41.164398 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:41.164153 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ggtvc" Apr 28 19:16:41.164565 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:41.164248 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t8qc5" Apr 28 19:16:41.164565 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:41.164476 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ggtvc" podUID="0a2d2c94-3731-45d6-be85-14a1a081468a" Apr 28 19:16:41.164565 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:41.164538 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-t8qc5" podUID="d7e741f6-720c-42da-b861-0d9702bff94d" Apr 28 19:16:41.260987 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:41.260894 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-pkjxs" event={"ID":"f8cecbc4-1ccf-49b8-bd5f-a126bd910b04","Type":"ContainerStarted","Data":"85cd46ac723b61aaa3db530983b3f3b65bbc6e40e3377e97c99b81598d3fb2ff"} Apr 28 19:16:41.262800 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:41.262768 2572 generic.go:358] "Generic (PLEG): container finished" podID="9136f542-e29a-475e-9ef4-a5653b964224" containerID="bcce240cf23d515de4c56c3d1b7cf2e0b027fac80ed0d82ae3f8325367f2f60d" exitCode=0 Apr 28 19:16:41.262912 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:41.262844 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sbv79" event={"ID":"9136f542-e29a-475e-9ef4-a5653b964224","Type":"ContainerDied","Data":"bcce240cf23d515de4c56c3d1b7cf2e0b027fac80ed0d82ae3f8325367f2f60d"} Apr 28 19:16:41.264375 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:41.264316 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h56jw" event={"ID":"375f0483-eb31-462b-859e-b59ffc509ba2","Type":"ContainerStarted","Data":"3d72d9cf28ede4fbfb2e5547f231b265f4924b152b0baac65a2ba508fed5dc9f"} Apr 28 19:16:41.265841 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:41.265782 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4clws" event={"ID":"f5dbdaee-eba8-485f-88bf-e781ddd5de4d","Type":"ContainerStarted","Data":"3d79952c58f947e24dc696a75f8ba8c66808516df70eea5ce203ee556bd1ba3c"} Apr 28 19:16:41.267134 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:41.267115 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cqhfh" 
event={"ID":"b85a0f23-a9f0-408e-bac2-1d9cb2494ca3","Type":"ContainerStarted","Data":"2f8ae7b08da847e362c65dfd66d3de7e57336dbb94c3c7b19d86807e43b03972"} Apr 28 19:16:41.268484 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:41.268459 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9rm7p" event={"ID":"eccda02a-118f-4d0d-858f-6d03050f92de","Type":"ContainerStarted","Data":"6ae2c0dc3247e456b60a31d15d6031c67f672af14d6f1822a73b976a2f53e4db"} Apr 28 19:16:41.268564 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:41.268487 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9rm7p" event={"ID":"eccda02a-118f-4d0d-858f-6d03050f92de","Type":"ContainerStarted","Data":"60a2df0aebf75d9628e623d87648eb7a3e0b2b23450854ffdc2489751c05c33f"} Apr 28 19:16:41.271177 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:41.271156 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh" event={"ID":"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11","Type":"ContainerStarted","Data":"8d3a5e2ef202c39ee8a7eb22ca010ad009653aa0a1b2c016947d45de0cdd0c1c"} Apr 28 19:16:41.271275 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:41.271181 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh" event={"ID":"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11","Type":"ContainerStarted","Data":"4180f4d04cd45a70b9f3011b7ecf0aa0e20f86453613dc51721290835760ba42"} Apr 28 19:16:41.271275 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:41.271190 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh" event={"ID":"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11","Type":"ContainerStarted","Data":"246c2404503266f2c3e2a0e4cc7b36e7ebdad98a8542a6453ea95ae895a5b3f4"} Apr 28 19:16:41.271275 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:41.271219 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh" event={"ID":"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11","Type":"ContainerStarted","Data":"5503fe0f592c001617584d46a5f8aa2bf9403050a4c9a3d58f1dbb0011c0f416"} Apr 28 19:16:41.271275 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:41.271232 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh" event={"ID":"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11","Type":"ContainerStarted","Data":"636b6dba01385eecbaae8b55e3524175c79002b0d970e68dd15b733ca29e151f"} Apr 28 19:16:41.271275 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:41.271244 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh" event={"ID":"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11","Type":"ContainerStarted","Data":"07880e70b1e762f108b3a7f3d87897e0af4280c5fe697832f442156ca38c67bb"} Apr 28 19:16:41.272489 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:41.272464 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-v498x" event={"ID":"931659cd-b67a-416b-93b2-fc1c18e4a16e","Type":"ContainerStarted","Data":"8ca9b3f53aafe7fe4645c4c5ddb7af59392e76a2e3fac75f2ed95df4a0bb7d88"} Apr 28 19:16:41.300475 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:41.300426 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-pkjxs" podStartSLOduration=3.655775336 podStartE2EDuration="20.300410767s" podCreationTimestamp="2026-04-28 19:16:21 +0000 UTC" firstStartedPulling="2026-04-28 19:16:23.724526474 +0000 UTC m=+3.152027069" lastFinishedPulling="2026-04-28 19:16:40.369161892 +0000 UTC m=+19.796662500" observedRunningTime="2026-04-28 19:16:41.280409462 +0000 UTC m=+20.707910081" watchObservedRunningTime="2026-04-28 19:16:41.300410767 +0000 UTC m=+20.727911365" Apr 28 19:16:41.300633 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:41.300609 2572 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-image-registry/node-ca-4clws" podStartSLOduration=8.288987852 podStartE2EDuration="20.300603928s" podCreationTimestamp="2026-04-28 19:16:21 +0000 UTC" firstStartedPulling="2026-04-28 19:16:23.729187103 +0000 UTC m=+3.156687699" lastFinishedPulling="2026-04-28 19:16:35.740803167 +0000 UTC m=+15.168303775" observedRunningTime="2026-04-28 19:16:41.30055286 +0000 UTC m=+20.728053477" watchObservedRunningTime="2026-04-28 19:16:41.300603928 +0000 UTC m=+20.728104566" Apr 28 19:16:41.321128 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:41.321077 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-v498x" podStartSLOduration=3.6642901979999998 podStartE2EDuration="20.321060258s" podCreationTimestamp="2026-04-28 19:16:21 +0000 UTC" firstStartedPulling="2026-04-28 19:16:23.726509983 +0000 UTC m=+3.154010579" lastFinishedPulling="2026-04-28 19:16:40.383280039 +0000 UTC m=+19.810780639" observedRunningTime="2026-04-28 19:16:41.320611972 +0000 UTC m=+20.748112590" watchObservedRunningTime="2026-04-28 19:16:41.321060258 +0000 UTC m=+20.748560858" Apr 28 19:16:41.365542 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:41.365481 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-h56jw" podStartSLOduration=3.67875618 podStartE2EDuration="20.365465383s" podCreationTimestamp="2026-04-28 19:16:21 +0000 UTC" firstStartedPulling="2026-04-28 19:16:23.731342341 +0000 UTC m=+3.158842936" lastFinishedPulling="2026-04-28 19:16:40.41805154 +0000 UTC m=+19.845552139" observedRunningTime="2026-04-28 19:16:41.365189157 +0000 UTC m=+20.792689775" watchObservedRunningTime="2026-04-28 19:16:41.365465383 +0000 UTC m=+20.792966001" Apr 28 19:16:41.386728 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:41.386668 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9rm7p" 
podStartSLOduration=4.386648866 podStartE2EDuration="4.386648866s" podCreationTimestamp="2026-04-28 19:16:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:16:41.386540198 +0000 UTC m=+20.814040815" watchObservedRunningTime="2026-04-28 19:16:41.386648866 +0000 UTC m=+20.814149484" Apr 28 19:16:41.692846 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:41.692671 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 28 19:16:42.104076 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:42.103980 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-28T19:16:41.692841506Z","UUID":"d6026ebd-2094-44a6-b2e9-cbb92c7ce333","Handler":null,"Name":"","Endpoint":""} Apr 28 19:16:42.105925 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:42.105899 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 28 19:16:42.106072 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:42.105932 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 28 19:16:42.163591 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:42.163551 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-fvgjn" Apr 28 19:16:42.163767 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:42.163697 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fvgjn" podUID="f4fd0bdf-e317-4621-b4cf-c41c8e666b62" Apr 28 19:16:42.276512 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:42.276461 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cqhfh" event={"ID":"b85a0f23-a9f0-408e-bac2-1d9cb2494ca3","Type":"ContainerStarted","Data":"840988c9c043fe129b4ab690eaad1201ecc68dd0dd7883d1b27dc41df7edc1ac"} Apr 28 19:16:42.278814 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:42.278757 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-vlrjx" event={"ID":"37f93754-2c70-4bb8-bd31-698cb86a7f0d","Type":"ContainerStarted","Data":"c3fc2f5feb93153869f5024eadcb24ee7a157a709f11d600d1c213d8a5068de2"} Apr 28 19:16:42.297612 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:42.297554 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-vlrjx" podStartSLOduration=4.638501543 podStartE2EDuration="21.29753584s" podCreationTimestamp="2026-04-28 19:16:21 +0000 UTC" firstStartedPulling="2026-04-28 19:16:23.722024589 +0000 UTC m=+3.149525203" lastFinishedPulling="2026-04-28 19:16:40.381058906 +0000 UTC m=+19.808559500" observedRunningTime="2026-04-28 19:16:42.297482469 +0000 UTC m=+21.724983087" watchObservedRunningTime="2026-04-28 19:16:42.29753584 +0000 UTC m=+21.725036458" Apr 28 19:16:43.163599 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:43.163565 2572 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ggtvc" Apr 28 19:16:43.164095 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:43.163570 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t8qc5" Apr 28 19:16:43.164095 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:43.163702 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ggtvc" podUID="0a2d2c94-3731-45d6-be85-14a1a081468a" Apr 28 19:16:43.164095 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:43.163750 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-t8qc5" podUID="d7e741f6-720c-42da-b861-0d9702bff94d" Apr 28 19:16:43.284157 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:43.284109 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh" event={"ID":"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11","Type":"ContainerStarted","Data":"73f684dda4785eaca15849843fe4310472b34277d1a2fcf32635694df8f68de6"} Apr 28 19:16:43.286107 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:43.286077 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cqhfh" event={"ID":"b85a0f23-a9f0-408e-bac2-1d9cb2494ca3","Type":"ContainerStarted","Data":"3270e941085cb95a326dda0fd81af95843593fadf3b64a5882992594ffddc680"} Apr 28 19:16:43.307816 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:43.307755 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cqhfh" podStartSLOduration=3.382977564 podStartE2EDuration="22.30773383s" podCreationTimestamp="2026-04-28 19:16:21 +0000 UTC" firstStartedPulling="2026-04-28 19:16:23.727903343 +0000 UTC m=+3.155403946" lastFinishedPulling="2026-04-28 19:16:42.652659606 +0000 UTC m=+22.080160212" observedRunningTime="2026-04-28 19:16:43.306938029 +0000 UTC m=+22.734438646" watchObservedRunningTime="2026-04-28 19:16:43.30773383 +0000 UTC m=+22.735234449" Apr 28 19:16:43.461903 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:43.461863 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-pkjxs" Apr 28 19:16:43.462516 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:43.462490 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-pkjxs" Apr 28 19:16:44.163361 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:44.163330 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-fvgjn" Apr 28 19:16:44.163539 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:44.163444 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fvgjn" podUID="f4fd0bdf-e317-4621-b4cf-c41c8e666b62" Apr 28 19:16:44.288403 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:44.288368 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-pkjxs" Apr 28 19:16:44.289049 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:44.288807 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-pkjxs" Apr 28 19:16:45.163604 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:45.163569 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ggtvc" Apr 28 19:16:45.163773 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:45.163620 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t8qc5" Apr 28 19:16:45.163842 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:45.163755 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ggtvc" podUID="0a2d2c94-3731-45d6-be85-14a1a081468a" Apr 28 19:16:45.163947 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:45.163924 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t8qc5" podUID="d7e741f6-720c-42da-b861-0d9702bff94d" Apr 28 19:16:45.294170 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:45.293804 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh" event={"ID":"4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11","Type":"ContainerStarted","Data":"ac571b808d4370b7fb52840097734a6075450fbd1ef1fd9082528c7ac0753587"} Apr 28 19:16:45.294170 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:45.294075 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh" Apr 28 19:16:45.294170 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:45.294096 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh" Apr 28 19:16:45.294170 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:45.294109 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh" Apr 28 19:16:45.296005 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:45.295978 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sbv79" event={"ID":"9136f542-e29a-475e-9ef4-a5653b964224","Type":"ContainerStarted","Data":"a2e8b8cb9c68664c960a62da1192172630f2ab9b879e73df8b5ef6ffeeb6d16a"} Apr 28 19:16:45.311385 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:45.311355 2572 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:45.311511 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:45.311465 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh"
Apr 28 19:16:45.326270 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:45.326222 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh" podStartSLOduration=7.43347964 podStartE2EDuration="24.326192629s" podCreationTimestamp="2026-04-28 19:16:21 +0000 UTC" firstStartedPulling="2026-04-28 19:16:23.73098449 +0000 UTC m=+3.158485086" lastFinishedPulling="2026-04-28 19:16:40.623697463 +0000 UTC m=+20.051198075" observedRunningTime="2026-04-28 19:16:45.325901897 +0000 UTC m=+24.753402514" watchObservedRunningTime="2026-04-28 19:16:45.326192629 +0000 UTC m=+24.753693288"
Apr 28 19:16:46.163656 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:46.163613 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fvgjn"
Apr 28 19:16:46.163862 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:46.163736 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fvgjn" podUID="f4fd0bdf-e317-4621-b4cf-c41c8e666b62"
Apr 28 19:16:46.298677 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:46.298641 2572 generic.go:358] "Generic (PLEG): container finished" podID="9136f542-e29a-475e-9ef4-a5653b964224" containerID="a2e8b8cb9c68664c960a62da1192172630f2ab9b879e73df8b5ef6ffeeb6d16a" exitCode=0
Apr 28 19:16:46.299240 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:46.298689 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sbv79" event={"ID":"9136f542-e29a-475e-9ef4-a5653b964224","Type":"ContainerDied","Data":"a2e8b8cb9c68664c960a62da1192172630f2ab9b879e73df8b5ef6ffeeb6d16a"}
Apr 28 19:16:47.146004 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:47.145757 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fvgjn"]
Apr 28 19:16:47.146182 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:47.146132 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fvgjn"
Apr 28 19:16:47.146309 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:47.146286 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fvgjn" podUID="f4fd0bdf-e317-4621-b4cf-c41c8e666b62"
Apr 28 19:16:47.146677 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:47.146628 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-t8qc5"]
Apr 28 19:16:47.146866 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:47.146771 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t8qc5"
Apr 28 19:16:47.146939 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:47.146870 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t8qc5" podUID="d7e741f6-720c-42da-b861-0d9702bff94d"
Apr 28 19:16:47.147318 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:47.147299 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ggtvc"]
Apr 28 19:16:47.147417 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:47.147403 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ggtvc"
Apr 28 19:16:47.147519 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:47.147502 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ggtvc" podUID="0a2d2c94-3731-45d6-be85-14a1a081468a"
Apr 28 19:16:47.302909 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:47.302826 2572 generic.go:358] "Generic (PLEG): container finished" podID="9136f542-e29a-475e-9ef4-a5653b964224" containerID="348b49afcb2e28d5150405412591a436948bc66909ac2b1440148f4db7036510" exitCode=0
Apr 28 19:16:47.303303 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:47.302903 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sbv79" event={"ID":"9136f542-e29a-475e-9ef4-a5653b964224","Type":"ContainerDied","Data":"348b49afcb2e28d5150405412591a436948bc66909ac2b1440148f4db7036510"}
Apr 28 19:16:48.306722 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:48.306640 2572 generic.go:358] "Generic (PLEG): container finished" podID="9136f542-e29a-475e-9ef4-a5653b964224" containerID="fe531f53541933d86ed02bd36fe44332bac542274af1f2b5b93450309208b71e" exitCode=0
Apr 28 19:16:48.306722 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:48.306691 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sbv79" event={"ID":"9136f542-e29a-475e-9ef4-a5653b964224","Type":"ContainerDied","Data":"fe531f53541933d86ed02bd36fe44332bac542274af1f2b5b93450309208b71e"}
Apr 28 19:16:49.163669 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:49.163637 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fvgjn"
Apr 28 19:16:49.163842 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:49.163634 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ggtvc"
Apr 28 19:16:49.163842 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:49.163774 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fvgjn" podUID="f4fd0bdf-e317-4621-b4cf-c41c8e666b62"
Apr 28 19:16:49.163842 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:49.163637 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t8qc5"
Apr 28 19:16:49.163989 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:49.163878 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ggtvc" podUID="0a2d2c94-3731-45d6-be85-14a1a081468a"
Apr 28 19:16:49.163989 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:49.163964 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t8qc5" podUID="d7e741f6-720c-42da-b861-0d9702bff94d"
Apr 28 19:16:51.164021 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:51.163981 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t8qc5"
Apr 28 19:16:51.164836 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:51.164103 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t8qc5" podUID="d7e741f6-720c-42da-b861-0d9702bff94d"
Apr 28 19:16:51.164836 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:51.164113 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fvgjn"
Apr 28 19:16:51.164836 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:51.164141 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ggtvc"
Apr 28 19:16:51.164836 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:51.164192 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fvgjn" podUID="f4fd0bdf-e317-4621-b4cf-c41c8e666b62"
Apr 28 19:16:51.164836 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:51.164289 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ggtvc" podUID="0a2d2c94-3731-45d6-be85-14a1a081468a"
Apr 28 19:16:53.162984 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:53.162944 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fvgjn"
Apr 28 19:16:53.162984 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:53.162986 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ggtvc"
Apr 28 19:16:53.163613 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:53.163077 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fvgjn" podUID="f4fd0bdf-e317-4621-b4cf-c41c8e666b62"
Apr 28 19:16:53.163613 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:53.163196 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ggtvc" podUID="0a2d2c94-3731-45d6-be85-14a1a081468a"
Apr 28 19:16:53.163613 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:53.163255 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t8qc5"
Apr 28 19:16:53.163613 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:53.163384 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t8qc5" podUID="d7e741f6-720c-42da-b861-0d9702bff94d"
Apr 28 19:16:53.400028 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:53.399999 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-230.ec2.internal" event="NodeReady"
Apr 28 19:16:53.400236 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:53.400133 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 28 19:16:53.464947 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:53.464915 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-7826r"]
Apr 28 19:16:53.495079 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:53.495044 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-wl2q4"]
Apr 28 19:16:53.495252 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:53.495162 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7826r"
Apr 28 19:16:53.497852 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:53.497814 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 28 19:16:53.497970 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:53.497945 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 28 19:16:53.498027 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:53.497818 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-6qdcw\""
Apr 28 19:16:53.510027 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:53.510002 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7826r"]
Apr 28 19:16:53.510143 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:53.510084 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wl2q4"]
Apr 28 19:16:53.510196 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:53.510150 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wl2q4"
Apr 28 19:16:53.514572 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:53.514524 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 28 19:16:53.514692 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:53.514591 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 28 19:16:53.514761 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:53.514725 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 28 19:16:53.514971 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:53.514944 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pzmft\""
Apr 28 19:16:53.600793 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:53.600754 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfzhm\" (UniqueName: \"kubernetes.io/projected/f72b413b-cc77-44a9-9bab-db02d985886b-kube-api-access-hfzhm\") pod \"dns-default-7826r\" (UID: \"f72b413b-cc77-44a9-9bab-db02d985886b\") " pod="openshift-dns/dns-default-7826r"
Apr 28 19:16:53.601013 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:53.600804 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f72b413b-cc77-44a9-9bab-db02d985886b-metrics-tls\") pod \"dns-default-7826r\" (UID: \"f72b413b-cc77-44a9-9bab-db02d985886b\") " pod="openshift-dns/dns-default-7826r"
Apr 28 19:16:53.601013 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:53.600835 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f72b413b-cc77-44a9-9bab-db02d985886b-config-volume\") pod \"dns-default-7826r\" (UID: \"f72b413b-cc77-44a9-9bab-db02d985886b\") " pod="openshift-dns/dns-default-7826r"
Apr 28 19:16:53.601013 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:53.600914 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f72b413b-cc77-44a9-9bab-db02d985886b-tmp-dir\") pod \"dns-default-7826r\" (UID: \"f72b413b-cc77-44a9-9bab-db02d985886b\") " pod="openshift-dns/dns-default-7826r"
Apr 28 19:16:53.601013 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:53.600951 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e361f8d4-ac75-4139-b8c7-fada2556f305-cert\") pod \"ingress-canary-wl2q4\" (UID: \"e361f8d4-ac75-4139-b8c7-fada2556f305\") " pod="openshift-ingress-canary/ingress-canary-wl2q4"
Apr 28 19:16:53.601234 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:53.601042 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jcqw\" (UniqueName: \"kubernetes.io/projected/e361f8d4-ac75-4139-b8c7-fada2556f305-kube-api-access-7jcqw\") pod \"ingress-canary-wl2q4\" (UID: \"e361f8d4-ac75-4139-b8c7-fada2556f305\") " pod="openshift-ingress-canary/ingress-canary-wl2q4"
Apr 28 19:16:53.702005 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:53.701916 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hfzhm\" (UniqueName: \"kubernetes.io/projected/f72b413b-cc77-44a9-9bab-db02d985886b-kube-api-access-hfzhm\") pod \"dns-default-7826r\" (UID: \"f72b413b-cc77-44a9-9bab-db02d985886b\") " pod="openshift-dns/dns-default-7826r"
Apr 28 19:16:53.702005 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:53.701959 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f72b413b-cc77-44a9-9bab-db02d985886b-metrics-tls\") pod \"dns-default-7826r\" (UID: \"f72b413b-cc77-44a9-9bab-db02d985886b\") " pod="openshift-dns/dns-default-7826r"
Apr 28 19:16:53.702005 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:53.701986 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f72b413b-cc77-44a9-9bab-db02d985886b-config-volume\") pod \"dns-default-7826r\" (UID: \"f72b413b-cc77-44a9-9bab-db02d985886b\") " pod="openshift-dns/dns-default-7826r"
Apr 28 19:16:53.702310 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:53.702018 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f72b413b-cc77-44a9-9bab-db02d985886b-tmp-dir\") pod \"dns-default-7826r\" (UID: \"f72b413b-cc77-44a9-9bab-db02d985886b\") " pod="openshift-dns/dns-default-7826r"
Apr 28 19:16:53.702310 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:53.702045 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e361f8d4-ac75-4139-b8c7-fada2556f305-cert\") pod \"ingress-canary-wl2q4\" (UID: \"e361f8d4-ac75-4139-b8c7-fada2556f305\") " pod="openshift-ingress-canary/ingress-canary-wl2q4"
Apr 28 19:16:53.702310 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:53.702085 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:16:53.702310 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:53.702130 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:16:53.702310 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:53.702151 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f72b413b-cc77-44a9-9bab-db02d985886b-metrics-tls podName:f72b413b-cc77-44a9-9bab-db02d985886b nodeName:}" failed. No retries permitted until 2026-04-28 19:16:54.20213108 +0000 UTC m=+33.629631693 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f72b413b-cc77-44a9-9bab-db02d985886b-metrics-tls") pod "dns-default-7826r" (UID: "f72b413b-cc77-44a9-9bab-db02d985886b") : secret "dns-default-metrics-tls" not found
Apr 28 19:16:53.702310 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:53.702169 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e361f8d4-ac75-4139-b8c7-fada2556f305-cert podName:e361f8d4-ac75-4139-b8c7-fada2556f305 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:54.202160192 +0000 UTC m=+33.629660801 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e361f8d4-ac75-4139-b8c7-fada2556f305-cert") pod "ingress-canary-wl2q4" (UID: "e361f8d4-ac75-4139-b8c7-fada2556f305") : secret "canary-serving-cert" not found
Apr 28 19:16:53.702310 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:53.702215 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7jcqw\" (UniqueName: \"kubernetes.io/projected/e361f8d4-ac75-4139-b8c7-fada2556f305-kube-api-access-7jcqw\") pod \"ingress-canary-wl2q4\" (UID: \"e361f8d4-ac75-4139-b8c7-fada2556f305\") " pod="openshift-ingress-canary/ingress-canary-wl2q4"
Apr 28 19:16:53.702652 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:53.702569 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f72b413b-cc77-44a9-9bab-db02d985886b-tmp-dir\") pod \"dns-default-7826r\" (UID: \"f72b413b-cc77-44a9-9bab-db02d985886b\") " pod="openshift-dns/dns-default-7826r"
Apr 28 19:16:53.702747 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:53.702724 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f72b413b-cc77-44a9-9bab-db02d985886b-config-volume\") pod \"dns-default-7826r\" (UID: \"f72b413b-cc77-44a9-9bab-db02d985886b\") " pod="openshift-dns/dns-default-7826r"
Apr 28 19:16:53.713807 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:53.713620 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfzhm\" (UniqueName: \"kubernetes.io/projected/f72b413b-cc77-44a9-9bab-db02d985886b-kube-api-access-hfzhm\") pod \"dns-default-7826r\" (UID: \"f72b413b-cc77-44a9-9bab-db02d985886b\") " pod="openshift-dns/dns-default-7826r"
Apr 28 19:16:53.715155 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:53.715128 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jcqw\" (UniqueName: \"kubernetes.io/projected/e361f8d4-ac75-4139-b8c7-fada2556f305-kube-api-access-7jcqw\") pod \"ingress-canary-wl2q4\" (UID: \"e361f8d4-ac75-4139-b8c7-fada2556f305\") " pod="openshift-ingress-canary/ingress-canary-wl2q4"
Apr 28 19:16:54.207327 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:54.207281 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e361f8d4-ac75-4139-b8c7-fada2556f305-cert\") pod \"ingress-canary-wl2q4\" (UID: \"e361f8d4-ac75-4139-b8c7-fada2556f305\") " pod="openshift-ingress-canary/ingress-canary-wl2q4"
Apr 28 19:16:54.207720 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:54.207371 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f72b413b-cc77-44a9-9bab-db02d985886b-metrics-tls\") pod \"dns-default-7826r\" (UID: \"f72b413b-cc77-44a9-9bab-db02d985886b\") " pod="openshift-dns/dns-default-7826r"
Apr 28 19:16:54.207720 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:54.207455 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:16:54.207720 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:54.207457 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:16:54.207720 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:54.207506 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f72b413b-cc77-44a9-9bab-db02d985886b-metrics-tls podName:f72b413b-cc77-44a9-9bab-db02d985886b nodeName:}" failed. No retries permitted until 2026-04-28 19:16:55.207491778 +0000 UTC m=+34.634992373 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f72b413b-cc77-44a9-9bab-db02d985886b-metrics-tls") pod "dns-default-7826r" (UID: "f72b413b-cc77-44a9-9bab-db02d985886b") : secret "dns-default-metrics-tls" not found
Apr 28 19:16:54.207720 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:54.207520 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e361f8d4-ac75-4139-b8c7-fada2556f305-cert podName:e361f8d4-ac75-4139-b8c7-fada2556f305 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:55.207513175 +0000 UTC m=+34.635013769 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e361f8d4-ac75-4139-b8c7-fada2556f305-cert") pod "ingress-canary-wl2q4" (UID: "e361f8d4-ac75-4139-b8c7-fada2556f305") : secret "canary-serving-cert" not found
Apr 28 19:16:54.812215 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:54.812107 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a2d2c94-3731-45d6-be85-14a1a081468a-metrics-certs\") pod \"network-metrics-daemon-ggtvc\" (UID: \"0a2d2c94-3731-45d6-be85-14a1a081468a\") " pod="openshift-multus/network-metrics-daemon-ggtvc"
Apr 28 19:16:54.812446 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:54.812284 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:16:54.812446 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:54.812356 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a2d2c94-3731-45d6-be85-14a1a081468a-metrics-certs podName:0a2d2c94-3731-45d6-be85-14a1a081468a nodeName:}" failed. No retries permitted until 2026-04-28 19:17:26.812339238 +0000 UTC m=+66.239839834 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0a2d2c94-3731-45d6-be85-14a1a081468a-metrics-certs") pod "network-metrics-daemon-ggtvc" (UID: "0a2d2c94-3731-45d6-be85-14a1a081468a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:16:54.912667 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:54.912627 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cpl62\" (UniqueName: \"kubernetes.io/projected/d7e741f6-720c-42da-b861-0d9702bff94d-kube-api-access-cpl62\") pod \"network-check-target-t8qc5\" (UID: \"d7e741f6-720c-42da-b861-0d9702bff94d\") " pod="openshift-network-diagnostics/network-check-target-t8qc5"
Apr 28 19:16:54.912828 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:54.912808 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 28 19:16:54.912866 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:54.912834 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 28 19:16:54.912866 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:54.912847 2572 projected.go:194] Error preparing data for projected volume kube-api-access-cpl62 for pod openshift-network-diagnostics/network-check-target-t8qc5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:16:54.912927 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:54.912902 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d7e741f6-720c-42da-b861-0d9702bff94d-kube-api-access-cpl62 podName:d7e741f6-720c-42da-b861-0d9702bff94d nodeName:}" failed. No retries permitted until 2026-04-28 19:17:26.91288796 +0000 UTC m=+66.340388555 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cpl62" (UniqueName: "kubernetes.io/projected/d7e741f6-720c-42da-b861-0d9702bff94d-kube-api-access-cpl62") pod "network-check-target-t8qc5" (UID: "d7e741f6-720c-42da-b861-0d9702bff94d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:16:55.163660 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:55.163624 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t8qc5"
Apr 28 19:16:55.163660 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:55.163650 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ggtvc"
Apr 28 19:16:55.163912 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:55.163624 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fvgjn"
Apr 28 19:16:55.168125 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:55.168101 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 28 19:16:55.168289 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:55.168136 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 28 19:16:55.168289 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:55.168269 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 28 19:16:55.169237 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:55.169219 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wz9tz\""
Apr 28 19:16:55.170937 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:55.170920 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 28 19:16:55.171042 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:55.170986 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tslnt\""
Apr 28 19:16:55.214478 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:55.214436 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f72b413b-cc77-44a9-9bab-db02d985886b-metrics-tls\") pod \"dns-default-7826r\" (UID: \"f72b413b-cc77-44a9-9bab-db02d985886b\") " pod="openshift-dns/dns-default-7826r"
Apr 28 19:16:55.214478 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:55.214483 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e361f8d4-ac75-4139-b8c7-fada2556f305-cert\") pod \"ingress-canary-wl2q4\" (UID: \"e361f8d4-ac75-4139-b8c7-fada2556f305\") " pod="openshift-ingress-canary/ingress-canary-wl2q4"
Apr 28 19:16:55.214860 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:55.214601 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:16:55.214860 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:55.214606 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:16:55.214860 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:55.214657 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e361f8d4-ac75-4139-b8c7-fada2556f305-cert podName:e361f8d4-ac75-4139-b8c7-fada2556f305 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:57.214638792 +0000 UTC m=+36.642139387 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e361f8d4-ac75-4139-b8c7-fada2556f305-cert") pod "ingress-canary-wl2q4" (UID: "e361f8d4-ac75-4139-b8c7-fada2556f305") : secret "canary-serving-cert" not found
Apr 28 19:16:55.214860 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:55.214670 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f72b413b-cc77-44a9-9bab-db02d985886b-metrics-tls podName:f72b413b-cc77-44a9-9bab-db02d985886b nodeName:}" failed. No retries permitted until 2026-04-28 19:16:57.214664893 +0000 UTC m=+36.642165488 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f72b413b-cc77-44a9-9bab-db02d985886b-metrics-tls") pod "dns-default-7826r" (UID: "f72b413b-cc77-44a9-9bab-db02d985886b") : secret "dns-default-metrics-tls" not found
Apr 28 19:16:55.324330 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:55.324297 2572 generic.go:358] "Generic (PLEG): container finished" podID="9136f542-e29a-475e-9ef4-a5653b964224" containerID="1c3f3f2e01940061241093f5efd2dfa51ff522f158d04f004418265aa3966f14" exitCode=0
Apr 28 19:16:55.324491 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:55.324355 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sbv79" event={"ID":"9136f542-e29a-475e-9ef4-a5653b964224","Type":"ContainerDied","Data":"1c3f3f2e01940061241093f5efd2dfa51ff522f158d04f004418265aa3966f14"}
Apr 28 19:16:56.329080 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:56.329045 2572 generic.go:358] "Generic (PLEG): container finished" podID="9136f542-e29a-475e-9ef4-a5653b964224" containerID="02cb48505f9e45c97a7a3f201b5648e153cb70ec34dd66df93c1d109080f7b25" exitCode=0
Apr 28 19:16:56.329080 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:56.329084 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sbv79" event={"ID":"9136f542-e29a-475e-9ef4-a5653b964224","Type":"ContainerDied","Data":"02cb48505f9e45c97a7a3f201b5648e153cb70ec34dd66df93c1d109080f7b25"}
Apr 28 19:16:56.625338 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:56.625301 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f4fd0bdf-e317-4621-b4cf-c41c8e666b62-original-pull-secret\") pod \"global-pull-secret-syncer-fvgjn\" (UID: \"f4fd0bdf-e317-4621-b4cf-c41c8e666b62\") " pod="kube-system/global-pull-secret-syncer-fvgjn"
Apr 28 19:16:56.628403 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:56.628381 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f4fd0bdf-e317-4621-b4cf-c41c8e666b62-original-pull-secret\") pod \"global-pull-secret-syncer-fvgjn\" (UID: \"f4fd0bdf-e317-4621-b4cf-c41c8e666b62\") " pod="kube-system/global-pull-secret-syncer-fvgjn"
Apr 28 19:16:56.679407 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:56.679372 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fvgjn"
Apr 28 19:16:56.836628 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:56.836383 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fvgjn"]
Apr 28 19:16:56.841671 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:16:56.841643 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4fd0bdf_e317_4621_b4cf_c41c8e666b62.slice/crio-dbe9b47d8f2d5e8762d026358fa56f91b1e26aad29a7747cd7e6ea2459fcc447 WatchSource:0}: Error finding container dbe9b47d8f2d5e8762d026358fa56f91b1e26aad29a7747cd7e6ea2459fcc447: Status 404 returned error can't find the container with id dbe9b47d8f2d5e8762d026358fa56f91b1e26aad29a7747cd7e6ea2459fcc447
Apr 28 19:16:57.229111 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:57.229075 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f72b413b-cc77-44a9-9bab-db02d985886b-metrics-tls\") pod \"dns-default-7826r\" (UID: \"f72b413b-cc77-44a9-9bab-db02d985886b\") " pod="openshift-dns/dns-default-7826r"
Apr 28 19:16:57.229314 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:57.229131 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e361f8d4-ac75-4139-b8c7-fada2556f305-cert\") pod \"ingress-canary-wl2q4\" (UID: \"e361f8d4-ac75-4139-b8c7-fada2556f305\") " pod="openshift-ingress-canary/ingress-canary-wl2q4"
Apr 28 19:16:57.229314 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:57.229273 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:16:57.229427 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:57.229361 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f72b413b-cc77-44a9-9bab-db02d985886b-metrics-tls podName:f72b413b-cc77-44a9-9bab-db02d985886b nodeName:}" failed. No retries permitted until 2026-04-28 19:17:01.229337965 +0000 UTC m=+40.656838584 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f72b413b-cc77-44a9-9bab-db02d985886b-metrics-tls") pod "dns-default-7826r" (UID: "f72b413b-cc77-44a9-9bab-db02d985886b") : secret "dns-default-metrics-tls" not found
Apr 28 19:16:57.229427 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:57.229275 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:16:57.229515 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:16:57.229447 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e361f8d4-ac75-4139-b8c7-fada2556f305-cert podName:e361f8d4-ac75-4139-b8c7-fada2556f305 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:01.229427171 +0000 UTC m=+40.656927787 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e361f8d4-ac75-4139-b8c7-fada2556f305-cert") pod "ingress-canary-wl2q4" (UID: "e361f8d4-ac75-4139-b8c7-fada2556f305") : secret "canary-serving-cert" not found Apr 28 19:16:57.332736 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:57.332697 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-fvgjn" event={"ID":"f4fd0bdf-e317-4621-b4cf-c41c8e666b62","Type":"ContainerStarted","Data":"dbe9b47d8f2d5e8762d026358fa56f91b1e26aad29a7747cd7e6ea2459fcc447"} Apr 28 19:16:57.335673 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:57.335649 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sbv79" event={"ID":"9136f542-e29a-475e-9ef4-a5653b964224","Type":"ContainerStarted","Data":"2451de79b57711eb3205b57da1baf21afd38340068d7337667df244cabb0ca59"} Apr 28 19:16:57.360768 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:16:57.360719 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-sbv79" podStartSLOduration=5.792403523 podStartE2EDuration="36.360704191s" podCreationTimestamp="2026-04-28 19:16:21 +0000 UTC" firstStartedPulling="2026-04-28 19:16:23.724563863 +0000 UTC m=+3.152064462" lastFinishedPulling="2026-04-28 19:16:54.292864533 +0000 UTC m=+33.720365130" observedRunningTime="2026-04-28 19:16:57.359213033 +0000 UTC m=+36.786713644" watchObservedRunningTime="2026-04-28 19:16:57.360704191 +0000 UTC m=+36.788204809" Apr 28 19:17:01.257665 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:01.257624 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f72b413b-cc77-44a9-9bab-db02d985886b-metrics-tls\") pod \"dns-default-7826r\" (UID: \"f72b413b-cc77-44a9-9bab-db02d985886b\") " pod="openshift-dns/dns-default-7826r" Apr 28 19:17:01.257665 
ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:01.257671 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e361f8d4-ac75-4139-b8c7-fada2556f305-cert\") pod \"ingress-canary-wl2q4\" (UID: \"e361f8d4-ac75-4139-b8c7-fada2556f305\") " pod="openshift-ingress-canary/ingress-canary-wl2q4" Apr 28 19:17:01.258345 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:17:01.257769 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 28 19:17:01.258345 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:17:01.257783 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 28 19:17:01.258345 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:17:01.257821 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e361f8d4-ac75-4139-b8c7-fada2556f305-cert podName:e361f8d4-ac75-4139-b8c7-fada2556f305 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:09.257806401 +0000 UTC m=+48.685306996 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e361f8d4-ac75-4139-b8c7-fada2556f305-cert") pod "ingress-canary-wl2q4" (UID: "e361f8d4-ac75-4139-b8c7-fada2556f305") : secret "canary-serving-cert" not found Apr 28 19:17:01.258345 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:17:01.257846 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f72b413b-cc77-44a9-9bab-db02d985886b-metrics-tls podName:f72b413b-cc77-44a9-9bab-db02d985886b nodeName:}" failed. No retries permitted until 2026-04-28 19:17:09.257828545 +0000 UTC m=+48.685329145 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f72b413b-cc77-44a9-9bab-db02d985886b-metrics-tls") pod "dns-default-7826r" (UID: "f72b413b-cc77-44a9-9bab-db02d985886b") : secret "dns-default-metrics-tls" not found Apr 28 19:17:01.344113 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:01.344070 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-fvgjn" event={"ID":"f4fd0bdf-e317-4621-b4cf-c41c8e666b62","Type":"ContainerStarted","Data":"f1f5fddef43d72fcf17db77f552ec29c75374cdda67c95d7965b177a10c90bd7"} Apr 28 19:17:01.360923 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:01.360859 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-fvgjn" podStartSLOduration=33.439486715 podStartE2EDuration="37.360844689s" podCreationTimestamp="2026-04-28 19:16:24 +0000 UTC" firstStartedPulling="2026-04-28 19:16:56.843158431 +0000 UTC m=+36.270659026" lastFinishedPulling="2026-04-28 19:17:00.764516404 +0000 UTC m=+40.192017000" observedRunningTime="2026-04-28 19:17:01.360314118 +0000 UTC m=+40.787814735" watchObservedRunningTime="2026-04-28 19:17:01.360844689 +0000 UTC m=+40.788345305" Apr 28 19:17:09.310738 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:09.310692 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f72b413b-cc77-44a9-9bab-db02d985886b-metrics-tls\") pod \"dns-default-7826r\" (UID: \"f72b413b-cc77-44a9-9bab-db02d985886b\") " pod="openshift-dns/dns-default-7826r" Apr 28 19:17:09.310738 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:09.310745 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e361f8d4-ac75-4139-b8c7-fada2556f305-cert\") pod \"ingress-canary-wl2q4\" (UID: \"e361f8d4-ac75-4139-b8c7-fada2556f305\") " 
pod="openshift-ingress-canary/ingress-canary-wl2q4" Apr 28 19:17:09.311303 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:17:09.310841 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 28 19:17:09.311303 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:17:09.310859 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 28 19:17:09.311303 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:17:09.310903 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e361f8d4-ac75-4139-b8c7-fada2556f305-cert podName:e361f8d4-ac75-4139-b8c7-fada2556f305 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:25.310889216 +0000 UTC m=+64.738389811 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e361f8d4-ac75-4139-b8c7-fada2556f305-cert") pod "ingress-canary-wl2q4" (UID: "e361f8d4-ac75-4139-b8c7-fada2556f305") : secret "canary-serving-cert" not found Apr 28 19:17:09.311303 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:17:09.310937 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f72b413b-cc77-44a9-9bab-db02d985886b-metrics-tls podName:f72b413b-cc77-44a9-9bab-db02d985886b nodeName:}" failed. No retries permitted until 2026-04-28 19:17:25.310917612 +0000 UTC m=+64.738418221 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f72b413b-cc77-44a9-9bab-db02d985886b-metrics-tls") pod "dns-default-7826r" (UID: "f72b413b-cc77-44a9-9bab-db02d985886b") : secret "dns-default-metrics-tls" not found Apr 28 19:17:14.321042 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:14.320998 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-dcfc455bf-p2k78"] Apr 28 19:17:14.324146 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:14.324130 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-dcfc455bf-p2k78" Apr 28 19:17:14.327240 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:14.327217 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 28 19:17:14.327486 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:14.327469 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 28 19:17:14.327605 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:14.327575 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 28 19:17:14.327783 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:14.327769 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 28 19:17:14.328015 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:14.327997 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 28 19:17:14.328571 ip-10-0-140-230 kubenswrapper[2572]: I0428 
19:17:14.328552 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 28 19:17:14.328675 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:14.328557 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 28 19:17:14.345215 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:14.345173 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/faf72846-cb03-4e4c-986f-947dea5fb0fb-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-dcfc455bf-p2k78\" (UID: \"faf72846-cb03-4e4c-986f-947dea5fb0fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-dcfc455bf-p2k78" Apr 28 19:17:14.345342 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:14.345225 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/faf72846-cb03-4e4c-986f-947dea5fb0fb-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-dcfc455bf-p2k78\" (UID: \"faf72846-cb03-4e4c-986f-947dea5fb0fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-dcfc455bf-p2k78" Apr 28 19:17:14.345342 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:14.345279 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n27h\" (UniqueName: \"kubernetes.io/projected/faf72846-cb03-4e4c-986f-947dea5fb0fb-kube-api-access-8n27h\") pod \"cluster-proxy-proxy-agent-dcfc455bf-p2k78\" (UID: \"faf72846-cb03-4e4c-986f-947dea5fb0fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-dcfc455bf-p2k78" Apr 28 19:17:14.345342 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:14.345312 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/faf72846-cb03-4e4c-986f-947dea5fb0fb-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-dcfc455bf-p2k78\" (UID: \"faf72846-cb03-4e4c-986f-947dea5fb0fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-dcfc455bf-p2k78" Apr 28 19:17:14.345342 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:14.345331 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/faf72846-cb03-4e4c-986f-947dea5fb0fb-ca\") pod \"cluster-proxy-proxy-agent-dcfc455bf-p2k78\" (UID: \"faf72846-cb03-4e4c-986f-947dea5fb0fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-dcfc455bf-p2k78" Apr 28 19:17:14.345509 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:14.345374 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/faf72846-cb03-4e4c-986f-947dea5fb0fb-hub\") pod \"cluster-proxy-proxy-agent-dcfc455bf-p2k78\" (UID: \"faf72846-cb03-4e4c-986f-947dea5fb0fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-dcfc455bf-p2k78" Apr 28 19:17:14.349811 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:14.349786 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-dcfc455bf-p2k78"] Apr 28 19:17:14.445851 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:14.445800 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/faf72846-cb03-4e4c-986f-947dea5fb0fb-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-dcfc455bf-p2k78\" (UID: \"faf72846-cb03-4e4c-986f-947dea5fb0fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-dcfc455bf-p2k78" Apr 28 19:17:14.445851 
ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:14.445855 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/faf72846-cb03-4e4c-986f-947dea5fb0fb-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-dcfc455bf-p2k78\" (UID: \"faf72846-cb03-4e4c-986f-947dea5fb0fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-dcfc455bf-p2k78" Apr 28 19:17:14.446098 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:14.445899 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8n27h\" (UniqueName: \"kubernetes.io/projected/faf72846-cb03-4e4c-986f-947dea5fb0fb-kube-api-access-8n27h\") pod \"cluster-proxy-proxy-agent-dcfc455bf-p2k78\" (UID: \"faf72846-cb03-4e4c-986f-947dea5fb0fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-dcfc455bf-p2k78" Apr 28 19:17:14.446098 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:14.445960 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/faf72846-cb03-4e4c-986f-947dea5fb0fb-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-dcfc455bf-p2k78\" (UID: \"faf72846-cb03-4e4c-986f-947dea5fb0fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-dcfc455bf-p2k78" Apr 28 19:17:14.446098 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:14.445978 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/faf72846-cb03-4e4c-986f-947dea5fb0fb-ca\") pod \"cluster-proxy-proxy-agent-dcfc455bf-p2k78\" (UID: \"faf72846-cb03-4e4c-986f-947dea5fb0fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-dcfc455bf-p2k78" Apr 28 19:17:14.446098 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:14.446007 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" 
(UniqueName: \"kubernetes.io/secret/faf72846-cb03-4e4c-986f-947dea5fb0fb-hub\") pod \"cluster-proxy-proxy-agent-dcfc455bf-p2k78\" (UID: \"faf72846-cb03-4e4c-986f-947dea5fb0fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-dcfc455bf-p2k78" Apr 28 19:17:14.446659 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:14.446633 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/faf72846-cb03-4e4c-986f-947dea5fb0fb-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-dcfc455bf-p2k78\" (UID: \"faf72846-cb03-4e4c-986f-947dea5fb0fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-dcfc455bf-p2k78" Apr 28 19:17:14.449103 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:14.449077 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/faf72846-cb03-4e4c-986f-947dea5fb0fb-hub\") pod \"cluster-proxy-proxy-agent-dcfc455bf-p2k78\" (UID: \"faf72846-cb03-4e4c-986f-947dea5fb0fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-dcfc455bf-p2k78" Apr 28 19:17:14.449326 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:14.449181 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/faf72846-cb03-4e4c-986f-947dea5fb0fb-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-dcfc455bf-p2k78\" (UID: \"faf72846-cb03-4e4c-986f-947dea5fb0fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-dcfc455bf-p2k78" Apr 28 19:17:14.449326 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:14.449215 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/faf72846-cb03-4e4c-986f-947dea5fb0fb-ca\") pod \"cluster-proxy-proxy-agent-dcfc455bf-p2k78\" (UID: \"faf72846-cb03-4e4c-986f-947dea5fb0fb\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-dcfc455bf-p2k78" Apr 28 19:17:14.449453 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:14.449434 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/faf72846-cb03-4e4c-986f-947dea5fb0fb-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-dcfc455bf-p2k78\" (UID: \"faf72846-cb03-4e4c-986f-947dea5fb0fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-dcfc455bf-p2k78" Apr 28 19:17:14.459576 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:14.459550 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n27h\" (UniqueName: \"kubernetes.io/projected/faf72846-cb03-4e4c-986f-947dea5fb0fb-kube-api-access-8n27h\") pod \"cluster-proxy-proxy-agent-dcfc455bf-p2k78\" (UID: \"faf72846-cb03-4e4c-986f-947dea5fb0fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-dcfc455bf-p2k78" Apr 28 19:17:14.648802 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:14.648767 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-dcfc455bf-p2k78" Apr 28 19:17:14.770331 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:14.770302 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-dcfc455bf-p2k78"] Apr 28 19:17:14.773141 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:17:14.773114 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfaf72846_cb03_4e4c_986f_947dea5fb0fb.slice/crio-ef77e4924b1432e52ce3d9769dc1b0cbd65daf57766ce8a84363d84a110f9380 WatchSource:0}: Error finding container ef77e4924b1432e52ce3d9769dc1b0cbd65daf57766ce8a84363d84a110f9380: Status 404 returned error can't find the container with id ef77e4924b1432e52ce3d9769dc1b0cbd65daf57766ce8a84363d84a110f9380 Apr 28 19:17:15.372019 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:15.371982 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-dcfc455bf-p2k78" event={"ID":"faf72846-cb03-4e4c-986f-947dea5fb0fb","Type":"ContainerStarted","Data":"ef77e4924b1432e52ce3d9769dc1b0cbd65daf57766ce8a84363d84a110f9380"} Apr 28 19:17:17.321181 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:17.321152 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wv7qh" Apr 28 19:17:18.379588 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:18.379556 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-dcfc455bf-p2k78" event={"ID":"faf72846-cb03-4e4c-986f-947dea5fb0fb","Type":"ContainerStarted","Data":"5902cb9de6b7b08df76c276876b5968af2c270e23d3505fae5c48350bf726602"} Apr 28 19:17:21.387659 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:21.387614 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-dcfc455bf-p2k78" event={"ID":"faf72846-cb03-4e4c-986f-947dea5fb0fb","Type":"ContainerStarted","Data":"e390ef80604c2b4d250045bc0c66b4b78d73ff58140fba650da24107120868a5"} Apr 28 19:17:21.387659 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:21.387663 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-dcfc455bf-p2k78" event={"ID":"faf72846-cb03-4e4c-986f-947dea5fb0fb","Type":"ContainerStarted","Data":"15128580b8e1e2eb5fcaf2dc08190d636e558de4b33586ace39e0e6a13e5f16e"} Apr 28 19:17:21.410450 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:21.410398 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-dcfc455bf-p2k78" podStartSLOduration=1.487795248 podStartE2EDuration="7.410382659s" podCreationTimestamp="2026-04-28 19:17:14 +0000 UTC" firstStartedPulling="2026-04-28 19:17:14.774842133 +0000 UTC m=+54.202342728" lastFinishedPulling="2026-04-28 19:17:20.697429544 +0000 UTC m=+60.124930139" observedRunningTime="2026-04-28 19:17:21.409311137 +0000 UTC m=+60.836811752" watchObservedRunningTime="2026-04-28 19:17:21.410382659 +0000 UTC m=+60.837883254" Apr 28 19:17:25.326671 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:25.326624 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f72b413b-cc77-44a9-9bab-db02d985886b-metrics-tls\") pod \"dns-default-7826r\" (UID: \"f72b413b-cc77-44a9-9bab-db02d985886b\") " pod="openshift-dns/dns-default-7826r" Apr 28 19:17:25.326671 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:25.326677 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e361f8d4-ac75-4139-b8c7-fada2556f305-cert\") pod \"ingress-canary-wl2q4\" (UID: \"e361f8d4-ac75-4139-b8c7-fada2556f305\") " 
pod="openshift-ingress-canary/ingress-canary-wl2q4" Apr 28 19:17:25.327161 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:17:25.326771 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 28 19:17:25.327161 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:17:25.326852 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f72b413b-cc77-44a9-9bab-db02d985886b-metrics-tls podName:f72b413b-cc77-44a9-9bab-db02d985886b nodeName:}" failed. No retries permitted until 2026-04-28 19:17:57.326835267 +0000 UTC m=+96.754335867 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f72b413b-cc77-44a9-9bab-db02d985886b-metrics-tls") pod "dns-default-7826r" (UID: "f72b413b-cc77-44a9-9bab-db02d985886b") : secret "dns-default-metrics-tls" not found Apr 28 19:17:25.327161 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:17:25.326774 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 28 19:17:25.327161 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:17:25.326906 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e361f8d4-ac75-4139-b8c7-fada2556f305-cert podName:e361f8d4-ac75-4139-b8c7-fada2556f305 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:57.326894706 +0000 UTC m=+96.754395301 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e361f8d4-ac75-4139-b8c7-fada2556f305-cert") pod "ingress-canary-wl2q4" (UID: "e361f8d4-ac75-4139-b8c7-fada2556f305") : secret "canary-serving-cert" not found Apr 28 19:17:26.839846 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:26.839805 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a2d2c94-3731-45d6-be85-14a1a081468a-metrics-certs\") pod \"network-metrics-daemon-ggtvc\" (UID: \"0a2d2c94-3731-45d6-be85-14a1a081468a\") " pod="openshift-multus/network-metrics-daemon-ggtvc" Apr 28 19:17:26.843467 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:26.843445 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 28 19:17:26.850658 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:17:26.850639 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 28 19:17:26.850735 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:17:26.850700 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a2d2c94-3731-45d6-be85-14a1a081468a-metrics-certs podName:0a2d2c94-3731-45d6-be85-14a1a081468a nodeName:}" failed. No retries permitted until 2026-04-28 19:18:30.850680168 +0000 UTC m=+130.278180763 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0a2d2c94-3731-45d6-be85-14a1a081468a-metrics-certs") pod "network-metrics-daemon-ggtvc" (UID: "0a2d2c94-3731-45d6-be85-14a1a081468a") : secret "metrics-daemon-secret" not found Apr 28 19:17:26.940854 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:26.940812 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cpl62\" (UniqueName: \"kubernetes.io/projected/d7e741f6-720c-42da-b861-0d9702bff94d-kube-api-access-cpl62\") pod \"network-check-target-t8qc5\" (UID: \"d7e741f6-720c-42da-b861-0d9702bff94d\") " pod="openshift-network-diagnostics/network-check-target-t8qc5" Apr 28 19:17:26.944099 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:26.944075 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 28 19:17:26.954259 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:26.954230 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 28 19:17:26.963944 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:26.963919 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpl62\" (UniqueName: \"kubernetes.io/projected/d7e741f6-720c-42da-b861-0d9702bff94d-kube-api-access-cpl62\") pod \"network-check-target-t8qc5\" (UID: \"d7e741f6-720c-42da-b861-0d9702bff94d\") " pod="openshift-network-diagnostics/network-check-target-t8qc5" Apr 28 19:17:26.976067 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:26.976044 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tslnt\"" Apr 28 19:17:26.983287 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:26.983264 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t8qc5" Apr 28 19:17:27.095472 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:27.095398 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-t8qc5"] Apr 28 19:17:27.098696 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:17:27.098658 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7e741f6_720c_42da_b861_0d9702bff94d.slice/crio-1dd0a210375629ecf57c2f208f8240e183e8e8d53e05e7312f71dca085f20275 WatchSource:0}: Error finding container 1dd0a210375629ecf57c2f208f8240e183e8e8d53e05e7312f71dca085f20275: Status 404 returned error can't find the container with id 1dd0a210375629ecf57c2f208f8240e183e8e8d53e05e7312f71dca085f20275 Apr 28 19:17:27.401085 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:27.401052 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-t8qc5" event={"ID":"d7e741f6-720c-42da-b861-0d9702bff94d","Type":"ContainerStarted","Data":"1dd0a210375629ecf57c2f208f8240e183e8e8d53e05e7312f71dca085f20275"} Apr 28 19:17:30.407449 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:30.407408 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-t8qc5" event={"ID":"d7e741f6-720c-42da-b861-0d9702bff94d","Type":"ContainerStarted","Data":"6570327b8968dbf86262915f62d026bc28aa3cc1df8ceee488f62955665b7165"} Apr 28 19:17:30.407886 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:30.407524 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-t8qc5" Apr 28 19:17:30.425756 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:30.425705 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-t8qc5" 
podStartSLOduration=66.829265709 podStartE2EDuration="1m9.425691071s" podCreationTimestamp="2026-04-28 19:16:21 +0000 UTC" firstStartedPulling="2026-04-28 19:17:27.100453799 +0000 UTC m=+66.527954394" lastFinishedPulling="2026-04-28 19:17:29.696879144 +0000 UTC m=+69.124379756" observedRunningTime="2026-04-28 19:17:30.424228492 +0000 UTC m=+69.851729102" watchObservedRunningTime="2026-04-28 19:17:30.425691071 +0000 UTC m=+69.853191687" Apr 28 19:17:57.360092 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:57.360039 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f72b413b-cc77-44a9-9bab-db02d985886b-metrics-tls\") pod \"dns-default-7826r\" (UID: \"f72b413b-cc77-44a9-9bab-db02d985886b\") " pod="openshift-dns/dns-default-7826r" Apr 28 19:17:57.360092 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:17:57.360102 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e361f8d4-ac75-4139-b8c7-fada2556f305-cert\") pod \"ingress-canary-wl2q4\" (UID: \"e361f8d4-ac75-4139-b8c7-fada2556f305\") " pod="openshift-ingress-canary/ingress-canary-wl2q4" Apr 28 19:17:57.360598 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:17:57.360182 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 28 19:17:57.360598 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:17:57.360190 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 28 19:17:57.360598 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:17:57.360265 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e361f8d4-ac75-4139-b8c7-fada2556f305-cert podName:e361f8d4-ac75-4139-b8c7-fada2556f305 nodeName:}" failed. 
No retries permitted until 2026-04-28 19:19:01.360250185 +0000 UTC m=+160.787750779 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e361f8d4-ac75-4139-b8c7-fada2556f305-cert") pod "ingress-canary-wl2q4" (UID: "e361f8d4-ac75-4139-b8c7-fada2556f305") : secret "canary-serving-cert" not found Apr 28 19:17:57.360598 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:17:57.360277 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f72b413b-cc77-44a9-9bab-db02d985886b-metrics-tls podName:f72b413b-cc77-44a9-9bab-db02d985886b nodeName:}" failed. No retries permitted until 2026-04-28 19:19:01.360271663 +0000 UTC m=+160.787772257 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f72b413b-cc77-44a9-9bab-db02d985886b-metrics-tls") pod "dns-default-7826r" (UID: "f72b413b-cc77-44a9-9bab-db02d985886b") : secret "dns-default-metrics-tls" not found Apr 28 19:18:01.411830 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:01.411797 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-t8qc5" Apr 28 19:18:30.889548 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:30.889500 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a2d2c94-3731-45d6-be85-14a1a081468a-metrics-certs\") pod \"network-metrics-daemon-ggtvc\" (UID: \"0a2d2c94-3731-45d6-be85-14a1a081468a\") " pod="openshift-multus/network-metrics-daemon-ggtvc" Apr 28 19:18:30.890092 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:18:30.889645 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 28 19:18:30.890092 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:18:30.889708 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/0a2d2c94-3731-45d6-be85-14a1a081468a-metrics-certs podName:0a2d2c94-3731-45d6-be85-14a1a081468a nodeName:}" failed. No retries permitted until 2026-04-28 19:20:32.889692127 +0000 UTC m=+252.317192726 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0a2d2c94-3731-45d6-be85-14a1a081468a-metrics-certs") pod "network-metrics-daemon-ggtvc" (UID: "0a2d2c94-3731-45d6-be85-14a1a081468a") : secret "metrics-daemon-secret" not found Apr 28 19:18:49.718682 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:49.718642 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnh4q"] Apr 28 19:18:49.721563 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:49.721536 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnh4q" Apr 28 19:18:49.725107 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:49.725071 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 28 19:18:49.725285 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:49.725267 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 28 19:18:49.726243 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:49.726226 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 28 19:18:49.726323 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:49.726226 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-8fkdd\"" Apr 28 19:18:49.726323 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:49.726258 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 28 19:18:49.739502 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:49.739458 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnh4q"] Apr 28 19:18:49.818224 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:49.818167 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwt58\" (UniqueName: \"kubernetes.io/projected/2fa50d25-bdbc-446f-a7b9-de3696866c07-kube-api-access-pwt58\") pod \"kube-storage-version-migrator-operator-6769c5d45-jnh4q\" (UID: \"2fa50d25-bdbc-446f-a7b9-de3696866c07\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnh4q" Apr 28 19:18:49.818403 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:49.818248 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fa50d25-bdbc-446f-a7b9-de3696866c07-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-jnh4q\" (UID: \"2fa50d25-bdbc-446f-a7b9-de3696866c07\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnh4q" Apr 28 19:18:49.818403 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:49.818293 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fa50d25-bdbc-446f-a7b9-de3696866c07-config\") pod 
\"kube-storage-version-migrator-operator-6769c5d45-jnh4q\" (UID: \"2fa50d25-bdbc-446f-a7b9-de3696866c07\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnh4q" Apr 28 19:18:49.918959 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:49.918926 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fa50d25-bdbc-446f-a7b9-de3696866c07-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-jnh4q\" (UID: \"2fa50d25-bdbc-446f-a7b9-de3696866c07\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnh4q" Apr 28 19:18:49.918959 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:49.918978 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pwt58\" (UniqueName: \"kubernetes.io/projected/2fa50d25-bdbc-446f-a7b9-de3696866c07-kube-api-access-pwt58\") pod \"kube-storage-version-migrator-operator-6769c5d45-jnh4q\" (UID: \"2fa50d25-bdbc-446f-a7b9-de3696866c07\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnh4q" Apr 28 19:18:49.919189 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:49.919001 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fa50d25-bdbc-446f-a7b9-de3696866c07-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-jnh4q\" (UID: \"2fa50d25-bdbc-446f-a7b9-de3696866c07\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnh4q" Apr 28 19:18:49.919488 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:49.919461 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fa50d25-bdbc-446f-a7b9-de3696866c07-config\") pod 
\"kube-storage-version-migrator-operator-6769c5d45-jnh4q\" (UID: \"2fa50d25-bdbc-446f-a7b9-de3696866c07\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnh4q" Apr 28 19:18:49.921170 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:49.921149 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fa50d25-bdbc-446f-a7b9-de3696866c07-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-jnh4q\" (UID: \"2fa50d25-bdbc-446f-a7b9-de3696866c07\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnh4q" Apr 28 19:18:49.927285 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:49.927260 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwt58\" (UniqueName: \"kubernetes.io/projected/2fa50d25-bdbc-446f-a7b9-de3696866c07-kube-api-access-pwt58\") pod \"kube-storage-version-migrator-operator-6769c5d45-jnh4q\" (UID: \"2fa50d25-bdbc-446f-a7b9-de3696866c07\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnh4q" Apr 28 19:18:50.029899 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:50.029810 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnh4q" Apr 28 19:18:50.144039 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:50.144017 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnh4q"] Apr 28 19:18:50.146421 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:18:50.146393 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fa50d25_bdbc_446f_a7b9_de3696866c07.slice/crio-ce036cbff246c22e145c8ede439e09f5e815aa2a0c3f3ccf902497228752a697 WatchSource:0}: Error finding container ce036cbff246c22e145c8ede439e09f5e815aa2a0c3f3ccf902497228752a697: Status 404 returned error can't find the container with id ce036cbff246c22e145c8ede439e09f5e815aa2a0c3f3ccf902497228752a697 Apr 28 19:18:50.564727 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:50.564689 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnh4q" event={"ID":"2fa50d25-bdbc-446f-a7b9-de3696866c07","Type":"ContainerStarted","Data":"ce036cbff246c22e145c8ede439e09f5e815aa2a0c3f3ccf902497228752a697"} Apr 28 19:18:52.571135 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:52.571094 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnh4q" event={"ID":"2fa50d25-bdbc-446f-a7b9-de3696866c07","Type":"ContainerStarted","Data":"60b2dee7fc5e3b62874e03965bed62485628021664851814083e6e0e1ee59919"} Apr 28 19:18:52.604754 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:52.604700 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnh4q" podStartSLOduration=1.794759235 podStartE2EDuration="3.604684538s" podCreationTimestamp="2026-04-28 19:18:49 +0000 UTC" firstStartedPulling="2026-04-28 19:18:50.148238878 +0000 UTC m=+149.575739473" lastFinishedPulling="2026-04-28 19:18:51.958164177 +0000 UTC m=+151.385664776" observedRunningTime="2026-04-28 19:18:52.604002408 +0000 UTC m=+152.031503046" watchObservedRunningTime="2026-04-28 19:18:52.604684538 +0000 UTC m=+152.032185158" Apr 28 19:18:56.219510 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:56.219482 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9rm7p_eccda02a-118f-4d0d-858f-6d03050f92de/dns-node-resolver/0.log" Apr 28 19:18:56.506234 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:18:56.506108 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-7826r" podUID="f72b413b-cc77-44a9-9bab-db02d985886b" Apr 28 19:18:56.520341 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:18:56.520307 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-wl2q4" podUID="e361f8d4-ac75-4139-b8c7-fada2556f305" Apr 28 19:18:56.556161 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:56.556133 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-n66w9"] Apr 28 19:18:56.559083 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:56.559068 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-n66w9" Apr 28 19:18:56.561849 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:56.561830 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 28 19:18:56.562121 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:56.562107 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 28 19:18:56.562951 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:56.562934 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 28 19:18:56.563048 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:56.562936 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 28 19:18:56.563237 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:56.563223 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-fkgcw\"" Apr 28 19:18:56.565395 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:56.565253 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9gx6\" (UniqueName: \"kubernetes.io/projected/bd10fa57-ea8e-48b4-9997-bb141880e38d-kube-api-access-q9gx6\") pod \"service-ca-865cb79987-n66w9\" (UID: \"bd10fa57-ea8e-48b4-9997-bb141880e38d\") " pod="openshift-service-ca/service-ca-865cb79987-n66w9" Apr 28 19:18:56.565395 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:56.565326 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/bd10fa57-ea8e-48b4-9997-bb141880e38d-signing-cabundle\") pod \"service-ca-865cb79987-n66w9\" (UID: \"bd10fa57-ea8e-48b4-9997-bb141880e38d\") " 
pod="openshift-service-ca/service-ca-865cb79987-n66w9" Apr 28 19:18:56.565520 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:56.565406 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/bd10fa57-ea8e-48b4-9997-bb141880e38d-signing-key\") pod \"service-ca-865cb79987-n66w9\" (UID: \"bd10fa57-ea8e-48b4-9997-bb141880e38d\") " pod="openshift-service-ca/service-ca-865cb79987-n66w9" Apr 28 19:18:56.570934 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:56.570912 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-n66w9"] Apr 28 19:18:56.584150 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:56.584126 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wl2q4" Apr 28 19:18:56.584289 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:56.584154 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-7826r" Apr 28 19:18:56.666738 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:56.666705 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/bd10fa57-ea8e-48b4-9997-bb141880e38d-signing-cabundle\") pod \"service-ca-865cb79987-n66w9\" (UID: \"bd10fa57-ea8e-48b4-9997-bb141880e38d\") " pod="openshift-service-ca/service-ca-865cb79987-n66w9" Apr 28 19:18:56.666922 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:56.666754 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/bd10fa57-ea8e-48b4-9997-bb141880e38d-signing-key\") pod \"service-ca-865cb79987-n66w9\" (UID: \"bd10fa57-ea8e-48b4-9997-bb141880e38d\") " pod="openshift-service-ca/service-ca-865cb79987-n66w9" Apr 28 19:18:56.666922 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:56.666850 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q9gx6\" (UniqueName: \"kubernetes.io/projected/bd10fa57-ea8e-48b4-9997-bb141880e38d-kube-api-access-q9gx6\") pod \"service-ca-865cb79987-n66w9\" (UID: \"bd10fa57-ea8e-48b4-9997-bb141880e38d\") " pod="openshift-service-ca/service-ca-865cb79987-n66w9" Apr 28 19:18:56.667440 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:56.667420 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/bd10fa57-ea8e-48b4-9997-bb141880e38d-signing-cabundle\") pod \"service-ca-865cb79987-n66w9\" (UID: \"bd10fa57-ea8e-48b4-9997-bb141880e38d\") " pod="openshift-service-ca/service-ca-865cb79987-n66w9" Apr 28 19:18:56.669160 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:56.669142 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/bd10fa57-ea8e-48b4-9997-bb141880e38d-signing-key\") pod 
\"service-ca-865cb79987-n66w9\" (UID: \"bd10fa57-ea8e-48b4-9997-bb141880e38d\") " pod="openshift-service-ca/service-ca-865cb79987-n66w9" Apr 28 19:18:56.675727 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:56.675704 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9gx6\" (UniqueName: \"kubernetes.io/projected/bd10fa57-ea8e-48b4-9997-bb141880e38d-kube-api-access-q9gx6\") pod \"service-ca-865cb79987-n66w9\" (UID: \"bd10fa57-ea8e-48b4-9997-bb141880e38d\") " pod="openshift-service-ca/service-ca-865cb79987-n66w9" Apr 28 19:18:56.867766 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:56.867729 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-n66w9" Apr 28 19:18:56.982114 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:56.982081 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-n66w9"] Apr 28 19:18:56.985692 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:18:56.985658 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd10fa57_ea8e_48b4_9997_bb141880e38d.slice/crio-5167256e65a4d995d4825077d2fb154d76cfa2f4670400150358cbf3ed1e9120 WatchSource:0}: Error finding container 5167256e65a4d995d4825077d2fb154d76cfa2f4670400150358cbf3ed1e9120: Status 404 returned error can't find the container with id 5167256e65a4d995d4825077d2fb154d76cfa2f4670400150358cbf3ed1e9120 Apr 28 19:18:57.222666 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:57.222569 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4clws_f5dbdaee-eba8-485f-88bf-e781ddd5de4d/node-ca/0.log" Apr 28 19:18:57.588074 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:57.587989 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-n66w9" 
event={"ID":"bd10fa57-ea8e-48b4-9997-bb141880e38d","Type":"ContainerStarted","Data":"5167256e65a4d995d4825077d2fb154d76cfa2f4670400150358cbf3ed1e9120"} Apr 28 19:18:58.183658 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:18:58.183610 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-ggtvc" podUID="0a2d2c94-3731-45d6-be85-14a1a081468a" Apr 28 19:18:59.594140 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:59.594103 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-n66w9" event={"ID":"bd10fa57-ea8e-48b4-9997-bb141880e38d","Type":"ContainerStarted","Data":"ad09fa3bbafb9adb122326c22aa4d322b996eeed14251d532b634a3ff1eff16b"} Apr 28 19:18:59.612312 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:18:59.612266 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-n66w9" podStartSLOduration=1.812527759 podStartE2EDuration="3.612251964s" podCreationTimestamp="2026-04-28 19:18:56 +0000 UTC" firstStartedPulling="2026-04-28 19:18:56.987638048 +0000 UTC m=+156.415138642" lastFinishedPulling="2026-04-28 19:18:58.787362252 +0000 UTC m=+158.214862847" observedRunningTime="2026-04-28 19:18:59.611872153 +0000 UTC m=+159.039372771" watchObservedRunningTime="2026-04-28 19:18:59.612251964 +0000 UTC m=+159.039752626" Apr 28 19:19:01.402795 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:01.402764 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f72b413b-cc77-44a9-9bab-db02d985886b-metrics-tls\") pod \"dns-default-7826r\" (UID: \"f72b413b-cc77-44a9-9bab-db02d985886b\") " pod="openshift-dns/dns-default-7826r" Apr 28 19:19:01.403287 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:01.402811 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e361f8d4-ac75-4139-b8c7-fada2556f305-cert\") pod \"ingress-canary-wl2q4\" (UID: \"e361f8d4-ac75-4139-b8c7-fada2556f305\") " pod="openshift-ingress-canary/ingress-canary-wl2q4" Apr 28 19:19:01.403287 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:19:01.402852 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 28 19:19:01.403287 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:19:01.402909 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 28 19:19:01.403287 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:19:01.402916 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f72b413b-cc77-44a9-9bab-db02d985886b-metrics-tls podName:f72b413b-cc77-44a9-9bab-db02d985886b nodeName:}" failed. No retries permitted until 2026-04-28 19:21:03.402899692 +0000 UTC m=+282.830400291 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f72b413b-cc77-44a9-9bab-db02d985886b-metrics-tls") pod "dns-default-7826r" (UID: "f72b413b-cc77-44a9-9bab-db02d985886b") : secret "dns-default-metrics-tls" not found Apr 28 19:19:01.403287 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:19:01.402971 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e361f8d4-ac75-4139-b8c7-fada2556f305-cert podName:e361f8d4-ac75-4139-b8c7-fada2556f305 nodeName:}" failed. No retries permitted until 2026-04-28 19:21:03.402956123 +0000 UTC m=+282.830456732 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e361f8d4-ac75-4139-b8c7-fada2556f305-cert") pod "ingress-canary-wl2q4" (UID: "e361f8d4-ac75-4139-b8c7-fada2556f305") : secret "canary-serving-cert" not found Apr 28 19:19:13.163228 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:13.163130 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ggtvc" Apr 28 19:19:17.618338 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:17.618304 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-gtt2h"] Apr 28 19:19:17.621164 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:17.621147 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gtt2h" Apr 28 19:19:17.623423 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:17.623402 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 28 19:19:17.623566 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:17.623482 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 28 19:19:17.624496 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:17.624480 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-v5snm\"" Apr 28 19:19:17.624591 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:17.624505 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 28 19:19:17.624591 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:17.624565 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 28 19:19:17.648392 ip-10-0-140-230 
kubenswrapper[2572]: I0428 19:19:17.648361 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-gtt2h"] Apr 28 19:19:17.726980 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:17.726939 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27m9w\" (UniqueName: \"kubernetes.io/projected/5b4cd600-56e7-44e5-a9fb-8d0f722d19ec-kube-api-access-27m9w\") pod \"insights-runtime-extractor-gtt2h\" (UID: \"5b4cd600-56e7-44e5-a9fb-8d0f722d19ec\") " pod="openshift-insights/insights-runtime-extractor-gtt2h" Apr 28 19:19:17.727145 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:17.726993 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5b4cd600-56e7-44e5-a9fb-8d0f722d19ec-data-volume\") pod \"insights-runtime-extractor-gtt2h\" (UID: \"5b4cd600-56e7-44e5-a9fb-8d0f722d19ec\") " pod="openshift-insights/insights-runtime-extractor-gtt2h" Apr 28 19:19:17.727145 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:17.727054 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5b4cd600-56e7-44e5-a9fb-8d0f722d19ec-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gtt2h\" (UID: \"5b4cd600-56e7-44e5-a9fb-8d0f722d19ec\") " pod="openshift-insights/insights-runtime-extractor-gtt2h" Apr 28 19:19:17.727145 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:17.727089 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5b4cd600-56e7-44e5-a9fb-8d0f722d19ec-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gtt2h\" (UID: \"5b4cd600-56e7-44e5-a9fb-8d0f722d19ec\") " pod="openshift-insights/insights-runtime-extractor-gtt2h" Apr 28 
19:19:17.727275 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:17.727145 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5b4cd600-56e7-44e5-a9fb-8d0f722d19ec-crio-socket\") pod \"insights-runtime-extractor-gtt2h\" (UID: \"5b4cd600-56e7-44e5-a9fb-8d0f722d19ec\") " pod="openshift-insights/insights-runtime-extractor-gtt2h" Apr 28 19:19:17.827630 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:17.827595 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5b4cd600-56e7-44e5-a9fb-8d0f722d19ec-data-volume\") pod \"insights-runtime-extractor-gtt2h\" (UID: \"5b4cd600-56e7-44e5-a9fb-8d0f722d19ec\") " pod="openshift-insights/insights-runtime-extractor-gtt2h" Apr 28 19:19:17.827630 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:17.827629 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5b4cd600-56e7-44e5-a9fb-8d0f722d19ec-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gtt2h\" (UID: \"5b4cd600-56e7-44e5-a9fb-8d0f722d19ec\") " pod="openshift-insights/insights-runtime-extractor-gtt2h" Apr 28 19:19:17.827835 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:17.827655 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5b4cd600-56e7-44e5-a9fb-8d0f722d19ec-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gtt2h\" (UID: \"5b4cd600-56e7-44e5-a9fb-8d0f722d19ec\") " pod="openshift-insights/insights-runtime-extractor-gtt2h" Apr 28 19:19:17.827835 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:17.827769 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/5b4cd600-56e7-44e5-a9fb-8d0f722d19ec-crio-socket\") pod \"insights-runtime-extractor-gtt2h\" (UID: \"5b4cd600-56e7-44e5-a9fb-8d0f722d19ec\") " pod="openshift-insights/insights-runtime-extractor-gtt2h" Apr 28 19:19:17.827910 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:17.827840 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-27m9w\" (UniqueName: \"kubernetes.io/projected/5b4cd600-56e7-44e5-a9fb-8d0f722d19ec-kube-api-access-27m9w\") pod \"insights-runtime-extractor-gtt2h\" (UID: \"5b4cd600-56e7-44e5-a9fb-8d0f722d19ec\") " pod="openshift-insights/insights-runtime-extractor-gtt2h" Apr 28 19:19:17.827910 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:17.827885 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5b4cd600-56e7-44e5-a9fb-8d0f722d19ec-crio-socket\") pod \"insights-runtime-extractor-gtt2h\" (UID: \"5b4cd600-56e7-44e5-a9fb-8d0f722d19ec\") " pod="openshift-insights/insights-runtime-extractor-gtt2h" Apr 28 19:19:17.827976 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:17.827951 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5b4cd600-56e7-44e5-a9fb-8d0f722d19ec-data-volume\") pod \"insights-runtime-extractor-gtt2h\" (UID: \"5b4cd600-56e7-44e5-a9fb-8d0f722d19ec\") " pod="openshift-insights/insights-runtime-extractor-gtt2h" Apr 28 19:19:17.828870 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:17.828851 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5b4cd600-56e7-44e5-a9fb-8d0f722d19ec-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gtt2h\" (UID: \"5b4cd600-56e7-44e5-a9fb-8d0f722d19ec\") " pod="openshift-insights/insights-runtime-extractor-gtt2h" Apr 28 19:19:17.830703 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:17.830683 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5b4cd600-56e7-44e5-a9fb-8d0f722d19ec-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gtt2h\" (UID: \"5b4cd600-56e7-44e5-a9fb-8d0f722d19ec\") " pod="openshift-insights/insights-runtime-extractor-gtt2h" Apr 28 19:19:17.838157 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:17.838133 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-27m9w\" (UniqueName: \"kubernetes.io/projected/5b4cd600-56e7-44e5-a9fb-8d0f722d19ec-kube-api-access-27m9w\") pod \"insights-runtime-extractor-gtt2h\" (UID: \"5b4cd600-56e7-44e5-a9fb-8d0f722d19ec\") " pod="openshift-insights/insights-runtime-extractor-gtt2h" Apr 28 19:19:17.929175 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:17.929085 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gtt2h" Apr 28 19:19:18.049176 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:18.049144 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-gtt2h"] Apr 28 19:19:18.052428 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:19:18.052400 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b4cd600_56e7_44e5_a9fb_8d0f722d19ec.slice/crio-97e9d1acf30e31f42e8911e5d174c56e417629a535aa3a66cbd2a082751b8a66 WatchSource:0}: Error finding container 97e9d1acf30e31f42e8911e5d174c56e417629a535aa3a66cbd2a082751b8a66: Status 404 returned error can't find the container with id 97e9d1acf30e31f42e8911e5d174c56e417629a535aa3a66cbd2a082751b8a66 Apr 28 19:19:18.641615 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:18.641589 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gtt2h" 
event={"ID":"5b4cd600-56e7-44e5-a9fb-8d0f722d19ec","Type":"ContainerStarted","Data":"65f9d191fa5d94c40684b153f49dab653d8ce4cd61e4086d3e428bd2505edc5e"} Apr 28 19:19:18.641930 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:18.641666 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gtt2h" event={"ID":"5b4cd600-56e7-44e5-a9fb-8d0f722d19ec","Type":"ContainerStarted","Data":"97e9d1acf30e31f42e8911e5d174c56e417629a535aa3a66cbd2a082751b8a66"} Apr 28 19:19:19.645509 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:19.645459 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gtt2h" event={"ID":"5b4cd600-56e7-44e5-a9fb-8d0f722d19ec","Type":"ContainerStarted","Data":"0809191db97a968df03e59e7134e4c3d3e1514fdc7f9ae4480f00ec1228c6507"} Apr 28 19:19:20.649804 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:20.649760 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gtt2h" event={"ID":"5b4cd600-56e7-44e5-a9fb-8d0f722d19ec","Type":"ContainerStarted","Data":"eba3758ec93d193a2b2107f287503f10e058609e9a445bf7c3dc069c0b219807"} Apr 28 19:19:20.679743 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:20.679690 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-gtt2h" podStartSLOduration=1.8330128060000002 podStartE2EDuration="3.679675914s" podCreationTimestamp="2026-04-28 19:19:17 +0000 UTC" firstStartedPulling="2026-04-28 19:19:18.119333742 +0000 UTC m=+177.546834340" lastFinishedPulling="2026-04-28 19:19:19.965996849 +0000 UTC m=+179.393497448" observedRunningTime="2026-04-28 19:19:20.678850414 +0000 UTC m=+180.106351031" watchObservedRunningTime="2026-04-28 19:19:20.679675914 +0000 UTC m=+180.107176530" Apr 28 19:19:24.587722 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:24.587688 2572 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-v56c9"] Apr 28 19:19:24.590601 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:24.590584 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-v56c9" Apr 28 19:19:24.592998 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:24.592974 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 28 19:19:24.593134 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:24.593066 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-vlqnt\"" Apr 28 19:19:24.603330 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:24.603308 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-v56c9"] Apr 28 19:19:24.680039 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:24.679999 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/323061f0-0771-44a0-8824-36b3911ce376-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-v56c9\" (UID: \"323061f0-0771-44a0-8824-36b3911ce376\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-v56c9" Apr 28 19:19:24.780644 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:24.780606 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/323061f0-0771-44a0-8824-36b3911ce376-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-v56c9\" (UID: \"323061f0-0771-44a0-8824-36b3911ce376\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-v56c9" Apr 28 
19:19:24.783526 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:24.783507 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/323061f0-0771-44a0-8824-36b3911ce376-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-v56c9\" (UID: \"323061f0-0771-44a0-8824-36b3911ce376\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-v56c9" Apr 28 19:19:24.899415 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:24.899368 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-v56c9" Apr 28 19:19:25.025808 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:25.025777 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-v56c9"] Apr 28 19:19:25.028340 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:19:25.028313 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod323061f0_0771_44a0_8824_36b3911ce376.slice/crio-87df55496d77dd6f97c80b64992e363a38c116d9e75e250b68f20c8ea53a36f7 WatchSource:0}: Error finding container 87df55496d77dd6f97c80b64992e363a38c116d9e75e250b68f20c8ea53a36f7: Status 404 returned error can't find the container with id 87df55496d77dd6f97c80b64992e363a38c116d9e75e250b68f20c8ea53a36f7 Apr 28 19:19:25.663687 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:25.663643 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-v56c9" event={"ID":"323061f0-0771-44a0-8824-36b3911ce376","Type":"ContainerStarted","Data":"87df55496d77dd6f97c80b64992e363a38c116d9e75e250b68f20c8ea53a36f7"} Apr 28 19:19:26.667191 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:26.667157 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-v56c9" event={"ID":"323061f0-0771-44a0-8824-36b3911ce376","Type":"ContainerStarted","Data":"6cca82db85781f73488f943f7717d7a9bba6e6e5dadc757f1190c820a9cf2de2"} Apr 28 19:19:26.667571 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:26.667360 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-v56c9" Apr 28 19:19:26.671956 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:26.671933 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-v56c9" Apr 28 19:19:26.691671 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:26.691628 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-v56c9" podStartSLOduration=1.687479437 podStartE2EDuration="2.691613877s" podCreationTimestamp="2026-04-28 19:19:24 +0000 UTC" firstStartedPulling="2026-04-28 19:19:25.030046357 +0000 UTC m=+184.457546953" lastFinishedPulling="2026-04-28 19:19:26.034180783 +0000 UTC m=+185.461681393" observedRunningTime="2026-04-28 19:19:26.691503375 +0000 UTC m=+186.119003991" watchObservedRunningTime="2026-04-28 19:19:26.691613877 +0000 UTC m=+186.119114495" Apr 28 19:19:27.679912 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:27.679880 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-kw5gw"] Apr 28 19:19:27.682794 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:27.682778 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-kw5gw" Apr 28 19:19:27.685647 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:27.685623 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 28 19:19:27.685647 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:27.685643 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 28 19:19:27.687451 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:27.687432 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-2gqbl\"" Apr 28 19:19:27.687554 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:27.687452 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 28 19:19:27.687554 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:27.687432 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 28 19:19:27.687721 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:27.687704 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 28 19:19:27.696010 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:27.695986 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-kw5gw"] Apr 28 19:19:27.803076 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:27.803038 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/872a84d5-7590-4826-b51a-d7ad968aa650-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-kw5gw\" (UID: \"872a84d5-7590-4826-b51a-d7ad968aa650\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-kw5gw" Apr 28 19:19:27.803299 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:27.803119 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/872a84d5-7590-4826-b51a-d7ad968aa650-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-kw5gw\" (UID: \"872a84d5-7590-4826-b51a-d7ad968aa650\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kw5gw" Apr 28 19:19:27.803299 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:27.803157 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/872a84d5-7590-4826-b51a-d7ad968aa650-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-kw5gw\" (UID: \"872a84d5-7590-4826-b51a-d7ad968aa650\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kw5gw" Apr 28 19:19:27.803394 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:27.803306 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc7hf\" (UniqueName: \"kubernetes.io/projected/872a84d5-7590-4826-b51a-d7ad968aa650-kube-api-access-wc7hf\") pod \"prometheus-operator-5676c8c784-kw5gw\" (UID: \"872a84d5-7590-4826-b51a-d7ad968aa650\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kw5gw" Apr 28 19:19:27.904026 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:27.903988 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/872a84d5-7590-4826-b51a-d7ad968aa650-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-kw5gw\" (UID: \"872a84d5-7590-4826-b51a-d7ad968aa650\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kw5gw" Apr 28 19:19:27.904262 ip-10-0-140-230 
kubenswrapper[2572]: I0428 19:19:27.904040 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/872a84d5-7590-4826-b51a-d7ad968aa650-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-kw5gw\" (UID: \"872a84d5-7590-4826-b51a-d7ad968aa650\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kw5gw" Apr 28 19:19:27.904262 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:27.904138 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/872a84d5-7590-4826-b51a-d7ad968aa650-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-kw5gw\" (UID: \"872a84d5-7590-4826-b51a-d7ad968aa650\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kw5gw" Apr 28 19:19:27.904262 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:27.904191 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wc7hf\" (UniqueName: \"kubernetes.io/projected/872a84d5-7590-4826-b51a-d7ad968aa650-kube-api-access-wc7hf\") pod \"prometheus-operator-5676c8c784-kw5gw\" (UID: \"872a84d5-7590-4826-b51a-d7ad968aa650\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kw5gw" Apr 28 19:19:27.904752 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:27.904729 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/872a84d5-7590-4826-b51a-d7ad968aa650-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-kw5gw\" (UID: \"872a84d5-7590-4826-b51a-d7ad968aa650\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kw5gw" Apr 28 19:19:27.906729 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:27.906707 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/872a84d5-7590-4826-b51a-d7ad968aa650-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-kw5gw\" (UID: \"872a84d5-7590-4826-b51a-d7ad968aa650\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kw5gw" Apr 28 19:19:27.906795 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:27.906717 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/872a84d5-7590-4826-b51a-d7ad968aa650-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-kw5gw\" (UID: \"872a84d5-7590-4826-b51a-d7ad968aa650\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kw5gw" Apr 28 19:19:27.914360 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:27.914335 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc7hf\" (UniqueName: \"kubernetes.io/projected/872a84d5-7590-4826-b51a-d7ad968aa650-kube-api-access-wc7hf\") pod \"prometheus-operator-5676c8c784-kw5gw\" (UID: \"872a84d5-7590-4826-b51a-d7ad968aa650\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kw5gw" Apr 28 19:19:27.991925 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:27.991832 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-kw5gw" Apr 28 19:19:28.117539 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:28.117507 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-kw5gw"] Apr 28 19:19:28.120334 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:19:28.120311 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod872a84d5_7590_4826_b51a_d7ad968aa650.slice/crio-dad3ef43a88ad718cbc1a550c7313f97ff609886cf72a98638f54a29bd33cf45 WatchSource:0}: Error finding container dad3ef43a88ad718cbc1a550c7313f97ff609886cf72a98638f54a29bd33cf45: Status 404 returned error can't find the container with id dad3ef43a88ad718cbc1a550c7313f97ff609886cf72a98638f54a29bd33cf45 Apr 28 19:19:28.672563 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:28.672525 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-kw5gw" event={"ID":"872a84d5-7590-4826-b51a-d7ad968aa650","Type":"ContainerStarted","Data":"dad3ef43a88ad718cbc1a550c7313f97ff609886cf72a98638f54a29bd33cf45"} Apr 28 19:19:29.677241 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:29.677192 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-kw5gw" event={"ID":"872a84d5-7590-4826-b51a-d7ad968aa650","Type":"ContainerStarted","Data":"9edb44c14b3ed1f67942cea81e1fdbfa20f03ee40f979f14bd98eadbfadd765e"} Apr 28 19:19:29.677241 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:29.677245 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-kw5gw" event={"ID":"872a84d5-7590-4826-b51a-d7ad968aa650","Type":"ContainerStarted","Data":"41db52741f2788d83b0674ef6972938239adc3e65426e7231e2dfb6570939615"} Apr 28 19:19:29.717577 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:29.717466 2572 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-kw5gw" podStartSLOduration=1.393614808 podStartE2EDuration="2.717449402s" podCreationTimestamp="2026-04-28 19:19:27 +0000 UTC" firstStartedPulling="2026-04-28 19:19:28.122121201 +0000 UTC m=+187.549621796" lastFinishedPulling="2026-04-28 19:19:29.445955792 +0000 UTC m=+188.873456390" observedRunningTime="2026-04-28 19:19:29.716484589 +0000 UTC m=+189.143985210" watchObservedRunningTime="2026-04-28 19:19:29.717449402 +0000 UTC m=+189.144950018" Apr 28 19:19:32.061511 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.061471 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-kwqdl"] Apr 28 19:19:32.064574 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.064555 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kwqdl" Apr 28 19:19:32.068047 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.068020 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 28 19:19:32.068233 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.068195 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-d6t6b\"" Apr 28 19:19:32.068803 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.068786 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 28 19:19:32.081271 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.081246 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-kwqdl"] Apr 28 19:19:32.117159 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.117126 2572 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-746l7"] Apr 28 19:19:32.119841 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.119818 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-746l7" Apr 28 19:19:32.122126 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.122089 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 28 19:19:32.122304 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.122286 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-4l8px\"" Apr 28 19:19:32.122893 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.122875 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 28 19:19:32.122990 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.122902 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 28 19:19:32.139282 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.139239 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ee133af6-93fa-4948-8dec-4878fa4c34b4-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-kwqdl\" (UID: \"ee133af6-93fa-4948-8dec-4878fa4c34b4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kwqdl" Apr 28 19:19:32.139439 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.139375 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ee133af6-93fa-4948-8dec-4878fa4c34b4-openshift-state-metrics-kube-rbac-proxy-config\") 
pod \"openshift-state-metrics-9d44df66c-kwqdl\" (UID: \"ee133af6-93fa-4948-8dec-4878fa4c34b4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kwqdl" Apr 28 19:19:32.139439 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.139433 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2xgb\" (UniqueName: \"kubernetes.io/projected/ee133af6-93fa-4948-8dec-4878fa4c34b4-kube-api-access-n2xgb\") pod \"openshift-state-metrics-9d44df66c-kwqdl\" (UID: \"ee133af6-93fa-4948-8dec-4878fa4c34b4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kwqdl" Apr 28 19:19:32.139557 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.139467 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ee133af6-93fa-4948-8dec-4878fa4c34b4-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-kwqdl\" (UID: \"ee133af6-93fa-4948-8dec-4878fa4c34b4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kwqdl" Apr 28 19:19:32.240385 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.240340 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1b52ffce-09df-44f4-8b49-23d1c579a285-node-exporter-tls\") pod \"node-exporter-746l7\" (UID: \"1b52ffce-09df-44f4-8b49-23d1c579a285\") " pod="openshift-monitoring/node-exporter-746l7" Apr 28 19:19:32.240566 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.240395 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1b52ffce-09df-44f4-8b49-23d1c579a285-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-746l7\" (UID: \"1b52ffce-09df-44f4-8b49-23d1c579a285\") " 
pod="openshift-monitoring/node-exporter-746l7" Apr 28 19:19:32.240566 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.240454 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ee133af6-93fa-4948-8dec-4878fa4c34b4-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-kwqdl\" (UID: \"ee133af6-93fa-4948-8dec-4878fa4c34b4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kwqdl" Apr 28 19:19:32.240566 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.240500 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n2xgb\" (UniqueName: \"kubernetes.io/projected/ee133af6-93fa-4948-8dec-4878fa4c34b4-kube-api-access-n2xgb\") pod \"openshift-state-metrics-9d44df66c-kwqdl\" (UID: \"ee133af6-93fa-4948-8dec-4878fa4c34b4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kwqdl" Apr 28 19:19:32.240566 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.240527 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1b52ffce-09df-44f4-8b49-23d1c579a285-root\") pod \"node-exporter-746l7\" (UID: \"1b52ffce-09df-44f4-8b49-23d1c579a285\") " pod="openshift-monitoring/node-exporter-746l7" Apr 28 19:19:32.240566 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.240558 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ee133af6-93fa-4948-8dec-4878fa4c34b4-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-kwqdl\" (UID: \"ee133af6-93fa-4948-8dec-4878fa4c34b4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kwqdl" Apr 28 19:19:32.240850 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.240585 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1b52ffce-09df-44f4-8b49-23d1c579a285-sys\") pod \"node-exporter-746l7\" (UID: \"1b52ffce-09df-44f4-8b49-23d1c579a285\") " pod="openshift-monitoring/node-exporter-746l7" Apr 28 19:19:32.240850 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.240634 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1b52ffce-09df-44f4-8b49-23d1c579a285-metrics-client-ca\") pod \"node-exporter-746l7\" (UID: \"1b52ffce-09df-44f4-8b49-23d1c579a285\") " pod="openshift-monitoring/node-exporter-746l7" Apr 28 19:19:32.240850 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.240675 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1b52ffce-09df-44f4-8b49-23d1c579a285-node-exporter-accelerators-collector-config\") pod \"node-exporter-746l7\" (UID: \"1b52ffce-09df-44f4-8b49-23d1c579a285\") " pod="openshift-monitoring/node-exporter-746l7" Apr 28 19:19:32.240850 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.240707 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1b52ffce-09df-44f4-8b49-23d1c579a285-node-exporter-textfile\") pod \"node-exporter-746l7\" (UID: \"1b52ffce-09df-44f4-8b49-23d1c579a285\") " pod="openshift-monitoring/node-exporter-746l7" Apr 28 19:19:32.240850 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.240770 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4jjg\" (UniqueName: \"kubernetes.io/projected/1b52ffce-09df-44f4-8b49-23d1c579a285-kube-api-access-v4jjg\") pod \"node-exporter-746l7\" (UID: 
\"1b52ffce-09df-44f4-8b49-23d1c579a285\") " pod="openshift-monitoring/node-exporter-746l7" Apr 28 19:19:32.240850 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.240822 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ee133af6-93fa-4948-8dec-4878fa4c34b4-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-kwqdl\" (UID: \"ee133af6-93fa-4948-8dec-4878fa4c34b4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kwqdl" Apr 28 19:19:32.241151 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.240872 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1b52ffce-09df-44f4-8b49-23d1c579a285-node-exporter-wtmp\") pod \"node-exporter-746l7\" (UID: \"1b52ffce-09df-44f4-8b49-23d1c579a285\") " pod="openshift-monitoring/node-exporter-746l7" Apr 28 19:19:32.241510 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.241487 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ee133af6-93fa-4948-8dec-4878fa4c34b4-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-kwqdl\" (UID: \"ee133af6-93fa-4948-8dec-4878fa4c34b4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kwqdl" Apr 28 19:19:32.242984 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.242961 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ee133af6-93fa-4948-8dec-4878fa4c34b4-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-kwqdl\" (UID: \"ee133af6-93fa-4948-8dec-4878fa4c34b4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kwqdl" Apr 28 19:19:32.243313 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.243292 2572 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ee133af6-93fa-4948-8dec-4878fa4c34b4-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-kwqdl\" (UID: \"ee133af6-93fa-4948-8dec-4878fa4c34b4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kwqdl" Apr 28 19:19:32.252542 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.252510 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2xgb\" (UniqueName: \"kubernetes.io/projected/ee133af6-93fa-4948-8dec-4878fa4c34b4-kube-api-access-n2xgb\") pod \"openshift-state-metrics-9d44df66c-kwqdl\" (UID: \"ee133af6-93fa-4948-8dec-4878fa4c34b4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kwqdl" Apr 28 19:19:32.342269 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.342164 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1b52ffce-09df-44f4-8b49-23d1c579a285-node-exporter-tls\") pod \"node-exporter-746l7\" (UID: \"1b52ffce-09df-44f4-8b49-23d1c579a285\") " pod="openshift-monitoring/node-exporter-746l7" Apr 28 19:19:32.342269 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.342223 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1b52ffce-09df-44f4-8b49-23d1c579a285-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-746l7\" (UID: \"1b52ffce-09df-44f4-8b49-23d1c579a285\") " pod="openshift-monitoring/node-exporter-746l7" Apr 28 19:19:32.342269 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.342265 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1b52ffce-09df-44f4-8b49-23d1c579a285-root\") pod \"node-exporter-746l7\" (UID: \"1b52ffce-09df-44f4-8b49-23d1c579a285\") " 
pod="openshift-monitoring/node-exporter-746l7" Apr 28 19:19:32.342550 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.342284 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1b52ffce-09df-44f4-8b49-23d1c579a285-sys\") pod \"node-exporter-746l7\" (UID: \"1b52ffce-09df-44f4-8b49-23d1c579a285\") " pod="openshift-monitoring/node-exporter-746l7" Apr 28 19:19:32.342550 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.342301 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1b52ffce-09df-44f4-8b49-23d1c579a285-metrics-client-ca\") pod \"node-exporter-746l7\" (UID: \"1b52ffce-09df-44f4-8b49-23d1c579a285\") " pod="openshift-monitoring/node-exporter-746l7" Apr 28 19:19:32.342550 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.342317 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1b52ffce-09df-44f4-8b49-23d1c579a285-node-exporter-accelerators-collector-config\") pod \"node-exporter-746l7\" (UID: \"1b52ffce-09df-44f4-8b49-23d1c579a285\") " pod="openshift-monitoring/node-exporter-746l7" Apr 28 19:19:32.342550 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.342336 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1b52ffce-09df-44f4-8b49-23d1c579a285-node-exporter-textfile\") pod \"node-exporter-746l7\" (UID: \"1b52ffce-09df-44f4-8b49-23d1c579a285\") " pod="openshift-monitoring/node-exporter-746l7" Apr 28 19:19:32.342550 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:19:32.342342 2572 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 28 19:19:32.342550 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.342367 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1b52ffce-09df-44f4-8b49-23d1c579a285-root\") pod \"node-exporter-746l7\" (UID: \"1b52ffce-09df-44f4-8b49-23d1c579a285\") " pod="openshift-monitoring/node-exporter-746l7" Apr 28 19:19:32.342550 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.342374 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v4jjg\" (UniqueName: \"kubernetes.io/projected/1b52ffce-09df-44f4-8b49-23d1c579a285-kube-api-access-v4jjg\") pod \"node-exporter-746l7\" (UID: \"1b52ffce-09df-44f4-8b49-23d1c579a285\") " pod="openshift-monitoring/node-exporter-746l7" Apr 28 19:19:32.342550 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.342374 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1b52ffce-09df-44f4-8b49-23d1c579a285-sys\") pod \"node-exporter-746l7\" (UID: \"1b52ffce-09df-44f4-8b49-23d1c579a285\") " pod="openshift-monitoring/node-exporter-746l7" Apr 28 19:19:32.342550 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:19:32.342418 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b52ffce-09df-44f4-8b49-23d1c579a285-node-exporter-tls podName:1b52ffce-09df-44f4-8b49-23d1c579a285 nodeName:}" failed. No retries permitted until 2026-04-28 19:19:32.842395345 +0000 UTC m=+192.269895945 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/1b52ffce-09df-44f4-8b49-23d1c579a285-node-exporter-tls") pod "node-exporter-746l7" (UID: "1b52ffce-09df-44f4-8b49-23d1c579a285") : secret "node-exporter-tls" not found Apr 28 19:19:32.342920 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.342616 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1b52ffce-09df-44f4-8b49-23d1c579a285-node-exporter-wtmp\") pod \"node-exporter-746l7\" (UID: \"1b52ffce-09df-44f4-8b49-23d1c579a285\") " pod="openshift-monitoring/node-exporter-746l7" Apr 28 19:19:32.342920 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.342757 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1b52ffce-09df-44f4-8b49-23d1c579a285-node-exporter-textfile\") pod \"node-exporter-746l7\" (UID: \"1b52ffce-09df-44f4-8b49-23d1c579a285\") " pod="openshift-monitoring/node-exporter-746l7" Apr 28 19:19:32.342920 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.342784 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1b52ffce-09df-44f4-8b49-23d1c579a285-node-exporter-wtmp\") pod \"node-exporter-746l7\" (UID: \"1b52ffce-09df-44f4-8b49-23d1c579a285\") " pod="openshift-monitoring/node-exporter-746l7" Apr 28 19:19:32.343143 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.343120 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1b52ffce-09df-44f4-8b49-23d1c579a285-node-exporter-accelerators-collector-config\") pod \"node-exporter-746l7\" (UID: \"1b52ffce-09df-44f4-8b49-23d1c579a285\") " pod="openshift-monitoring/node-exporter-746l7" Apr 28 19:19:32.343482 ip-10-0-140-230 kubenswrapper[2572]: 
I0428 19:19:32.343464 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1b52ffce-09df-44f4-8b49-23d1c579a285-metrics-client-ca\") pod \"node-exporter-746l7\" (UID: \"1b52ffce-09df-44f4-8b49-23d1c579a285\") " pod="openshift-monitoring/node-exporter-746l7" Apr 28 19:19:32.344604 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.344582 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1b52ffce-09df-44f4-8b49-23d1c579a285-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-746l7\" (UID: \"1b52ffce-09df-44f4-8b49-23d1c579a285\") " pod="openshift-monitoring/node-exporter-746l7" Apr 28 19:19:32.354255 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.354182 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4jjg\" (UniqueName: \"kubernetes.io/projected/1b52ffce-09df-44f4-8b49-23d1c579a285-kube-api-access-v4jjg\") pod \"node-exporter-746l7\" (UID: \"1b52ffce-09df-44f4-8b49-23d1c579a285\") " pod="openshift-monitoring/node-exporter-746l7" Apr 28 19:19:32.374677 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.374643 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kwqdl" Apr 28 19:19:32.511688 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.511652 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-kwqdl"] Apr 28 19:19:32.521606 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:19:32.521562 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee133af6_93fa_4948_8dec_4878fa4c34b4.slice/crio-80063e0cfc6f195cd244e7413349fffd5afe9bb2699421e553c5ba79f791a422 WatchSource:0}: Error finding container 80063e0cfc6f195cd244e7413349fffd5afe9bb2699421e553c5ba79f791a422: Status 404 returned error can't find the container with id 80063e0cfc6f195cd244e7413349fffd5afe9bb2699421e553c5ba79f791a422 Apr 28 19:19:32.686717 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.686682 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kwqdl" event={"ID":"ee133af6-93fa-4948-8dec-4878fa4c34b4","Type":"ContainerStarted","Data":"ab84c20201782cd0a3c2930c8031cfb49b0e1ec7e042de25c7675fae6e53e585"} Apr 28 19:19:32.686717 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.686719 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kwqdl" event={"ID":"ee133af6-93fa-4948-8dec-4878fa4c34b4","Type":"ContainerStarted","Data":"9bad07b89d1d5f7afc2d33bbc699c5938e9f0b35130c35da147703e351a7299f"} Apr 28 19:19:32.686905 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.686729 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kwqdl" event={"ID":"ee133af6-93fa-4948-8dec-4878fa4c34b4","Type":"ContainerStarted","Data":"80063e0cfc6f195cd244e7413349fffd5afe9bb2699421e553c5ba79f791a422"} Apr 28 19:19:32.846977 ip-10-0-140-230 kubenswrapper[2572]: I0428 
19:19:32.846891 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1b52ffce-09df-44f4-8b49-23d1c579a285-node-exporter-tls\") pod \"node-exporter-746l7\" (UID: \"1b52ffce-09df-44f4-8b49-23d1c579a285\") " pod="openshift-monitoring/node-exporter-746l7" Apr 28 19:19:32.849174 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:32.849153 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1b52ffce-09df-44f4-8b49-23d1c579a285-node-exporter-tls\") pod \"node-exporter-746l7\" (UID: \"1b52ffce-09df-44f4-8b49-23d1c579a285\") " pod="openshift-monitoring/node-exporter-746l7" Apr 28 19:19:33.030575 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:33.030540 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-746l7" Apr 28 19:19:33.038277 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:19:33.038249 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b52ffce_09df_44f4_8b49_23d1c579a285.slice/crio-26629ea04cf16928210711c3bfb2376d24226f0a7b51675a0981da130f73478d WatchSource:0}: Error finding container 26629ea04cf16928210711c3bfb2376d24226f0a7b51675a0981da130f73478d: Status 404 returned error can't find the container with id 26629ea04cf16928210711c3bfb2376d24226f0a7b51675a0981da130f73478d Apr 28 19:19:33.690978 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:33.690943 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-746l7" event={"ID":"1b52ffce-09df-44f4-8b49-23d1c579a285","Type":"ContainerStarted","Data":"26629ea04cf16928210711c3bfb2376d24226f0a7b51675a0981da130f73478d"} Apr 28 19:19:33.692709 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:33.692687 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kwqdl" event={"ID":"ee133af6-93fa-4948-8dec-4878fa4c34b4","Type":"ContainerStarted","Data":"af914dacc01c933a7065aa894e63ce23d7f93594238bdd0fc53ba189ba840b5c"} Apr 28 19:19:33.717334 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:33.717284 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kwqdl" podStartSLOduration=0.858126134 podStartE2EDuration="1.717268362s" podCreationTimestamp="2026-04-28 19:19:32 +0000 UTC" firstStartedPulling="2026-04-28 19:19:32.65196119 +0000 UTC m=+192.079461785" lastFinishedPulling="2026-04-28 19:19:33.511103407 +0000 UTC m=+192.938604013" observedRunningTime="2026-04-28 19:19:33.715653757 +0000 UTC m=+193.143154375" watchObservedRunningTime="2026-04-28 19:19:33.717268362 +0000 UTC m=+193.144768978" Apr 28 19:19:34.696193 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:34.696155 2572 generic.go:358] "Generic (PLEG): container finished" podID="1b52ffce-09df-44f4-8b49-23d1c579a285" containerID="2dbbdf735bfe0c4f2427434b6993c5a38e378d0303bfb6dcfbb50fddef7b0232" exitCode=0 Apr 28 19:19:34.696641 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:34.696236 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-746l7" event={"ID":"1b52ffce-09df-44f4-8b49-23d1c579a285","Type":"ContainerDied","Data":"2dbbdf735bfe0c4f2427434b6993c5a38e378d0303bfb6dcfbb50fddef7b0232"} Apr 28 19:19:35.337804 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.337776 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-545c89fc46-7w7rl"] Apr 28 19:19:35.340398 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.340381 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-545c89fc46-7w7rl" Apr 28 19:19:35.346763 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.346741 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 28 19:19:35.347587 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.347567 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 28 19:19:35.347746 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.347730 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 28 19:19:35.348578 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.348561 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-90i3sr26q3e8k\"" Apr 28 19:19:35.348674 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.348578 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-dv6tx\"" Apr 28 19:19:35.348674 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.348612 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 28 19:19:35.348785 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.348616 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 28 19:19:35.371574 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.371534 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-545c89fc46-7w7rl"] Apr 28 19:19:35.470657 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.470621 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/1dac9353-9a3d-43e4-9922-67b6fec37c8e-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-545c89fc46-7w7rl\" (UID: \"1dac9353-9a3d-43e4-9922-67b6fec37c8e\") " pod="openshift-monitoring/thanos-querier-545c89fc46-7w7rl" Apr 28 19:19:35.470850 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.470676 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1dac9353-9a3d-43e4-9922-67b6fec37c8e-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-545c89fc46-7w7rl\" (UID: \"1dac9353-9a3d-43e4-9922-67b6fec37c8e\") " pod="openshift-monitoring/thanos-querier-545c89fc46-7w7rl" Apr 28 19:19:35.470850 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.470714 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1dac9353-9a3d-43e4-9922-67b6fec37c8e-metrics-client-ca\") pod \"thanos-querier-545c89fc46-7w7rl\" (UID: \"1dac9353-9a3d-43e4-9922-67b6fec37c8e\") " pod="openshift-monitoring/thanos-querier-545c89fc46-7w7rl" Apr 28 19:19:35.470850 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.470735 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thn9z\" (UniqueName: \"kubernetes.io/projected/1dac9353-9a3d-43e4-9922-67b6fec37c8e-kube-api-access-thn9z\") pod \"thanos-querier-545c89fc46-7w7rl\" (UID: \"1dac9353-9a3d-43e4-9922-67b6fec37c8e\") " pod="openshift-monitoring/thanos-querier-545c89fc46-7w7rl" Apr 28 19:19:35.470850 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.470780 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/1dac9353-9a3d-43e4-9922-67b6fec37c8e-secret-grpc-tls\") pod \"thanos-querier-545c89fc46-7w7rl\" (UID: \"1dac9353-9a3d-43e4-9922-67b6fec37c8e\") " pod="openshift-monitoring/thanos-querier-545c89fc46-7w7rl" Apr 28 19:19:35.470850 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.470803 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/1dac9353-9a3d-43e4-9922-67b6fec37c8e-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-545c89fc46-7w7rl\" (UID: \"1dac9353-9a3d-43e4-9922-67b6fec37c8e\") " pod="openshift-monitoring/thanos-querier-545c89fc46-7w7rl" Apr 28 19:19:35.471076 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.470858 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/1dac9353-9a3d-43e4-9922-67b6fec37c8e-secret-thanos-querier-tls\") pod \"thanos-querier-545c89fc46-7w7rl\" (UID: \"1dac9353-9a3d-43e4-9922-67b6fec37c8e\") " pod="openshift-monitoring/thanos-querier-545c89fc46-7w7rl" Apr 28 19:19:35.471076 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.470889 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1dac9353-9a3d-43e4-9922-67b6fec37c8e-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-545c89fc46-7w7rl\" (UID: \"1dac9353-9a3d-43e4-9922-67b6fec37c8e\") " pod="openshift-monitoring/thanos-querier-545c89fc46-7w7rl" Apr 28 19:19:35.571989 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.571950 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/1dac9353-9a3d-43e4-9922-67b6fec37c8e-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-545c89fc46-7w7rl\" (UID: \"1dac9353-9a3d-43e4-9922-67b6fec37c8e\") " pod="openshift-monitoring/thanos-querier-545c89fc46-7w7rl" Apr 28 19:19:35.572166 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.572000 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/1dac9353-9a3d-43e4-9922-67b6fec37c8e-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-545c89fc46-7w7rl\" (UID: \"1dac9353-9a3d-43e4-9922-67b6fec37c8e\") " pod="openshift-monitoring/thanos-querier-545c89fc46-7w7rl" Apr 28 19:19:35.572166 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.572138 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1dac9353-9a3d-43e4-9922-67b6fec37c8e-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-545c89fc46-7w7rl\" (UID: \"1dac9353-9a3d-43e4-9922-67b6fec37c8e\") " pod="openshift-monitoring/thanos-querier-545c89fc46-7w7rl" Apr 28 19:19:35.572301 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.572195 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1dac9353-9a3d-43e4-9922-67b6fec37c8e-metrics-client-ca\") pod \"thanos-querier-545c89fc46-7w7rl\" (UID: \"1dac9353-9a3d-43e4-9922-67b6fec37c8e\") " pod="openshift-monitoring/thanos-querier-545c89fc46-7w7rl" Apr 28 19:19:35.572301 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.572247 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thn9z\" (UniqueName: \"kubernetes.io/projected/1dac9353-9a3d-43e4-9922-67b6fec37c8e-kube-api-access-thn9z\") pod \"thanos-querier-545c89fc46-7w7rl\" (UID: 
\"1dac9353-9a3d-43e4-9922-67b6fec37c8e\") " pod="openshift-monitoring/thanos-querier-545c89fc46-7w7rl" Apr 28 19:19:35.572301 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.572278 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1dac9353-9a3d-43e4-9922-67b6fec37c8e-secret-grpc-tls\") pod \"thanos-querier-545c89fc46-7w7rl\" (UID: \"1dac9353-9a3d-43e4-9922-67b6fec37c8e\") " pod="openshift-monitoring/thanos-querier-545c89fc46-7w7rl" Apr 28 19:19:35.572461 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.572316 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/1dac9353-9a3d-43e4-9922-67b6fec37c8e-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-545c89fc46-7w7rl\" (UID: \"1dac9353-9a3d-43e4-9922-67b6fec37c8e\") " pod="openshift-monitoring/thanos-querier-545c89fc46-7w7rl" Apr 28 19:19:35.572461 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.572353 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/1dac9353-9a3d-43e4-9922-67b6fec37c8e-secret-thanos-querier-tls\") pod \"thanos-querier-545c89fc46-7w7rl\" (UID: \"1dac9353-9a3d-43e4-9922-67b6fec37c8e\") " pod="openshift-monitoring/thanos-querier-545c89fc46-7w7rl" Apr 28 19:19:35.573063 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.572983 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1dac9353-9a3d-43e4-9922-67b6fec37c8e-metrics-client-ca\") pod \"thanos-querier-545c89fc46-7w7rl\" (UID: \"1dac9353-9a3d-43e4-9922-67b6fec37c8e\") " pod="openshift-monitoring/thanos-querier-545c89fc46-7w7rl" Apr 28 19:19:35.574811 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.574744 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/1dac9353-9a3d-43e4-9922-67b6fec37c8e-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-545c89fc46-7w7rl\" (UID: \"1dac9353-9a3d-43e4-9922-67b6fec37c8e\") " pod="openshift-monitoring/thanos-querier-545c89fc46-7w7rl" Apr 28 19:19:35.574928 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.574838 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1dac9353-9a3d-43e4-9922-67b6fec37c8e-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-545c89fc46-7w7rl\" (UID: \"1dac9353-9a3d-43e4-9922-67b6fec37c8e\") " pod="openshift-monitoring/thanos-querier-545c89fc46-7w7rl" Apr 28 19:19:35.575082 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.575065 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1dac9353-9a3d-43e4-9922-67b6fec37c8e-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-545c89fc46-7w7rl\" (UID: \"1dac9353-9a3d-43e4-9922-67b6fec37c8e\") " pod="openshift-monitoring/thanos-querier-545c89fc46-7w7rl" Apr 28 19:19:35.575245 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.575223 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/1dac9353-9a3d-43e4-9922-67b6fec37c8e-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-545c89fc46-7w7rl\" (UID: \"1dac9353-9a3d-43e4-9922-67b6fec37c8e\") " pod="openshift-monitoring/thanos-querier-545c89fc46-7w7rl" Apr 28 19:19:35.575363 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.575343 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/1dac9353-9a3d-43e4-9922-67b6fec37c8e-secret-grpc-tls\") pod \"thanos-querier-545c89fc46-7w7rl\" (UID: \"1dac9353-9a3d-43e4-9922-67b6fec37c8e\") " pod="openshift-monitoring/thanos-querier-545c89fc46-7w7rl" Apr 28 19:19:35.575603 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.575586 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/1dac9353-9a3d-43e4-9922-67b6fec37c8e-secret-thanos-querier-tls\") pod \"thanos-querier-545c89fc46-7w7rl\" (UID: \"1dac9353-9a3d-43e4-9922-67b6fec37c8e\") " pod="openshift-monitoring/thanos-querier-545c89fc46-7w7rl" Apr 28 19:19:35.581540 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.581518 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-thn9z\" (UniqueName: \"kubernetes.io/projected/1dac9353-9a3d-43e4-9922-67b6fec37c8e-kube-api-access-thn9z\") pod \"thanos-querier-545c89fc46-7w7rl\" (UID: \"1dac9353-9a3d-43e4-9922-67b6fec37c8e\") " pod="openshift-monitoring/thanos-querier-545c89fc46-7w7rl" Apr 28 19:19:35.649233 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.649181 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-545c89fc46-7w7rl" Apr 28 19:19:35.702645 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.702573 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-746l7" event={"ID":"1b52ffce-09df-44f4-8b49-23d1c579a285","Type":"ContainerStarted","Data":"d30588bcf9f616da23412c7bbe230af5b35782145e6000c00b391afb4ffc64d9"} Apr 28 19:19:35.702645 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.702622 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-746l7" event={"ID":"1b52ffce-09df-44f4-8b49-23d1c579a285","Type":"ContainerStarted","Data":"c58929cba3a0b5386416d73906e15e628321b65dbd23ed4e7002eca8328b9bee"} Apr 28 19:19:35.728545 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.728467 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-746l7" podStartSLOduration=2.924796883 podStartE2EDuration="3.728450447s" podCreationTimestamp="2026-04-28 19:19:32 +0000 UTC" firstStartedPulling="2026-04-28 19:19:33.039868563 +0000 UTC m=+192.467369158" lastFinishedPulling="2026-04-28 19:19:33.843522121 +0000 UTC m=+193.271022722" observedRunningTime="2026-04-28 19:19:35.727398276 +0000 UTC m=+195.154898893" watchObservedRunningTime="2026-04-28 19:19:35.728450447 +0000 UTC m=+195.155951064" Apr 28 19:19:35.784121 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:35.784097 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-545c89fc46-7w7rl"] Apr 28 19:19:35.785972 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:19:35.785940 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1dac9353_9a3d_43e4_9922_67b6fec37c8e.slice/crio-c9768ff510bd85ff07fda55ca7b4bd90dfc0d3b9233755413c121d926d8d43c9 WatchSource:0}: Error finding container 
c9768ff510bd85ff07fda55ca7b4bd90dfc0d3b9233755413c121d926d8d43c9: Status 404 returned error can't find the container with id c9768ff510bd85ff07fda55ca7b4bd90dfc0d3b9233755413c121d926d8d43c9 Apr 28 19:19:36.709898 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:36.709863 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-545c89fc46-7w7rl" event={"ID":"1dac9353-9a3d-43e4-9922-67b6fec37c8e","Type":"ContainerStarted","Data":"c9768ff510bd85ff07fda55ca7b4bd90dfc0d3b9233755413c121d926d8d43c9"} Apr 28 19:19:37.715063 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:37.715032 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-545c89fc46-7w7rl" event={"ID":"1dac9353-9a3d-43e4-9922-67b6fec37c8e","Type":"ContainerStarted","Data":"605b2147352e3b0a66bb58a6f7bef5e2c3f309a859005927fc75cd2686e6385a"} Apr 28 19:19:38.018093 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.018014 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6c458bfb7c-x754d"] Apr 28 19:19:38.020215 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.020188 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c458bfb7c-x754d" Apr 28 19:19:38.024891 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.024868 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 28 19:19:38.025892 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.025877 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 28 19:19:38.025971 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.025957 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 28 19:19:38.026023 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.026001 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 28 19:19:38.026088 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.026005 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 28 19:19:38.026193 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.026163 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-c9kx6\"" Apr 28 19:19:38.026324 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.026283 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 28 19:19:38.026324 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.026318 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 28 19:19:38.030905 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.030888 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 28 19:19:38.052044 ip-10-0-140-230 
kubenswrapper[2572]: I0428 19:19:38.052017 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c458bfb7c-x754d"] Apr 28 19:19:38.094539 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.094502 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-console-config\") pod \"console-6c458bfb7c-x754d\" (UID: \"755fc6b7-9049-4c46-a58d-f874ccbd4f1e\") " pod="openshift-console/console-6c458bfb7c-x754d" Apr 28 19:19:38.094539 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.094540 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-console-serving-cert\") pod \"console-6c458bfb7c-x754d\" (UID: \"755fc6b7-9049-4c46-a58d-f874ccbd4f1e\") " pod="openshift-console/console-6c458bfb7c-x754d" Apr 28 19:19:38.094747 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.094568 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-trusted-ca-bundle\") pod \"console-6c458bfb7c-x754d\" (UID: \"755fc6b7-9049-4c46-a58d-f874ccbd4f1e\") " pod="openshift-console/console-6c458bfb7c-x754d" Apr 28 19:19:38.094747 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.094658 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w52q7\" (UniqueName: \"kubernetes.io/projected/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-kube-api-access-w52q7\") pod \"console-6c458bfb7c-x754d\" (UID: \"755fc6b7-9049-4c46-a58d-f874ccbd4f1e\") " pod="openshift-console/console-6c458bfb7c-x754d" Apr 28 19:19:38.094747 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.094730 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-console-oauth-config\") pod \"console-6c458bfb7c-x754d\" (UID: \"755fc6b7-9049-4c46-a58d-f874ccbd4f1e\") " pod="openshift-console/console-6c458bfb7c-x754d" Apr 28 19:19:38.094855 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.094751 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-service-ca\") pod \"console-6c458bfb7c-x754d\" (UID: \"755fc6b7-9049-4c46-a58d-f874ccbd4f1e\") " pod="openshift-console/console-6c458bfb7c-x754d" Apr 28 19:19:38.094855 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.094777 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-oauth-serving-cert\") pod \"console-6c458bfb7c-x754d\" (UID: \"755fc6b7-9049-4c46-a58d-f874ccbd4f1e\") " pod="openshift-console/console-6c458bfb7c-x754d" Apr 28 19:19:38.195666 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.195627 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-console-config\") pod \"console-6c458bfb7c-x754d\" (UID: \"755fc6b7-9049-4c46-a58d-f874ccbd4f1e\") " pod="openshift-console/console-6c458bfb7c-x754d" Apr 28 19:19:38.195837 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.195679 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-console-serving-cert\") pod \"console-6c458bfb7c-x754d\" (UID: \"755fc6b7-9049-4c46-a58d-f874ccbd4f1e\") " 
pod="openshift-console/console-6c458bfb7c-x754d" Apr 28 19:19:38.195837 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.195713 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-trusted-ca-bundle\") pod \"console-6c458bfb7c-x754d\" (UID: \"755fc6b7-9049-4c46-a58d-f874ccbd4f1e\") " pod="openshift-console/console-6c458bfb7c-x754d" Apr 28 19:19:38.195962 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.195931 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w52q7\" (UniqueName: \"kubernetes.io/projected/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-kube-api-access-w52q7\") pod \"console-6c458bfb7c-x754d\" (UID: \"755fc6b7-9049-4c46-a58d-f874ccbd4f1e\") " pod="openshift-console/console-6c458bfb7c-x754d" Apr 28 19:19:38.196445 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.196046 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-console-oauth-config\") pod \"console-6c458bfb7c-x754d\" (UID: \"755fc6b7-9049-4c46-a58d-f874ccbd4f1e\") " pod="openshift-console/console-6c458bfb7c-x754d" Apr 28 19:19:38.196445 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.196078 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-service-ca\") pod \"console-6c458bfb7c-x754d\" (UID: \"755fc6b7-9049-4c46-a58d-f874ccbd4f1e\") " pod="openshift-console/console-6c458bfb7c-x754d" Apr 28 19:19:38.196445 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.196116 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-oauth-serving-cert\") pod 
\"console-6c458bfb7c-x754d\" (UID: \"755fc6b7-9049-4c46-a58d-f874ccbd4f1e\") " pod="openshift-console/console-6c458bfb7c-x754d" Apr 28 19:19:38.196445 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.196451 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-console-config\") pod \"console-6c458bfb7c-x754d\" (UID: \"755fc6b7-9049-4c46-a58d-f874ccbd4f1e\") " pod="openshift-console/console-6c458bfb7c-x754d" Apr 28 19:19:38.196787 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.196686 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-trusted-ca-bundle\") pod \"console-6c458bfb7c-x754d\" (UID: \"755fc6b7-9049-4c46-a58d-f874ccbd4f1e\") " pod="openshift-console/console-6c458bfb7c-x754d" Apr 28 19:19:38.196851 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.196812 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-oauth-serving-cert\") pod \"console-6c458bfb7c-x754d\" (UID: \"755fc6b7-9049-4c46-a58d-f874ccbd4f1e\") " pod="openshift-console/console-6c458bfb7c-x754d" Apr 28 19:19:38.197518 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.197495 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-service-ca\") pod \"console-6c458bfb7c-x754d\" (UID: \"755fc6b7-9049-4c46-a58d-f874ccbd4f1e\") " pod="openshift-console/console-6c458bfb7c-x754d" Apr 28 19:19:38.198755 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.198732 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-console-oauth-config\") pod \"console-6c458bfb7c-x754d\" (UID: \"755fc6b7-9049-4c46-a58d-f874ccbd4f1e\") " pod="openshift-console/console-6c458bfb7c-x754d" Apr 28 19:19:38.198950 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.198936 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-console-serving-cert\") pod \"console-6c458bfb7c-x754d\" (UID: \"755fc6b7-9049-4c46-a58d-f874ccbd4f1e\") " pod="openshift-console/console-6c458bfb7c-x754d" Apr 28 19:19:38.206248 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.206223 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w52q7\" (UniqueName: \"kubernetes.io/projected/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-kube-api-access-w52q7\") pod \"console-6c458bfb7c-x754d\" (UID: \"755fc6b7-9049-4c46-a58d-f874ccbd4f1e\") " pod="openshift-console/console-6c458bfb7c-x754d" Apr 28 19:19:38.330038 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.329944 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c458bfb7c-x754d" Apr 28 19:19:38.627300 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.627266 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c458bfb7c-x754d"] Apr 28 19:19:38.630582 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:19:38.630557 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod755fc6b7_9049_4c46_a58d_f874ccbd4f1e.slice/crio-df8646ef7ab21637d71d64ffd6fac45cbe5097e1fb5377a619dfa9af9f4e7279 WatchSource:0}: Error finding container df8646ef7ab21637d71d64ffd6fac45cbe5097e1fb5377a619dfa9af9f4e7279: Status 404 returned error can't find the container with id df8646ef7ab21637d71d64ffd6fac45cbe5097e1fb5377a619dfa9af9f4e7279 Apr 28 19:19:38.677419 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.677387 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 28 19:19:38.681715 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.680404 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:38.683655 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.683567 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 28 19:19:38.685429 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.685279 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 28 19:19:38.685429 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.685298 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 28 19:19:38.685429 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.685284 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 28 19:19:38.685429 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.685402 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-2jd7j\"" Apr 28 19:19:38.695802 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.695778 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 28 19:19:38.695916 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.695889 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 28 19:19:38.695985 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.695961 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 28 19:19:38.696042 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.695987 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 28 19:19:38.696093 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.696045 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 28 19:19:38.696451 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.696410 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 28 19:19:38.698694 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.698670 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 28 19:19:38.702227 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.701954 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-8o7mo15lug7ie\"" Apr 28 19:19:38.717659 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.717634 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 28 19:19:38.718026 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.717667 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 28 19:19:38.720523 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.720499 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-545c89fc46-7w7rl" event={"ID":"1dac9353-9a3d-43e4-9922-67b6fec37c8e","Type":"ContainerStarted","Data":"66c75454129f0f18931965f66cc49c7188cc1cdceac4353cd1617e6b96e7dffe"} Apr 28 19:19:38.720622 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.720535 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-545c89fc46-7w7rl" 
event={"ID":"1dac9353-9a3d-43e4-9922-67b6fec37c8e","Type":"ContainerStarted","Data":"dcc364278681673983e8de5131712776e75e615da5321cb358918f09545983e4"} Apr 28 19:19:38.720622 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.720550 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-545c89fc46-7w7rl" event={"ID":"1dac9353-9a3d-43e4-9922-67b6fec37c8e","Type":"ContainerStarted","Data":"0cad57221a39757aae6730a93f8a2360d73d6d3eb88e75e63f7af21255eace75"} Apr 28 19:19:38.720622 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.720563 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-545c89fc46-7w7rl" event={"ID":"1dac9353-9a3d-43e4-9922-67b6fec37c8e","Type":"ContainerStarted","Data":"ddd64f386954e07d4250e4ef80ad62bd2132e6ac3f158cb0d44f15aaeaedbc6a"} Apr 28 19:19:38.720622 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.720575 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-545c89fc46-7w7rl" event={"ID":"1dac9353-9a3d-43e4-9922-67b6fec37c8e","Type":"ContainerStarted","Data":"427e6c828f18d90e897bce89feb5dc35c488b69a7fae688d4acca00ac4325372"} Apr 28 19:19:38.720799 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.720669 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-545c89fc46-7w7rl" Apr 28 19:19:38.721552 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.721535 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c458bfb7c-x754d" event={"ID":"755fc6b7-9049-4c46-a58d-f874ccbd4f1e","Type":"ContainerStarted","Data":"df8646ef7ab21637d71d64ffd6fac45cbe5097e1fb5377a619dfa9af9f4e7279"} Apr 28 19:19:38.738421 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.738345 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 28 19:19:38.801756 ip-10-0-140-230 kubenswrapper[2572]: 
I0428 19:19:38.801707 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/314de588-071e-4fe6-a2cc-62cae8f6e9a7-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:38.801756 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.801759 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/314de588-071e-4fe6-a2cc-62cae8f6e9a7-web-config\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:38.801990 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.801784 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/314de588-071e-4fe6-a2cc-62cae8f6e9a7-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:38.801990 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.801809 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5vgc\" (UniqueName: \"kubernetes.io/projected/314de588-071e-4fe6-a2cc-62cae8f6e9a7-kube-api-access-h5vgc\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:38.801990 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.801844 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/314de588-071e-4fe6-a2cc-62cae8f6e9a7-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:38.801990 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.801896 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/314de588-071e-4fe6-a2cc-62cae8f6e9a7-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:38.801990 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.801941 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/314de588-071e-4fe6-a2cc-62cae8f6e9a7-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:38.801990 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.801974 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/314de588-071e-4fe6-a2cc-62cae8f6e9a7-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:38.802234 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.802031 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/314de588-071e-4fe6-a2cc-62cae8f6e9a7-config-out\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:38.802234 ip-10-0-140-230 kubenswrapper[2572]: I0428 
19:19:38.802049 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/314de588-071e-4fe6-a2cc-62cae8f6e9a7-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:38.802234 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.802117 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/314de588-071e-4fe6-a2cc-62cae8f6e9a7-config\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:38.802234 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.802133 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/314de588-071e-4fe6-a2cc-62cae8f6e9a7-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:38.802234 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.802152 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/314de588-071e-4fe6-a2cc-62cae8f6e9a7-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:38.802484 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.802346 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/314de588-071e-4fe6-a2cc-62cae8f6e9a7-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") 
" pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:38.802484 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.802375 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/314de588-071e-4fe6-a2cc-62cae8f6e9a7-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:38.802484 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.802400 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/314de588-071e-4fe6-a2cc-62cae8f6e9a7-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:38.802484 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.802419 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/314de588-071e-4fe6-a2cc-62cae8f6e9a7-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:38.802686 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.802562 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/314de588-071e-4fe6-a2cc-62cae8f6e9a7-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:38.813626 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.813579 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/thanos-querier-545c89fc46-7w7rl" podStartSLOduration=1.107432934 podStartE2EDuration="3.813567366s" podCreationTimestamp="2026-04-28 19:19:35 +0000 UTC" firstStartedPulling="2026-04-28 19:19:35.787952751 +0000 UTC m=+195.215453346" lastFinishedPulling="2026-04-28 19:19:38.49408717 +0000 UTC m=+197.921587778" observedRunningTime="2026-04-28 19:19:38.813530411 +0000 UTC m=+198.241031027" watchObservedRunningTime="2026-04-28 19:19:38.813567366 +0000 UTC m=+198.241067983" Apr 28 19:19:38.903177 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.903144 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/314de588-071e-4fe6-a2cc-62cae8f6e9a7-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:38.903177 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.903179 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/314de588-071e-4fe6-a2cc-62cae8f6e9a7-web-config\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:38.903415 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.903220 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/314de588-071e-4fe6-a2cc-62cae8f6e9a7-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:38.903415 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.903241 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h5vgc\" (UniqueName: 
\"kubernetes.io/projected/314de588-071e-4fe6-a2cc-62cae8f6e9a7-kube-api-access-h5vgc\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:38.903415 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.903397 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/314de588-071e-4fe6-a2cc-62cae8f6e9a7-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:38.903585 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.903430 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/314de588-071e-4fe6-a2cc-62cae8f6e9a7-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:38.903585 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.903457 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/314de588-071e-4fe6-a2cc-62cae8f6e9a7-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:38.903585 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.903486 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/314de588-071e-4fe6-a2cc-62cae8f6e9a7-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:38.903585 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.903530 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/314de588-071e-4fe6-a2cc-62cae8f6e9a7-config-out\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:38.903585 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.903555 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/314de588-071e-4fe6-a2cc-62cae8f6e9a7-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:38.903821 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.903615 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/314de588-071e-4fe6-a2cc-62cae8f6e9a7-config\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:38.903821 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.903639 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/314de588-071e-4fe6-a2cc-62cae8f6e9a7-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:38.903821 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.903663 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/314de588-071e-4fe6-a2cc-62cae8f6e9a7-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:38.903821 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.903739 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/314de588-071e-4fe6-a2cc-62cae8f6e9a7-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:38.903821 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.903791 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/314de588-071e-4fe6-a2cc-62cae8f6e9a7-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:38.903821 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.903822 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/314de588-071e-4fe6-a2cc-62cae8f6e9a7-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:38.904108 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.903851 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/314de588-071e-4fe6-a2cc-62cae8f6e9a7-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:38.904108 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.903891 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/314de588-071e-4fe6-a2cc-62cae8f6e9a7-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 28 19:19:38.904976 
ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.904395 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/314de588-071e-4fe6-a2cc-62cae8f6e9a7-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 28 19:19:38.904976 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.904405 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/314de588-071e-4fe6-a2cc-62cae8f6e9a7-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 28 19:19:38.904976 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.904625 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/314de588-071e-4fe6-a2cc-62cae8f6e9a7-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 28 19:19:38.906472 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.906449 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/314de588-071e-4fe6-a2cc-62cae8f6e9a7-web-config\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 28 19:19:38.907705 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.906989 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/314de588-071e-4fe6-a2cc-62cae8f6e9a7-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 28 19:19:38.907705 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.907289 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/314de588-071e-4fe6-a2cc-62cae8f6e9a7-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 28 19:19:38.907705 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.907454 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/314de588-071e-4fe6-a2cc-62cae8f6e9a7-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 28 19:19:38.907705 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.907553 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/314de588-071e-4fe6-a2cc-62cae8f6e9a7-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 28 19:19:38.907971 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.907883 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/314de588-071e-4fe6-a2cc-62cae8f6e9a7-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 28 19:19:38.908222 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.908166 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/314de588-071e-4fe6-a2cc-62cae8f6e9a7-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 28 19:19:38.908318 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.908265 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/314de588-071e-4fe6-a2cc-62cae8f6e9a7-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 28 19:19:38.908396 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.908361 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/314de588-071e-4fe6-a2cc-62cae8f6e9a7-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 28 19:19:38.908831 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.908806 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/314de588-071e-4fe6-a2cc-62cae8f6e9a7-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 28 19:19:38.909057 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.909033 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/314de588-071e-4fe6-a2cc-62cae8f6e9a7-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 28 19:19:38.909276 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.909257 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/314de588-071e-4fe6-a2cc-62cae8f6e9a7-config\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 28 19:19:38.909693 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.909671 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/314de588-071e-4fe6-a2cc-62cae8f6e9a7-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 28 19:19:38.909817 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.909798 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/314de588-071e-4fe6-a2cc-62cae8f6e9a7-config-out\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 28 19:19:38.924057 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.924028 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5vgc\" (UniqueName: \"kubernetes.io/projected/314de588-071e-4fe6-a2cc-62cae8f6e9a7-kube-api-access-h5vgc\") pod \"prometheus-k8s-0\" (UID: \"314de588-071e-4fe6-a2cc-62cae8f6e9a7\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 28 19:19:38.993502 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:38.993465 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 28 19:19:39.157990 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:39.157904 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 28 19:19:39.161326 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:19:39.161293 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod314de588_071e_4fe6_a2cc_62cae8f6e9a7.slice/crio-2a237292b8afb56faaf28e023da9b34b9e7ff77167be5ad7a3ef3924ace4c6df WatchSource:0}: Error finding container 2a237292b8afb56faaf28e023da9b34b9e7ff77167be5ad7a3ef3924ace4c6df: Status 404 returned error can't find the container with id 2a237292b8afb56faaf28e023da9b34b9e7ff77167be5ad7a3ef3924ace4c6df
Apr 28 19:19:39.726720 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:39.726656 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"314de588-071e-4fe6-a2cc-62cae8f6e9a7","Type":"ContainerStarted","Data":"2a237292b8afb56faaf28e023da9b34b9e7ff77167be5ad7a3ef3924ace4c6df"}
Apr 28 19:19:41.734280 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:41.734241 2572 generic.go:358] "Generic (PLEG): container finished" podID="314de588-071e-4fe6-a2cc-62cae8f6e9a7" containerID="466746c61675ba86e045100cd7d1399dd2323afd65b5e2e86e262bd01024edde" exitCode=0
Apr 28 19:19:41.734694 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:41.734321 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"314de588-071e-4fe6-a2cc-62cae8f6e9a7","Type":"ContainerDied","Data":"466746c61675ba86e045100cd7d1399dd2323afd65b5e2e86e262bd01024edde"}
Apr 28 19:19:41.735684 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:41.735660 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c458bfb7c-x754d" event={"ID":"755fc6b7-9049-4c46-a58d-f874ccbd4f1e","Type":"ContainerStarted","Data":"bcc63ac378295665f31fd514f1be16c58ec992efec4a3802f478d13f27a2b327"}
Apr 28 19:19:41.814020 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:41.813964 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6c458bfb7c-x754d" podStartSLOduration=2.186441213 podStartE2EDuration="4.813949039s" podCreationTimestamp="2026-04-28 19:19:37 +0000 UTC" firstStartedPulling="2026-04-28 19:19:38.632930123 +0000 UTC m=+198.060430725" lastFinishedPulling="2026-04-28 19:19:41.260437942 +0000 UTC m=+200.687938551" observedRunningTime="2026-04-28 19:19:41.812818628 +0000 UTC m=+201.240319244" watchObservedRunningTime="2026-04-28 19:19:41.813949039 +0000 UTC m=+201.241449656"
Apr 28 19:19:44.734530 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:44.734500 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-545c89fc46-7w7rl"
Apr 28 19:19:45.763016 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:45.761673 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"314de588-071e-4fe6-a2cc-62cae8f6e9a7","Type":"ContainerStarted","Data":"208b71fe97fd7b054290963a3acf37f51c59a2e4147728b32dc56a638440ca61"}
Apr 28 19:19:45.763016 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:45.761716 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"314de588-071e-4fe6-a2cc-62cae8f6e9a7","Type":"ContainerStarted","Data":"af71e037819a0fce7e7881ee85a028d38584eeff36ee781c7b47802e48b61881"}
Apr 28 19:19:45.763016 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:45.761730 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"314de588-071e-4fe6-a2cc-62cae8f6e9a7","Type":"ContainerStarted","Data":"278ca7961227921830138757731770d462803b946ac4a817a4cc6e65eee1f3f4"}
Apr 28 19:19:46.767041 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:46.767004 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"314de588-071e-4fe6-a2cc-62cae8f6e9a7","Type":"ContainerStarted","Data":"ba52ff1478425bfb2ad303286616d9bf102bc81cf2ecf9fa59e2bae4860d3234"}
Apr 28 19:19:46.767041 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:46.767046 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"314de588-071e-4fe6-a2cc-62cae8f6e9a7","Type":"ContainerStarted","Data":"8f79a19a7807afa244b132735b1a65680529085cf5f50053b5b8d1354dae61e2"}
Apr 28 19:19:46.767459 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:46.767060 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"314de588-071e-4fe6-a2cc-62cae8f6e9a7","Type":"ContainerStarted","Data":"e2725d0f01d5332412acd18fde9817da88f27016680c85ff9db8d7194215f164"}
Apr 28 19:19:46.804476 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:46.804402 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.483070668 podStartE2EDuration="8.8043795s" podCreationTimestamp="2026-04-28 19:19:38 +0000 UTC" firstStartedPulling="2026-04-28 19:19:39.163726469 +0000 UTC m=+198.591227075" lastFinishedPulling="2026-04-28 19:19:45.485035309 +0000 UTC m=+204.912535907" observedRunningTime="2026-04-28 19:19:46.802669766 +0000 UTC m=+206.230170386" watchObservedRunningTime="2026-04-28 19:19:46.8043795 +0000 UTC m=+206.231880119"
Apr 28 19:19:48.331045 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:48.331001 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6c458bfb7c-x754d"
Apr 28 19:19:48.331045 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:48.331050 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6c458bfb7c-x754d"
Apr 28 19:19:48.335727 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:48.335705 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6c458bfb7c-x754d"
Apr 28 19:19:48.776130 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:48.776095 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6c458bfb7c-x754d"
Apr 28 19:19:48.994290 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:19:48.994245 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 28 19:20:02.813484 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:20:02.813393 2572 generic.go:358] "Generic (PLEG): container finished" podID="2fa50d25-bdbc-446f-a7b9-de3696866c07" containerID="60b2dee7fc5e3b62874e03965bed62485628021664851814083e6e0e1ee59919" exitCode=0
Apr 28 19:20:02.813484 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:20:02.813464 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnh4q" event={"ID":"2fa50d25-bdbc-446f-a7b9-de3696866c07","Type":"ContainerDied","Data":"60b2dee7fc5e3b62874e03965bed62485628021664851814083e6e0e1ee59919"}
Apr 28 19:20:02.813913 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:20:02.813785 2572 scope.go:117] "RemoveContainer" containerID="60b2dee7fc5e3b62874e03965bed62485628021664851814083e6e0e1ee59919"
Apr 28 19:20:03.818061 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:20:03.818028 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnh4q"
event={"ID":"2fa50d25-bdbc-446f-a7b9-de3696866c07","Type":"ContainerStarted","Data":"0f0b22f976138d84ad6961bee4bccc6392fb1f211db390c706e282e94488ba56"}
Apr 28 19:20:04.650130 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:20:04.650069 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-dcfc455bf-p2k78" podUID="faf72846-cb03-4e4c-986f-947dea5fb0fb" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 28 19:20:10.015556 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:20:10.015518 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9rm7p_eccda02a-118f-4d0d-858f-6d03050f92de/dns-node-resolver/0.log"
Apr 28 19:20:14.650047 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:20:14.649995 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-dcfc455bf-p2k78" podUID="faf72846-cb03-4e4c-986f-947dea5fb0fb" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 28 19:20:24.649734 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:20:24.649695 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-dcfc455bf-p2k78" podUID="faf72846-cb03-4e4c-986f-947dea5fb0fb" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 28 19:20:24.650097 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:20:24.649765 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-dcfc455bf-p2k78"
Apr 28 19:20:24.650272 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:20:24.650241 2572 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"e390ef80604c2b4d250045bc0c66b4b78d73ff58140fba650da24107120868a5"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-dcfc455bf-p2k78" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 28 19:20:24.650314 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:20:24.650303 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-dcfc455bf-p2k78" podUID="faf72846-cb03-4e4c-986f-947dea5fb0fb" containerName="service-proxy" containerID="cri-o://e390ef80604c2b4d250045bc0c66b4b78d73ff58140fba650da24107120868a5" gracePeriod=30
Apr 28 19:20:24.880557 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:20:24.880520 2572 generic.go:358] "Generic (PLEG): container finished" podID="faf72846-cb03-4e4c-986f-947dea5fb0fb" containerID="e390ef80604c2b4d250045bc0c66b4b78d73ff58140fba650da24107120868a5" exitCode=2
Apr 28 19:20:24.880557 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:20:24.880556 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-dcfc455bf-p2k78" event={"ID":"faf72846-cb03-4e4c-986f-947dea5fb0fb","Type":"ContainerDied","Data":"e390ef80604c2b4d250045bc0c66b4b78d73ff58140fba650da24107120868a5"}
Apr 28 19:20:24.880767 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:20:24.880590 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-dcfc455bf-p2k78" event={"ID":"faf72846-cb03-4e4c-986f-947dea5fb0fb","Type":"ContainerStarted","Data":"cb084f2f95e404b1b07e01325c2008eb4a393a1b3496afd5ce841af7e3dceee3"}
Apr 28 19:20:32.974505 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:20:32.974454 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a2d2c94-3731-45d6-be85-14a1a081468a-metrics-certs\") pod \"network-metrics-daemon-ggtvc\" (UID: \"0a2d2c94-3731-45d6-be85-14a1a081468a\") " pod="openshift-multus/network-metrics-daemon-ggtvc"
Apr 28 19:20:32.976775 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:20:32.976752 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a2d2c94-3731-45d6-be85-14a1a081468a-metrics-certs\") pod \"network-metrics-daemon-ggtvc\" (UID: \"0a2d2c94-3731-45d6-be85-14a1a081468a\") " pod="openshift-multus/network-metrics-daemon-ggtvc"
Apr 28 19:20:33.266930 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:20:33.266835 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wz9tz\""
Apr 28 19:20:33.275133 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:20:33.275107 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ggtvc"
Apr 28 19:20:33.395246 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:20:33.395118 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ggtvc"]
Apr 28 19:20:33.398042 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:20:33.398015 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a2d2c94_3731_45d6_be85_14a1a081468a.slice/crio-740a65f75e6da1bf0230b850463850739e202cf5a31b1e45268e65c8354561f9 WatchSource:0}: Error finding container 740a65f75e6da1bf0230b850463850739e202cf5a31b1e45268e65c8354561f9: Status 404 returned error can't find the container with id 740a65f75e6da1bf0230b850463850739e202cf5a31b1e45268e65c8354561f9
Apr 28 19:20:33.907359 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:20:33.907323 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ggtvc" event={"ID":"0a2d2c94-3731-45d6-be85-14a1a081468a","Type":"ContainerStarted","Data":"740a65f75e6da1bf0230b850463850739e202cf5a31b1e45268e65c8354561f9"}
Apr 28 19:20:34.911592 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:20:34.911552 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ggtvc" event={"ID":"0a2d2c94-3731-45d6-be85-14a1a081468a","Type":"ContainerStarted","Data":"525d5f709a9dacca0caa55dda60c8d87796e40a16e092480772676e103ec55ba"}
Apr 28 19:20:34.911592 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:20:34.911590 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ggtvc" event={"ID":"0a2d2c94-3731-45d6-be85-14a1a081468a","Type":"ContainerStarted","Data":"36d45539a1a1e5c46687c7de2b35f9f89a80c7ee017d04e6bd85d6666f745f0b"}
Apr 28 19:20:34.932618 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:20:34.932569 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-ggtvc" podStartSLOduration=253.038155335 podStartE2EDuration="4m13.932554027s" podCreationTimestamp="2026-04-28 19:16:21 +0000 UTC" firstStartedPulling="2026-04-28 19:20:33.400328384 +0000 UTC m=+252.827828980" lastFinishedPulling="2026-04-28 19:20:34.294727077 +0000 UTC m=+253.722227672" observedRunningTime="2026-04-28 19:20:34.931857637 +0000 UTC m=+254.359358254" watchObservedRunningTime="2026-04-28 19:20:34.932554027 +0000 UTC m=+254.360054643"
Apr 28 19:20:38.994069 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:20:38.993984 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 28 19:20:39.012405 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:20:39.012376 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 28 19:20:39.941834 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:20:39.941807 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 28 19:20:59.584837 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:20:59.584794 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-7826r" podUID="f72b413b-cc77-44a9-9bab-db02d985886b"
Apr 28 19:20:59.584837 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:20:59.584794 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-wl2q4" podUID="e361f8d4-ac75-4139-b8c7-fada2556f305"
Apr 28 19:20:59.983840 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:20:59.983810 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wl2q4"
Apr 28 19:20:59.984026 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:20:59.983978 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-dns/dns-default-7826r"
Apr 28 19:21:03.449473 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:03.449432 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f72b413b-cc77-44a9-9bab-db02d985886b-metrics-tls\") pod \"dns-default-7826r\" (UID: \"f72b413b-cc77-44a9-9bab-db02d985886b\") " pod="openshift-dns/dns-default-7826r"
Apr 28 19:21:03.449866 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:03.449507 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e361f8d4-ac75-4139-b8c7-fada2556f305-cert\") pod \"ingress-canary-wl2q4\" (UID: \"e361f8d4-ac75-4139-b8c7-fada2556f305\") " pod="openshift-ingress-canary/ingress-canary-wl2q4"
Apr 28 19:21:03.451789 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:03.451756 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f72b413b-cc77-44a9-9bab-db02d985886b-metrics-tls\") pod \"dns-default-7826r\" (UID: \"f72b413b-cc77-44a9-9bab-db02d985886b\") " pod="openshift-dns/dns-default-7826r"
Apr 28 19:21:03.451921 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:03.451902 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e361f8d4-ac75-4139-b8c7-fada2556f305-cert\") pod \"ingress-canary-wl2q4\" (UID: \"e361f8d4-ac75-4139-b8c7-fada2556f305\") " pod="openshift-ingress-canary/ingress-canary-wl2q4"
Apr 28 19:21:03.588754 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:03.588717 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-6qdcw\""
Apr 28 19:21:03.589443 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:03.589424 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pzmft\""
Apr 28 19:21:03.595435 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:03.595417 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7826r"
Apr 28 19:21:03.595489 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:03.595432 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wl2q4"
Apr 28 19:21:03.726009 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:03.725918 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wl2q4"]
Apr 28 19:21:03.729027 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:21:03.728988 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode361f8d4_ac75_4139_b8c7_fada2556f305.slice/crio-8ccd33fe3deed474b9e87fc3c236ebdf7d53c69a1eaa68abf1802ffb138c3241 WatchSource:0}: Error finding container 8ccd33fe3deed474b9e87fc3c236ebdf7d53c69a1eaa68abf1802ffb138c3241: Status 404 returned error can't find the container with id 8ccd33fe3deed474b9e87fc3c236ebdf7d53c69a1eaa68abf1802ffb138c3241
Apr 28 19:21:03.746671 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:03.746649 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7826r"]
Apr 28 19:21:03.748321 ip-10-0-140-230 kubenswrapper[2572]: W0428 19:21:03.748300 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf72b413b_cc77_44a9_9bab_db02d985886b.slice/crio-88461b3dca894acbc097211a19703994b38e29052ac5fc2498caec1057616382 WatchSource:0}: Error finding container 88461b3dca894acbc097211a19703994b38e29052ac5fc2498caec1057616382: Status 404 returned error can't find the container with id 88461b3dca894acbc097211a19703994b38e29052ac5fc2498caec1057616382
Apr 28 19:21:03.995644 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:03.995552 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wl2q4" event={"ID":"e361f8d4-ac75-4139-b8c7-fada2556f305","Type":"ContainerStarted","Data":"8ccd33fe3deed474b9e87fc3c236ebdf7d53c69a1eaa68abf1802ffb138c3241"}
Apr 28 19:21:03.996584 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:03.996564 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7826r" event={"ID":"f72b413b-cc77-44a9-9bab-db02d985886b","Type":"ContainerStarted","Data":"88461b3dca894acbc097211a19703994b38e29052ac5fc2498caec1057616382"}
Apr 28 19:21:06.007088 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:06.007046 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wl2q4" event={"ID":"e361f8d4-ac75-4139-b8c7-fada2556f305","Type":"ContainerStarted","Data":"1539366c42dc151ef90e5042682ae932625c8a5ca8e34ad6657b8f9f85a003fc"}
Apr 28 19:21:06.008777 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:06.008750 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7826r" event={"ID":"f72b413b-cc77-44a9-9bab-db02d985886b","Type":"ContainerStarted","Data":"848c680b44d1c37c8cd5addba008d0d22ac6a86b92c0eb6e237dd6d52650054b"}
Apr 28 19:21:06.008777 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:06.008778 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7826r" event={"ID":"f72b413b-cc77-44a9-9bab-db02d985886b","Type":"ContainerStarted","Data":"1d5d40edd2537d4f40bece6354a4aaf7d69e83ee38ee675fce7de6334d987269"}
Apr 28 19:21:06.008904 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:06.008847 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-7826r"
Apr 28 19:21:06.026022 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:06.025970 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-wl2q4" podStartSLOduration=251.161165635
podStartE2EDuration="4m13.025956935s" podCreationTimestamp="2026-04-28 19:16:53 +0000 UTC" firstStartedPulling="2026-04-28 19:21:03.730899039 +0000 UTC m=+283.158399635" lastFinishedPulling="2026-04-28 19:21:05.59569034 +0000 UTC m=+285.023190935" observedRunningTime="2026-04-28 19:21:06.024771071 +0000 UTC m=+285.452271688" watchObservedRunningTime="2026-04-28 19:21:06.025956935 +0000 UTC m=+285.453457604"
Apr 28 19:21:06.043631 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:06.043570 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-7826r" podStartSLOduration=251.19548723 podStartE2EDuration="4m13.043548899s" podCreationTimestamp="2026-04-28 19:16:53 +0000 UTC" firstStartedPulling="2026-04-28 19:21:03.750108589 +0000 UTC m=+283.177609184" lastFinishedPulling="2026-04-28 19:21:05.598170258 +0000 UTC m=+285.025670853" observedRunningTime="2026-04-28 19:21:06.042079928 +0000 UTC m=+285.469580545" watchObservedRunningTime="2026-04-28 19:21:06.043548899 +0000 UTC m=+285.471049517"
Apr 28 19:21:16.014104 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:16.014069 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-7826r"
Apr 28 19:21:21.059573 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:21.059547 2572 kubelet.go:1628] "Image garbage collection succeeded"
Apr 28 19:21:21.474745 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:21.474706 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c458bfb7c-x754d"]
Apr 28 19:21:46.496219 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:46.496144 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6c458bfb7c-x754d" podUID="755fc6b7-9049-4c46-a58d-f874ccbd4f1e" containerName="console" containerID="cri-o://bcc63ac378295665f31fd514f1be16c58ec992efec4a3802f478d13f27a2b327" gracePeriod=15
Apr 28 19:21:46.727575 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:46.727553 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c458bfb7c-x754d_755fc6b7-9049-4c46-a58d-f874ccbd4f1e/console/0.log"
Apr 28 19:21:46.727709 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:46.727627 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c458bfb7c-x754d"
Apr 28 19:21:46.799519 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:46.799434 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-console-oauth-config\") pod \"755fc6b7-9049-4c46-a58d-f874ccbd4f1e\" (UID: \"755fc6b7-9049-4c46-a58d-f874ccbd4f1e\") "
Apr 28 19:21:46.799519 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:46.799475 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-console-serving-cert\") pod \"755fc6b7-9049-4c46-a58d-f874ccbd4f1e\" (UID: \"755fc6b7-9049-4c46-a58d-f874ccbd4f1e\") "
Apr 28 19:21:46.799519 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:46.799508 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-oauth-serving-cert\") pod \"755fc6b7-9049-4c46-a58d-f874ccbd4f1e\" (UID: \"755fc6b7-9049-4c46-a58d-f874ccbd4f1e\") "
Apr 28 19:21:46.799808 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:46.799553 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-service-ca\") pod \"755fc6b7-9049-4c46-a58d-f874ccbd4f1e\" (UID: \"755fc6b7-9049-4c46-a58d-f874ccbd4f1e\") "
Apr 28 19:21:46.799808 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:46.799583 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w52q7\" (UniqueName: \"kubernetes.io/projected/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-kube-api-access-w52q7\") pod \"755fc6b7-9049-4c46-a58d-f874ccbd4f1e\" (UID: \"755fc6b7-9049-4c46-a58d-f874ccbd4f1e\") "
Apr 28 19:21:46.799808 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:46.799679 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-console-config\") pod \"755fc6b7-9049-4c46-a58d-f874ccbd4f1e\" (UID: \"755fc6b7-9049-4c46-a58d-f874ccbd4f1e\") "
Apr 28 19:21:46.799808 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:46.799736 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-trusted-ca-bundle\") pod \"755fc6b7-9049-4c46-a58d-f874ccbd4f1e\" (UID: \"755fc6b7-9049-4c46-a58d-f874ccbd4f1e\") "
Apr 28 19:21:46.800034 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:46.799992 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "755fc6b7-9049-4c46-a58d-f874ccbd4f1e" (UID: "755fc6b7-9049-4c46-a58d-f874ccbd4f1e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:21:46.800091 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:46.800038 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-console-config" (OuterVolumeSpecName: "console-config") pod "755fc6b7-9049-4c46-a58d-f874ccbd4f1e" (UID: "755fc6b7-9049-4c46-a58d-f874ccbd4f1e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:21:46.800091 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:46.800008 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-service-ca" (OuterVolumeSpecName: "service-ca") pod "755fc6b7-9049-4c46-a58d-f874ccbd4f1e" (UID: "755fc6b7-9049-4c46-a58d-f874ccbd4f1e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:21:46.800368 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:46.800342 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "755fc6b7-9049-4c46-a58d-f874ccbd4f1e" (UID: "755fc6b7-9049-4c46-a58d-f874ccbd4f1e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:21:46.801801 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:46.801782 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "755fc6b7-9049-4c46-a58d-f874ccbd4f1e" (UID: "755fc6b7-9049-4c46-a58d-f874ccbd4f1e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 19:21:46.802082 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:46.802065 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "755fc6b7-9049-4c46-a58d-f874ccbd4f1e" (UID: "755fc6b7-9049-4c46-a58d-f874ccbd4f1e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 19:21:46.802143 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:46.802076 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-kube-api-access-w52q7" (OuterVolumeSpecName: "kube-api-access-w52q7") pod "755fc6b7-9049-4c46-a58d-f874ccbd4f1e" (UID: "755fc6b7-9049-4c46-a58d-f874ccbd4f1e"). InnerVolumeSpecName "kube-api-access-w52q7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 19:21:46.900628 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:46.900585 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-service-ca\") on node \"ip-10-0-140-230.ec2.internal\" DevicePath \"\""
Apr 28 19:21:46.900628 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:46.900621 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w52q7\" (UniqueName: \"kubernetes.io/projected/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-kube-api-access-w52q7\") on node \"ip-10-0-140-230.ec2.internal\" DevicePath \"\""
Apr 28 19:21:46.900628 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:46.900633 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-console-config\") on node \"ip-10-0-140-230.ec2.internal\" DevicePath \"\""
Apr 28 19:21:46.900875 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:46.900646 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-trusted-ca-bundle\") on node \"ip-10-0-140-230.ec2.internal\" DevicePath \"\""
Apr 28 19:21:46.900875 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:46.900659 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName:
\"kubernetes.io/secret/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-console-oauth-config\") on node \"ip-10-0-140-230.ec2.internal\" DevicePath \"\"" Apr 28 19:21:46.900875 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:46.900671 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-console-serving-cert\") on node \"ip-10-0-140-230.ec2.internal\" DevicePath \"\"" Apr 28 19:21:46.900875 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:46.900683 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/755fc6b7-9049-4c46-a58d-f874ccbd4f1e-oauth-serving-cert\") on node \"ip-10-0-140-230.ec2.internal\" DevicePath \"\"" Apr 28 19:21:47.123567 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:47.123541 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c458bfb7c-x754d_755fc6b7-9049-4c46-a58d-f874ccbd4f1e/console/0.log" Apr 28 19:21:47.123748 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:47.123577 2572 generic.go:358] "Generic (PLEG): container finished" podID="755fc6b7-9049-4c46-a58d-f874ccbd4f1e" containerID="bcc63ac378295665f31fd514f1be16c58ec992efec4a3802f478d13f27a2b327" exitCode=2 Apr 28 19:21:47.123748 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:47.123608 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c458bfb7c-x754d" event={"ID":"755fc6b7-9049-4c46-a58d-f874ccbd4f1e","Type":"ContainerDied","Data":"bcc63ac378295665f31fd514f1be16c58ec992efec4a3802f478d13f27a2b327"} Apr 28 19:21:47.123748 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:47.123629 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c458bfb7c-x754d" event={"ID":"755fc6b7-9049-4c46-a58d-f874ccbd4f1e","Type":"ContainerDied","Data":"df8646ef7ab21637d71d64ffd6fac45cbe5097e1fb5377a619dfa9af9f4e7279"} Apr 28 19:21:47.123748 
ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:47.123644 2572 scope.go:117] "RemoveContainer" containerID="bcc63ac378295665f31fd514f1be16c58ec992efec4a3802f478d13f27a2b327" Apr 28 19:21:47.123748 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:47.123644 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c458bfb7c-x754d" Apr 28 19:21:47.132767 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:47.132747 2572 scope.go:117] "RemoveContainer" containerID="bcc63ac378295665f31fd514f1be16c58ec992efec4a3802f478d13f27a2b327" Apr 28 19:21:47.133043 ip-10-0-140-230 kubenswrapper[2572]: E0428 19:21:47.133021 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcc63ac378295665f31fd514f1be16c58ec992efec4a3802f478d13f27a2b327\": container with ID starting with bcc63ac378295665f31fd514f1be16c58ec992efec4a3802f478d13f27a2b327 not found: ID does not exist" containerID="bcc63ac378295665f31fd514f1be16c58ec992efec4a3802f478d13f27a2b327" Apr 28 19:21:47.133133 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:47.133048 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcc63ac378295665f31fd514f1be16c58ec992efec4a3802f478d13f27a2b327"} err="failed to get container status \"bcc63ac378295665f31fd514f1be16c58ec992efec4a3802f478d13f27a2b327\": rpc error: code = NotFound desc = could not find container \"bcc63ac378295665f31fd514f1be16c58ec992efec4a3802f478d13f27a2b327\": container with ID starting with bcc63ac378295665f31fd514f1be16c58ec992efec4a3802f478d13f27a2b327 not found: ID does not exist" Apr 28 19:21:47.147549 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:47.147514 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c458bfb7c-x754d"] Apr 28 19:21:47.154367 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:47.154341 2572 kubelet.go:2547] "SyncLoop REMOVE" 
source="api" pods=["openshift-console/console-6c458bfb7c-x754d"] Apr 28 19:21:47.167125 ip-10-0-140-230 kubenswrapper[2572]: I0428 19:21:47.167097 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="755fc6b7-9049-4c46-a58d-f874ccbd4f1e" path="/var/lib/kubelet/pods/755fc6b7-9049-4c46-a58d-f874ccbd4f1e/volumes" Apr 28 20:08:12.522242 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:12.522192 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vzpjv/must-gather-x9rtb"] Apr 28 20:08:12.524274 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:12.522514 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="755fc6b7-9049-4c46-a58d-f874ccbd4f1e" containerName="console" Apr 28 20:08:12.524274 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:12.522527 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="755fc6b7-9049-4c46-a58d-f874ccbd4f1e" containerName="console" Apr 28 20:08:12.524274 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:12.522590 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="755fc6b7-9049-4c46-a58d-f874ccbd4f1e" containerName="console" Apr 28 20:08:12.525160 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:12.525144 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vzpjv/must-gather-x9rtb" Apr 28 20:08:12.528046 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:12.528023 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vzpjv\"/\"kube-root-ca.crt\"" Apr 28 20:08:12.529031 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:12.529009 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-vzpjv\"/\"default-dockercfg-jhnhg\"" Apr 28 20:08:12.529122 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:12.529090 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vzpjv\"/\"openshift-service-ca.crt\"" Apr 28 20:08:12.544412 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:12.544384 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vzpjv/must-gather-x9rtb"] Apr 28 20:08:12.618074 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:12.618032 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s77n\" (UniqueName: \"kubernetes.io/projected/b79095bc-6ce1-4f08-9cdd-226b5fdf93de-kube-api-access-6s77n\") pod \"must-gather-x9rtb\" (UID: \"b79095bc-6ce1-4f08-9cdd-226b5fdf93de\") " pod="openshift-must-gather-vzpjv/must-gather-x9rtb" Apr 28 20:08:12.618074 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:12.618076 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b79095bc-6ce1-4f08-9cdd-226b5fdf93de-must-gather-output\") pod \"must-gather-x9rtb\" (UID: \"b79095bc-6ce1-4f08-9cdd-226b5fdf93de\") " pod="openshift-must-gather-vzpjv/must-gather-x9rtb" Apr 28 20:08:12.719215 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:12.719174 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6s77n\" (UniqueName: 
\"kubernetes.io/projected/b79095bc-6ce1-4f08-9cdd-226b5fdf93de-kube-api-access-6s77n\") pod \"must-gather-x9rtb\" (UID: \"b79095bc-6ce1-4f08-9cdd-226b5fdf93de\") " pod="openshift-must-gather-vzpjv/must-gather-x9rtb" Apr 28 20:08:12.719421 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:12.719247 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b79095bc-6ce1-4f08-9cdd-226b5fdf93de-must-gather-output\") pod \"must-gather-x9rtb\" (UID: \"b79095bc-6ce1-4f08-9cdd-226b5fdf93de\") " pod="openshift-must-gather-vzpjv/must-gather-x9rtb" Apr 28 20:08:12.719600 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:12.719582 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b79095bc-6ce1-4f08-9cdd-226b5fdf93de-must-gather-output\") pod \"must-gather-x9rtb\" (UID: \"b79095bc-6ce1-4f08-9cdd-226b5fdf93de\") " pod="openshift-must-gather-vzpjv/must-gather-x9rtb" Apr 28 20:08:12.727924 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:12.727901 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s77n\" (UniqueName: \"kubernetes.io/projected/b79095bc-6ce1-4f08-9cdd-226b5fdf93de-kube-api-access-6s77n\") pod \"must-gather-x9rtb\" (UID: \"b79095bc-6ce1-4f08-9cdd-226b5fdf93de\") " pod="openshift-must-gather-vzpjv/must-gather-x9rtb" Apr 28 20:08:12.834327 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:12.834191 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vzpjv/must-gather-x9rtb" Apr 28 20:08:12.950286 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:12.950254 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vzpjv/must-gather-x9rtb"] Apr 28 20:08:12.953835 ip-10-0-140-230 kubenswrapper[2572]: W0428 20:08:12.953804 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb79095bc_6ce1_4f08_9cdd_226b5fdf93de.slice/crio-9e8467bd20fd67cc286105fa47bd20c10c3ac69649d21c89308696c09fa15638 WatchSource:0}: Error finding container 9e8467bd20fd67cc286105fa47bd20c10c3ac69649d21c89308696c09fa15638: Status 404 returned error can't find the container with id 9e8467bd20fd67cc286105fa47bd20c10c3ac69649d21c89308696c09fa15638 Apr 28 20:08:12.955476 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:12.955461 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 20:08:13.945169 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:13.945128 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vzpjv/must-gather-x9rtb" event={"ID":"b79095bc-6ce1-4f08-9cdd-226b5fdf93de","Type":"ContainerStarted","Data":"9e8467bd20fd67cc286105fa47bd20c10c3ac69649d21c89308696c09fa15638"} Apr 28 20:08:18.963597 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:18.963550 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vzpjv/must-gather-x9rtb" event={"ID":"b79095bc-6ce1-4f08-9cdd-226b5fdf93de","Type":"ContainerStarted","Data":"95adb926eeb7d338f0a560d50d29197ea8a55dfbda703ce9a46ef46534274b03"} Apr 28 20:08:18.963597 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:18.963603 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vzpjv/must-gather-x9rtb" 
event={"ID":"b79095bc-6ce1-4f08-9cdd-226b5fdf93de","Type":"ContainerStarted","Data":"fdf42547fd59aa1fb3fc084ee18c5ea0723445eb2142a2dc41777796d6906db9"} Apr 28 20:08:18.981810 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:18.981749 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vzpjv/must-gather-x9rtb" podStartSLOduration=1.658564731 podStartE2EDuration="6.981732935s" podCreationTimestamp="2026-04-28 20:08:12 +0000 UTC" firstStartedPulling="2026-04-28 20:08:12.955582847 +0000 UTC m=+3112.383083442" lastFinishedPulling="2026-04-28 20:08:18.278751048 +0000 UTC m=+3117.706251646" observedRunningTime="2026-04-28 20:08:18.981436712 +0000 UTC m=+3118.408937329" watchObservedRunningTime="2026-04-28 20:08:18.981732935 +0000 UTC m=+3118.409233552" Apr 28 20:08:39.034074 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:39.034032 2572 generic.go:358] "Generic (PLEG): container finished" podID="b79095bc-6ce1-4f08-9cdd-226b5fdf93de" containerID="fdf42547fd59aa1fb3fc084ee18c5ea0723445eb2142a2dc41777796d6906db9" exitCode=0 Apr 28 20:08:39.034615 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:39.034110 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vzpjv/must-gather-x9rtb" event={"ID":"b79095bc-6ce1-4f08-9cdd-226b5fdf93de","Type":"ContainerDied","Data":"fdf42547fd59aa1fb3fc084ee18c5ea0723445eb2142a2dc41777796d6906db9"} Apr 28 20:08:39.034615 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:39.034551 2572 scope.go:117] "RemoveContainer" containerID="fdf42547fd59aa1fb3fc084ee18c5ea0723445eb2142a2dc41777796d6906db9" Apr 28 20:08:39.931748 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:39.931691 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vzpjv_must-gather-x9rtb_b79095bc-6ce1-4f08-9cdd-226b5fdf93de/gather/0.log" Apr 28 20:08:43.421428 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:43.421392 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-fvgjn_f4fd0bdf-e317-4621-b4cf-c41c8e666b62/global-pull-secret-syncer/0.log" Apr 28 20:08:43.599391 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:43.599354 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-pkjxs_f8cecbc4-1ccf-49b8-bd5f-a126bd910b04/konnectivity-agent/0.log" Apr 28 20:08:43.722998 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:43.722904 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-140-230.ec2.internal_77fe4b4b44e63783dd24b8c6f6bea437/haproxy/0.log" Apr 28 20:08:45.375760 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:45.375720 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vzpjv/must-gather-x9rtb"] Apr 28 20:08:45.376155 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:45.375935 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-vzpjv/must-gather-x9rtb" podUID="b79095bc-6ce1-4f08-9cdd-226b5fdf93de" containerName="copy" containerID="cri-o://95adb926eeb7d338f0a560d50d29197ea8a55dfbda703ce9a46ef46534274b03" gracePeriod=2 Apr 28 20:08:45.383549 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:45.383502 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vzpjv/must-gather-x9rtb"] Apr 28 20:08:45.599453 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:45.599432 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vzpjv_must-gather-x9rtb_b79095bc-6ce1-4f08-9cdd-226b5fdf93de/copy/0.log" Apr 28 20:08:45.599784 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:45.599768 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vzpjv/must-gather-x9rtb" Apr 28 20:08:45.601885 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:45.601860 2572 status_manager.go:895] "Failed to get status for pod" podUID="b79095bc-6ce1-4f08-9cdd-226b5fdf93de" pod="openshift-must-gather-vzpjv/must-gather-x9rtb" err="pods \"must-gather-x9rtb\" is forbidden: User \"system:node:ip-10-0-140-230.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-vzpjv\": no relationship found between node 'ip-10-0-140-230.ec2.internal' and this object" Apr 28 20:08:45.709311 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:45.709234 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b79095bc-6ce1-4f08-9cdd-226b5fdf93de-must-gather-output\") pod \"b79095bc-6ce1-4f08-9cdd-226b5fdf93de\" (UID: \"b79095bc-6ce1-4f08-9cdd-226b5fdf93de\") " Apr 28 20:08:45.709311 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:45.709301 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6s77n\" (UniqueName: \"kubernetes.io/projected/b79095bc-6ce1-4f08-9cdd-226b5fdf93de-kube-api-access-6s77n\") pod \"b79095bc-6ce1-4f08-9cdd-226b5fdf93de\" (UID: \"b79095bc-6ce1-4f08-9cdd-226b5fdf93de\") " Apr 28 20:08:45.710808 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:45.710784 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b79095bc-6ce1-4f08-9cdd-226b5fdf93de-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "b79095bc-6ce1-4f08-9cdd-226b5fdf93de" (UID: "b79095bc-6ce1-4f08-9cdd-226b5fdf93de"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 20:08:45.711508 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:45.711481 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b79095bc-6ce1-4f08-9cdd-226b5fdf93de-kube-api-access-6s77n" (OuterVolumeSpecName: "kube-api-access-6s77n") pod "b79095bc-6ce1-4f08-9cdd-226b5fdf93de" (UID: "b79095bc-6ce1-4f08-9cdd-226b5fdf93de"). InnerVolumeSpecName "kube-api-access-6s77n". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 20:08:45.810218 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:45.810162 2572 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b79095bc-6ce1-4f08-9cdd-226b5fdf93de-must-gather-output\") on node \"ip-10-0-140-230.ec2.internal\" DevicePath \"\"" Apr 28 20:08:45.810397 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:45.810234 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6s77n\" (UniqueName: \"kubernetes.io/projected/b79095bc-6ce1-4f08-9cdd-226b5fdf93de-kube-api-access-6s77n\") on node \"ip-10-0-140-230.ec2.internal\" DevicePath \"\"" Apr 28 20:08:46.053858 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:46.053786 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vzpjv_must-gather-x9rtb_b79095bc-6ce1-4f08-9cdd-226b5fdf93de/copy/0.log" Apr 28 20:08:46.054103 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:46.054081 2572 generic.go:358] "Generic (PLEG): container finished" podID="b79095bc-6ce1-4f08-9cdd-226b5fdf93de" containerID="95adb926eeb7d338f0a560d50d29197ea8a55dfbda703ce9a46ef46534274b03" exitCode=143 Apr 28 20:08:46.054156 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:46.054131 2572 scope.go:117] "RemoveContainer" containerID="95adb926eeb7d338f0a560d50d29197ea8a55dfbda703ce9a46ef46534274b03" Apr 28 20:08:46.054156 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:46.054145 2572 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vzpjv/must-gather-x9rtb" Apr 28 20:08:46.056401 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:46.056372 2572 status_manager.go:895] "Failed to get status for pod" podUID="b79095bc-6ce1-4f08-9cdd-226b5fdf93de" pod="openshift-must-gather-vzpjv/must-gather-x9rtb" err="pods \"must-gather-x9rtb\" is forbidden: User \"system:node:ip-10-0-140-230.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-vzpjv\": no relationship found between node 'ip-10-0-140-230.ec2.internal' and this object" Apr 28 20:08:46.062104 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:46.062081 2572 scope.go:117] "RemoveContainer" containerID="fdf42547fd59aa1fb3fc084ee18c5ea0723445eb2142a2dc41777796d6906db9" Apr 28 20:08:46.064272 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:46.064246 2572 status_manager.go:895] "Failed to get status for pod" podUID="b79095bc-6ce1-4f08-9cdd-226b5fdf93de" pod="openshift-must-gather-vzpjv/must-gather-x9rtb" err="pods \"must-gather-x9rtb\" is forbidden: User \"system:node:ip-10-0-140-230.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-vzpjv\": no relationship found between node 'ip-10-0-140-230.ec2.internal' and this object" Apr 28 20:08:46.073884 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:46.073865 2572 scope.go:117] "RemoveContainer" containerID="95adb926eeb7d338f0a560d50d29197ea8a55dfbda703ce9a46ef46534274b03" Apr 28 20:08:46.074109 ip-10-0-140-230 kubenswrapper[2572]: E0428 20:08:46.074090 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95adb926eeb7d338f0a560d50d29197ea8a55dfbda703ce9a46ef46534274b03\": container with ID starting with 95adb926eeb7d338f0a560d50d29197ea8a55dfbda703ce9a46ef46534274b03 not found: ID does not exist" 
containerID="95adb926eeb7d338f0a560d50d29197ea8a55dfbda703ce9a46ef46534274b03" Apr 28 20:08:46.074164 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:46.074117 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95adb926eeb7d338f0a560d50d29197ea8a55dfbda703ce9a46ef46534274b03"} err="failed to get container status \"95adb926eeb7d338f0a560d50d29197ea8a55dfbda703ce9a46ef46534274b03\": rpc error: code = NotFound desc = could not find container \"95adb926eeb7d338f0a560d50d29197ea8a55dfbda703ce9a46ef46534274b03\": container with ID starting with 95adb926eeb7d338f0a560d50d29197ea8a55dfbda703ce9a46ef46534274b03 not found: ID does not exist" Apr 28 20:08:46.074164 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:46.074132 2572 scope.go:117] "RemoveContainer" containerID="fdf42547fd59aa1fb3fc084ee18c5ea0723445eb2142a2dc41777796d6906db9" Apr 28 20:08:46.074332 ip-10-0-140-230 kubenswrapper[2572]: E0428 20:08:46.074313 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdf42547fd59aa1fb3fc084ee18c5ea0723445eb2142a2dc41777796d6906db9\": container with ID starting with fdf42547fd59aa1fb3fc084ee18c5ea0723445eb2142a2dc41777796d6906db9 not found: ID does not exist" containerID="fdf42547fd59aa1fb3fc084ee18c5ea0723445eb2142a2dc41777796d6906db9" Apr 28 20:08:46.074377 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:46.074339 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdf42547fd59aa1fb3fc084ee18c5ea0723445eb2142a2dc41777796d6906db9"} err="failed to get container status \"fdf42547fd59aa1fb3fc084ee18c5ea0723445eb2142a2dc41777796d6906db9\": rpc error: code = NotFound desc = could not find container \"fdf42547fd59aa1fb3fc084ee18c5ea0723445eb2142a2dc41777796d6906db9\": container with ID starting with fdf42547fd59aa1fb3fc084ee18c5ea0723445eb2142a2dc41777796d6906db9 not found: ID does not exist" Apr 28 
20:08:47.167087 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:47.167042 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b79095bc-6ce1-4f08-9cdd-226b5fdf93de" path="/var/lib/kubelet/pods/b79095bc-6ce1-4f08-9cdd-226b5fdf93de/volumes" Apr 28 20:08:47.416005 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:47.415976 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-746l7_1b52ffce-09df-44f4-8b49-23d1c579a285/node-exporter/0.log" Apr 28 20:08:47.453587 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:47.453505 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-746l7_1b52ffce-09df-44f4-8b49-23d1c579a285/kube-rbac-proxy/0.log" Apr 28 20:08:47.495893 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:47.495870 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-746l7_1b52ffce-09df-44f4-8b49-23d1c579a285/init-textfile/0.log" Apr 28 20:08:47.796679 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:47.796602 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-kwqdl_ee133af6-93fa-4948-8dec-4878fa4c34b4/kube-rbac-proxy-main/0.log" Apr 28 20:08:47.823734 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:47.823710 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-kwqdl_ee133af6-93fa-4948-8dec-4878fa4c34b4/kube-rbac-proxy-self/0.log" Apr 28 20:08:47.857236 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:47.857048 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-kwqdl_ee133af6-93fa-4948-8dec-4878fa4c34b4/openshift-state-metrics/0.log" Apr 28 20:08:47.908681 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:47.908644 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_314de588-071e-4fe6-a2cc-62cae8f6e9a7/prometheus/0.log" Apr 28 20:08:47.929616 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:47.929585 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_314de588-071e-4fe6-a2cc-62cae8f6e9a7/config-reloader/0.log" Apr 28 20:08:47.954869 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:47.954842 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_314de588-071e-4fe6-a2cc-62cae8f6e9a7/thanos-sidecar/0.log" Apr 28 20:08:47.990449 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:47.990421 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_314de588-071e-4fe6-a2cc-62cae8f6e9a7/kube-rbac-proxy-web/0.log" Apr 28 20:08:48.020992 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:48.020959 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_314de588-071e-4fe6-a2cc-62cae8f6e9a7/kube-rbac-proxy/0.log" Apr 28 20:08:48.048375 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:48.048291 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_314de588-071e-4fe6-a2cc-62cae8f6e9a7/kube-rbac-proxy-thanos/0.log" Apr 28 20:08:48.073166 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:48.073137 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_314de588-071e-4fe6-a2cc-62cae8f6e9a7/init-config-reloader/0.log" Apr 28 20:08:48.111772 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:48.111745 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-kw5gw_872a84d5-7590-4826-b51a-d7ad968aa650/prometheus-operator/0.log" Apr 28 20:08:48.135043 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:48.135001 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-kw5gw_872a84d5-7590-4826-b51a-d7ad968aa650/kube-rbac-proxy/0.log"
Apr 28 20:08:48.166286 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:48.166249 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-v56c9_323061f0-0771-44a0-8824-36b3911ce376/prometheus-operator-admission-webhook/0.log"
Apr 28 20:08:48.310564 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:48.310537 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-545c89fc46-7w7rl_1dac9353-9a3d-43e4-9922-67b6fec37c8e/thanos-query/0.log"
Apr 28 20:08:48.340389 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:48.340361 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-545c89fc46-7w7rl_1dac9353-9a3d-43e4-9922-67b6fec37c8e/kube-rbac-proxy-web/0.log"
Apr 28 20:08:48.370034 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:48.370006 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-545c89fc46-7w7rl_1dac9353-9a3d-43e4-9922-67b6fec37c8e/kube-rbac-proxy/0.log"
Apr 28 20:08:48.399906 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:48.399885 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-545c89fc46-7w7rl_1dac9353-9a3d-43e4-9922-67b6fec37c8e/prom-label-proxy/0.log"
Apr 28 20:08:48.431852 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:48.431826 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-545c89fc46-7w7rl_1dac9353-9a3d-43e4-9922-67b6fec37c8e/kube-rbac-proxy-rules/0.log"
Apr 28 20:08:48.459827 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:48.459791 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-545c89fc46-7w7rl_1dac9353-9a3d-43e4-9922-67b6fec37c8e/kube-rbac-proxy-metrics/0.log"
Apr 28 20:08:50.351516 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:50.351485 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t6fg8/perf-node-gather-daemonset-f5fk7"]
Apr 28 20:08:50.351917 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:50.351784 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b79095bc-6ce1-4f08-9cdd-226b5fdf93de" containerName="copy"
Apr 28 20:08:50.351917 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:50.351795 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b79095bc-6ce1-4f08-9cdd-226b5fdf93de" containerName="copy"
Apr 28 20:08:50.351917 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:50.351814 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b79095bc-6ce1-4f08-9cdd-226b5fdf93de" containerName="gather"
Apr 28 20:08:50.351917 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:50.351819 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b79095bc-6ce1-4f08-9cdd-226b5fdf93de" containerName="gather"
Apr 28 20:08:50.351917 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:50.351860 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b79095bc-6ce1-4f08-9cdd-226b5fdf93de" containerName="gather"
Apr 28 20:08:50.351917 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:50.351869 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b79095bc-6ce1-4f08-9cdd-226b5fdf93de" containerName="copy"
Apr 28 20:08:50.355941 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:50.355917 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-f5fk7"
Apr 28 20:08:50.359293 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:50.359271 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-t6fg8\"/\"openshift-service-ca.crt\""
Apr 28 20:08:50.359405 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:50.359274 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-t6fg8\"/\"default-dockercfg-494nn\""
Apr 28 20:08:50.360219 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:50.360188 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-t6fg8\"/\"kube-root-ca.crt\""
Apr 28 20:08:50.365813 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:50.365793 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-t6fg8/perf-node-gather-daemonset-f5fk7"]
Apr 28 20:08:50.445705 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:50.445662 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7f9f3c69-8c26-4e01-bb6d-71586d1191f4-proc\") pod \"perf-node-gather-daemonset-f5fk7\" (UID: \"7f9f3c69-8c26-4e01-bb6d-71586d1191f4\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-f5fk7"
Apr 28 20:08:50.445886 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:50.445711 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7f9f3c69-8c26-4e01-bb6d-71586d1191f4-podres\") pod \"perf-node-gather-daemonset-f5fk7\" (UID: \"7f9f3c69-8c26-4e01-bb6d-71586d1191f4\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-f5fk7"
Apr 28 20:08:50.445886 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:50.445766 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7f9f3c69-8c26-4e01-bb6d-71586d1191f4-lib-modules\") pod \"perf-node-gather-daemonset-f5fk7\" (UID: \"7f9f3c69-8c26-4e01-bb6d-71586d1191f4\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-f5fk7"
Apr 28 20:08:50.445886 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:50.445804 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7f9f3c69-8c26-4e01-bb6d-71586d1191f4-sys\") pod \"perf-node-gather-daemonset-f5fk7\" (UID: \"7f9f3c69-8c26-4e01-bb6d-71586d1191f4\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-f5fk7"
Apr 28 20:08:50.445886 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:50.445873 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbblh\" (UniqueName: \"kubernetes.io/projected/7f9f3c69-8c26-4e01-bb6d-71586d1191f4-kube-api-access-bbblh\") pod \"perf-node-gather-daemonset-f5fk7\" (UID: \"7f9f3c69-8c26-4e01-bb6d-71586d1191f4\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-f5fk7"
Apr 28 20:08:50.547089 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:50.547049 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7f9f3c69-8c26-4e01-bb6d-71586d1191f4-podres\") pod \"perf-node-gather-daemonset-f5fk7\" (UID: \"7f9f3c69-8c26-4e01-bb6d-71586d1191f4\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-f5fk7"
Apr 28 20:08:50.547089 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:50.547094 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7f9f3c69-8c26-4e01-bb6d-71586d1191f4-lib-modules\") pod \"perf-node-gather-daemonset-f5fk7\" (UID: \"7f9f3c69-8c26-4e01-bb6d-71586d1191f4\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-f5fk7"
Apr 28 20:08:50.547285 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:50.547230 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7f9f3c69-8c26-4e01-bb6d-71586d1191f4-sys\") pod \"perf-node-gather-daemonset-f5fk7\" (UID: \"7f9f3c69-8c26-4e01-bb6d-71586d1191f4\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-f5fk7"
Apr 28 20:08:50.547285 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:50.547239 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7f9f3c69-8c26-4e01-bb6d-71586d1191f4-podres\") pod \"perf-node-gather-daemonset-f5fk7\" (UID: \"7f9f3c69-8c26-4e01-bb6d-71586d1191f4\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-f5fk7"
Apr 28 20:08:50.547285 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:50.547261 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7f9f3c69-8c26-4e01-bb6d-71586d1191f4-sys\") pod \"perf-node-gather-daemonset-f5fk7\" (UID: \"7f9f3c69-8c26-4e01-bb6d-71586d1191f4\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-f5fk7"
Apr 28 20:08:50.547285 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:50.547233 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7f9f3c69-8c26-4e01-bb6d-71586d1191f4-lib-modules\") pod \"perf-node-gather-daemonset-f5fk7\" (UID: \"7f9f3c69-8c26-4e01-bb6d-71586d1191f4\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-f5fk7"
Apr 28 20:08:50.547410 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:50.547273 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bbblh\" (UniqueName: \"kubernetes.io/projected/7f9f3c69-8c26-4e01-bb6d-71586d1191f4-kube-api-access-bbblh\") pod \"perf-node-gather-daemonset-f5fk7\" (UID: \"7f9f3c69-8c26-4e01-bb6d-71586d1191f4\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-f5fk7"
Apr 28 20:08:50.547410 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:50.547375 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7f9f3c69-8c26-4e01-bb6d-71586d1191f4-proc\") pod \"perf-node-gather-daemonset-f5fk7\" (UID: \"7f9f3c69-8c26-4e01-bb6d-71586d1191f4\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-f5fk7"
Apr 28 20:08:50.547471 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:50.547447 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7f9f3c69-8c26-4e01-bb6d-71586d1191f4-proc\") pod \"perf-node-gather-daemonset-f5fk7\" (UID: \"7f9f3c69-8c26-4e01-bb6d-71586d1191f4\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-f5fk7"
Apr 28 20:08:50.554964 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:50.554942 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbblh\" (UniqueName: \"kubernetes.io/projected/7f9f3c69-8c26-4e01-bb6d-71586d1191f4-kube-api-access-bbblh\") pod \"perf-node-gather-daemonset-f5fk7\" (UID: \"7f9f3c69-8c26-4e01-bb6d-71586d1191f4\") " pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-f5fk7"
Apr 28 20:08:50.665353 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:50.665303 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-f5fk7"
Apr 28 20:08:50.785291 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:50.785250 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-t6fg8/perf-node-gather-daemonset-f5fk7"]
Apr 28 20:08:50.788143 ip-10-0-140-230 kubenswrapper[2572]: W0428 20:08:50.788110 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7f9f3c69_8c26_4e01_bb6d_71586d1191f4.slice/crio-93043cc8362dd3348c51e0d74145a25deeeb02fcf0c9dc7a70012b3a01921729 WatchSource:0}: Error finding container 93043cc8362dd3348c51e0d74145a25deeeb02fcf0c9dc7a70012b3a01921729: Status 404 returned error can't find the container with id 93043cc8362dd3348c51e0d74145a25deeeb02fcf0c9dc7a70012b3a01921729
Apr 28 20:08:51.070681 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:51.070586 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-f5fk7" event={"ID":"7f9f3c69-8c26-4e01-bb6d-71586d1191f4","Type":"ContainerStarted","Data":"3cf547e1a38e380540945cff270bfe2980fb8d1ce1163c564a7983edbd6becb0"}
Apr 28 20:08:51.070681 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:51.070631 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-f5fk7" event={"ID":"7f9f3c69-8c26-4e01-bb6d-71586d1191f4","Type":"ContainerStarted","Data":"93043cc8362dd3348c51e0d74145a25deeeb02fcf0c9dc7a70012b3a01921729"}
Apr 28 20:08:51.070958 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:51.070849 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-f5fk7"
Apr 28 20:08:51.090715 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:51.090674 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-f5fk7" podStartSLOduration=1.090660724 podStartE2EDuration="1.090660724s" podCreationTimestamp="2026-04-28 20:08:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 20:08:51.089474295 +0000 UTC m=+3150.516974913" watchObservedRunningTime="2026-04-28 20:08:51.090660724 +0000 UTC m=+3150.518161341"
Apr 28 20:08:51.380466 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:51.380436 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7826r_f72b413b-cc77-44a9-9bab-db02d985886b/dns/0.log"
Apr 28 20:08:51.403454 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:51.403424 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7826r_f72b413b-cc77-44a9-9bab-db02d985886b/kube-rbac-proxy/0.log"
Apr 28 20:08:51.642090 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:51.642009 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9rm7p_eccda02a-118f-4d0d-858f-6d03050f92de/dns-node-resolver/0.log"
Apr 28 20:08:52.169818 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:52.169788 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4clws_f5dbdaee-eba8-485f-88bf-e781ddd5de4d/node-ca/0.log"
Apr 28 20:08:53.391510 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:53.391479 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-wl2q4_e361f8d4-ac75-4139-b8c7-fada2556f305/serve-healthcheck-canary/0.log"
Apr 28 20:08:53.929655 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:53.929599 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gtt2h_5b4cd600-56e7-44e5-a9fb-8d0f722d19ec/kube-rbac-proxy/0.log"
Apr 28 20:08:53.955569 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:53.955541 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gtt2h_5b4cd600-56e7-44e5-a9fb-8d0f722d19ec/exporter/0.log"
Apr 28 20:08:53.979264 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:53.979235 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gtt2h_5b4cd600-56e7-44e5-a9fb-8d0f722d19ec/extractor/0.log"
Apr 28 20:08:57.082932 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:08:57.082897 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-t6fg8/perf-node-gather-daemonset-f5fk7"
Apr 28 20:09:01.094120 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:09:01.094082 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-jnh4q_2fa50d25-bdbc-446f-a7b9-de3696866c07/kube-storage-version-migrator-operator/1.log"
Apr 28 20:09:01.095539 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:09:01.095508 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-jnh4q_2fa50d25-bdbc-446f-a7b9-de3696866c07/kube-storage-version-migrator-operator/0.log"
Apr 28 20:09:02.281798 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:09:02.281773 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sbv79_9136f542-e29a-475e-9ef4-a5653b964224/kube-multus-additional-cni-plugins/0.log"
Apr 28 20:09:02.306505 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:09:02.306471 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sbv79_9136f542-e29a-475e-9ef4-a5653b964224/egress-router-binary-copy/0.log"
Apr 28 20:09:02.332408 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:09:02.332378 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sbv79_9136f542-e29a-475e-9ef4-a5653b964224/cni-plugins/0.log"
Apr 28 20:09:02.358944 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:09:02.358914 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sbv79_9136f542-e29a-475e-9ef4-a5653b964224/bond-cni-plugin/0.log"
Apr 28 20:09:02.383432 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:09:02.383405 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sbv79_9136f542-e29a-475e-9ef4-a5653b964224/routeoverride-cni/0.log"
Apr 28 20:09:02.407419 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:09:02.407395 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sbv79_9136f542-e29a-475e-9ef4-a5653b964224/whereabouts-cni-bincopy/0.log"
Apr 28 20:09:02.436586 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:09:02.436561 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sbv79_9136f542-e29a-475e-9ef4-a5653b964224/whereabouts-cni/0.log"
Apr 28 20:09:02.651879 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:09:02.651839 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h56jw_375f0483-eb31-462b-859e-b59ffc509ba2/kube-multus/0.log"
Apr 28 20:09:02.807309 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:09:02.807268 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-ggtvc_0a2d2c94-3731-45d6-be85-14a1a081468a/network-metrics-daemon/0.log"
Apr 28 20:09:02.832343 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:09:02.832315 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-ggtvc_0a2d2c94-3731-45d6-be85-14a1a081468a/kube-rbac-proxy/0.log"
Apr 28 20:09:04.372733 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:09:04.372697 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wv7qh_4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11/ovn-controller/0.log"
Apr 28 20:09:04.424294 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:09:04.424259 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wv7qh_4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11/ovn-acl-logging/0.log"
Apr 28 20:09:04.450076 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:09:04.450043 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wv7qh_4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11/kube-rbac-proxy-node/0.log"
Apr 28 20:09:04.475586 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:09:04.475554 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wv7qh_4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11/kube-rbac-proxy-ovn-metrics/0.log"
Apr 28 20:09:04.495289 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:09:04.495259 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wv7qh_4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11/northd/0.log"
Apr 28 20:09:04.520510 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:09:04.520483 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wv7qh_4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11/nbdb/0.log"
Apr 28 20:09:04.545973 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:09:04.545949 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wv7qh_4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11/sbdb/0.log"
Apr 28 20:09:04.711126 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:09:04.711043 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wv7qh_4b96f8c8-4344-49c6-8d8e-fdd7b4cf9a11/ovnkube-controller/0.log"
Apr 28 20:09:05.730703 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:09:05.730673 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-t8qc5_d7e741f6-720c-42da-b861-0d9702bff94d/network-check-target-container/0.log"
Apr 28 20:09:06.696277 ip-10-0-140-230 kubenswrapper[2572]: I0428 20:09:06.696242 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-vlrjx_37f93754-2c70-4bb8-bd31-698cb86a7f0d/iptables-alerter/0.log"