Apr 22 15:08:16.344948 ip-10-0-141-246 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 15:08:16.799902 ip-10-0-141-246 kubenswrapper[2569]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 15:08:16.799902 ip-10-0-141-246 kubenswrapper[2569]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 15:08:16.799902 ip-10-0-141-246 kubenswrapper[2569]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 15:08:16.799902 ip-10-0-141-246 kubenswrapper[2569]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 15:08:16.799902 ip-10-0-141-246 kubenswrapper[2569]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
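The deprecation warnings above all point at the same remedy: move these values out of kubelet command-line flags and into the KubeletConfiguration file passed via --config. A minimal sketch of what that migration looks like is below; the field values are illustrative only, not taken from this node, and on an OpenShift node the rendered /etc/kubernetes/kubelet.conf is managed by the machine config operator, so changes would normally go through a KubeletConfig custom resource rather than direct edits to the file.

```yaml
# Hypothetical KubeletConfiguration fragment; values are examples only.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# Replaces --container-runtime-endpoint (settable in the config file since Kubernetes 1.27).
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"
# Replaces --system-reserved.
systemReserved:
  cpu: "500m"
  memory: "1Gi"
# Per the warning, --minimum-container-ttl-duration should give way to eviction settings.
evictionHard:
  memory.available: "100Mi"
```
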
Apr 22 15:08:16.801448 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.801360 2569 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 15:08:16.805151 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805137 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 15:08:16.805151 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805151 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 15:08:16.805212 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805155 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 15:08:16.805212 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805159 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 15:08:16.805212 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805162 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 15:08:16.805212 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805165 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 15:08:16.805212 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805168 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 15:08:16.805212 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805171 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 15:08:16.805212 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805174 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 15:08:16.805212 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805177 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 15:08:16.805212 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805180 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 15:08:16.805212 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805188 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 15:08:16.805212 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805191 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 15:08:16.805212 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805195 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 15:08:16.805212 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805197 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 15:08:16.805212 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805200 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 15:08:16.805212 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805203 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 15:08:16.805212 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805205 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 15:08:16.805212 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805208 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 15:08:16.805212 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805212 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 15:08:16.805212 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805216 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 15:08:16.805760 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805219 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 15:08:16.805760 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805222 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 15:08:16.805760 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805225 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 15:08:16.805760 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805229 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 15:08:16.805760 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805233 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 15:08:16.805760 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805236 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 15:08:16.805760 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805240 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 15:08:16.805760 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805242 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 15:08:16.805760 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805245 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 15:08:16.805760 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805247 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 15:08:16.805760 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805250 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 15:08:16.805760 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805253 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 15:08:16.805760 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805255 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 15:08:16.805760 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805258 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 15:08:16.805760 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805261 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 15:08:16.805760 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805263 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 15:08:16.805760 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805266 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 15:08:16.805760 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805268 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 15:08:16.805760 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805270 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 15:08:16.806225 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805273 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 15:08:16.806225 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805275 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 15:08:16.806225 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805278 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 15:08:16.806225 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805280 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 15:08:16.806225 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805283 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 15:08:16.806225 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805285 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 15:08:16.806225 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805289 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 15:08:16.806225 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805291 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 15:08:16.806225 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805293 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 15:08:16.806225 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805296 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 15:08:16.806225 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805298 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 15:08:16.806225 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805302 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 15:08:16.806225 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805304 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 15:08:16.806225 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805307 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 15:08:16.806225 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805310 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 15:08:16.806225 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805313 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 15:08:16.806225 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805316 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 15:08:16.806225 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805319 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 15:08:16.806225 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805321 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 15:08:16.806225 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805324 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 15:08:16.806719 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805326 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 15:08:16.806719 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805329 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 15:08:16.806719 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805331 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 15:08:16.806719 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805334 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 15:08:16.806719 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805336 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 15:08:16.806719 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805339 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 15:08:16.806719 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805341 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 15:08:16.806719 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805344 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 15:08:16.806719 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805347 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 15:08:16.806719 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805349 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 15:08:16.806719 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805353 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 15:08:16.806719 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805355 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 15:08:16.806719 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805358 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 15:08:16.806719 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805361 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 15:08:16.806719 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805363 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 15:08:16.806719 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805366 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 15:08:16.806719 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805368 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 15:08:16.806719 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805371 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 15:08:16.806719 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805374 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 15:08:16.806719 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805377 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 15:08:16.807205 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805379 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 15:08:16.807205 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805382 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 15:08:16.807205 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805384 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 15:08:16.807205 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805387 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 15:08:16.807205 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805389 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 15:08:16.807205 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.805392 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 15:08:16.807205 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806324 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 15:08:16.807205 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806331 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 15:08:16.807205 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806335 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 15:08:16.807205 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806338 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 15:08:16.807205 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806341 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 15:08:16.807205 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806344 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 15:08:16.807205 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806347 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 15:08:16.807205 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806351 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 15:08:16.807205 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806355 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 15:08:16.807205 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806358 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 15:08:16.807205 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806361 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 15:08:16.807205 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806363 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 15:08:16.807205 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806366 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 15:08:16.807653 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806369 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 15:08:16.807653 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806371 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 22 15:08:16.807653 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806374 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 15:08:16.807653 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806376 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 15:08:16.807653 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806379 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 15:08:16.807653 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806381 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 15:08:16.807653 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806384 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 15:08:16.807653 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806387 2569 feature_gate.go:328] unrecognized feature gate: 
AzureDedicatedHosts Apr 22 15:08:16.807653 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806389 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 15:08:16.807653 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806392 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 15:08:16.807653 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806395 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 15:08:16.807653 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806398 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 15:08:16.807653 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806400 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 15:08:16.807653 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806403 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 15:08:16.807653 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806406 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 15:08:16.807653 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806408 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 15:08:16.807653 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806411 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 15:08:16.807653 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806413 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 15:08:16.807653 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806416 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 15:08:16.807653 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806420 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 15:08:16.808194 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806423 2569 
feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 15:08:16.808194 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806425 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 15:08:16.808194 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806428 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 15:08:16.808194 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806431 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 15:08:16.808194 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806433 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 15:08:16.808194 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806436 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 15:08:16.808194 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806438 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 15:08:16.808194 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806441 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 15:08:16.808194 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806443 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 15:08:16.808194 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806446 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 15:08:16.808194 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806448 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 15:08:16.808194 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806451 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 15:08:16.808194 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806454 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 
15:08:16.808194 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806457 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 15:08:16.808194 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806461 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 15:08:16.808194 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806464 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 15:08:16.808194 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806467 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 15:08:16.808194 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806469 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 15:08:16.808194 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806472 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 15:08:16.808663 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806474 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 15:08:16.808663 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806477 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 15:08:16.808663 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806479 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 15:08:16.808663 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806482 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 15:08:16.808663 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806485 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 15:08:16.808663 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806487 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 15:08:16.808663 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806490 2569 
feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 15:08:16.808663 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806493 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 15:08:16.808663 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806495 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 15:08:16.808663 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806498 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 15:08:16.808663 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806501 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 15:08:16.808663 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806503 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 15:08:16.808663 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806507 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 15:08:16.808663 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806510 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 15:08:16.808663 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806512 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 15:08:16.808663 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806515 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 15:08:16.808663 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806517 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 15:08:16.808663 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806520 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 15:08:16.808663 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806523 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 
15:08:16.808663 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806526 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 15:08:16.809174 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806528 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 15:08:16.809174 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806530 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 15:08:16.809174 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806533 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 15:08:16.809174 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806536 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 15:08:16.809174 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806538 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 15:08:16.809174 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806540 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 15:08:16.809174 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806543 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 15:08:16.809174 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806545 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 15:08:16.809174 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806548 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 15:08:16.809174 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806550 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 15:08:16.809174 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806553 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 15:08:16.809174 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806555 2569 
feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 15:08:16.809174 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806558 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 15:08:16.809174 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.806560 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 15:08:16.809174 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807768 2569 flags.go:64] FLAG: --address="0.0.0.0" Apr 22 15:08:16.809174 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807779 2569 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 22 15:08:16.809174 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807787 2569 flags.go:64] FLAG: --anonymous-auth="true" Apr 22 15:08:16.809174 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807794 2569 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 22 15:08:16.809174 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807798 2569 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 22 15:08:16.809174 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807802 2569 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 22 15:08:16.809174 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807806 2569 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 22 15:08:16.809698 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807812 2569 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 22 15:08:16.809698 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807815 2569 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 22 15:08:16.809698 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807818 2569 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 22 15:08:16.809698 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807822 2569 flags.go:64] FLAG: 
--bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 22 15:08:16.809698 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807825 2569 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 22 15:08:16.809698 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807828 2569 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 22 15:08:16.809698 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807831 2569 flags.go:64] FLAG: --cgroup-root="" Apr 22 15:08:16.809698 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807834 2569 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 22 15:08:16.809698 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807837 2569 flags.go:64] FLAG: --client-ca-file="" Apr 22 15:08:16.809698 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807840 2569 flags.go:64] FLAG: --cloud-config="" Apr 22 15:08:16.809698 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807843 2569 flags.go:64] FLAG: --cloud-provider="external" Apr 22 15:08:16.809698 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807846 2569 flags.go:64] FLAG: --cluster-dns="[]" Apr 22 15:08:16.809698 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807850 2569 flags.go:64] FLAG: --cluster-domain="" Apr 22 15:08:16.809698 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807853 2569 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 22 15:08:16.809698 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807856 2569 flags.go:64] FLAG: --config-dir="" Apr 22 15:08:16.809698 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807859 2569 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 22 15:08:16.809698 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807862 2569 flags.go:64] FLAG: --container-log-max-files="5" Apr 22 15:08:16.809698 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807866 2569 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 22 15:08:16.809698 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807870 
2569 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 15:08:16.809698 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807873 2569 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 15:08:16.809698 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807876 2569 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 15:08:16.809698 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807880 2569 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 15:08:16.809698 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807883 2569 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 15:08:16.809698 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807885 2569 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 15:08:16.810279 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807889 2569 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 15:08:16.810279 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807892 2569 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 15:08:16.810279 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807896 2569 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 15:08:16.810279 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807900 2569 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 15:08:16.810279 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807902 2569 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 15:08:16.810279 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807909 2569 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 15:08:16.810279 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807912 2569 flags.go:64] FLAG: --enable-server="true"
Apr 22 15:08:16.810279 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807915 2569 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 15:08:16.810279 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807919 2569 flags.go:64] FLAG:
--event-burst="100"
Apr 22 15:08:16.810279 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807922 2569 flags.go:64] FLAG: --event-qps="50"
Apr 22 15:08:16.810279 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807926 2569 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 15:08:16.810279 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807929 2569 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 15:08:16.810279 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807932 2569 flags.go:64] FLAG: --eviction-hard=""
Apr 22 15:08:16.810279 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807936 2569 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 15:08:16.810279 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807939 2569 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 15:08:16.810279 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807942 2569 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 15:08:16.810279 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807945 2569 flags.go:64] FLAG: --eviction-soft=""
Apr 22 15:08:16.810279 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807948 2569 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 15:08:16.810279 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807951 2569 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 15:08:16.810279 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807954 2569 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 15:08:16.810279 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807957 2569 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 15:08:16.810279 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807960 2569 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 15:08:16.810279 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807963 2569 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 15:08:16.810279 ip-10-0-141-246
kubenswrapper[2569]: I0422 15:08:16.807965 2569 flags.go:64] FLAG: --feature-gates=""
Apr 22 15:08:16.810279 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807969 2569 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 15:08:16.810883 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807972 2569 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 15:08:16.810883 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807976 2569 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 22 15:08:16.810883 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807979 2569 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 22 15:08:16.810883 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807982 2569 flags.go:64] FLAG: --healthz-port="10248"
Apr 22 15:08:16.810883 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807985 2569 flags.go:64] FLAG: --help="false"
Apr 22 15:08:16.810883 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807988 2569 flags.go:64] FLAG: --hostname-override="ip-10-0-141-246.ec2.internal"
Apr 22 15:08:16.810883 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807990 2569 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 22 15:08:16.810883 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807993 2569 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 22 15:08:16.810883 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.807997 2569 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 22 15:08:16.810883 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808001 2569 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 22 15:08:16.810883 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808004 2569 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 22 15:08:16.810883 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808007 2569 flags.go:64] FLAG:
--image-gc-low-threshold="80"
Apr 22 15:08:16.810883 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808010 2569 flags.go:64] FLAG: --image-service-endpoint=""
Apr 22 15:08:16.810883 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808013 2569 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 22 15:08:16.810883 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808016 2569 flags.go:64] FLAG: --kube-api-burst="100"
Apr 22 15:08:16.810883 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808019 2569 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 22 15:08:16.810883 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808022 2569 flags.go:64] FLAG: --kube-api-qps="50"
Apr 22 15:08:16.810883 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808025 2569 flags.go:64] FLAG: --kube-reserved=""
Apr 22 15:08:16.810883 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808029 2569 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 22 15:08:16.810883 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808032 2569 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 22 15:08:16.810883 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808035 2569 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 22 15:08:16.810883 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808038 2569 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 22 15:08:16.810883 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808041 2569 flags.go:64] FLAG: --lock-file=""
Apr 22 15:08:16.810883 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808044 2569 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 22 15:08:16.811469 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808047 2569 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 22 15:08:16.811469 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808050 2569 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 22 15:08:16.811469 ip-10-0-141-246 kubenswrapper[2569]:
I0422 15:08:16.808055 2569 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 22 15:08:16.811469 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808058 2569 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 22 15:08:16.811469 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808061 2569 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 22 15:08:16.811469 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808064 2569 flags.go:64] FLAG: --logging-format="text"
Apr 22 15:08:16.811469 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808067 2569 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 22 15:08:16.811469 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808070 2569 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 22 15:08:16.811469 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808073 2569 flags.go:64] FLAG: --manifest-url=""
Apr 22 15:08:16.811469 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808076 2569 flags.go:64] FLAG: --manifest-url-header=""
Apr 22 15:08:16.811469 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808080 2569 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 22 15:08:16.811469 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808083 2569 flags.go:64] FLAG: --max-open-files="1000000"
Apr 22 15:08:16.811469 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808087 2569 flags.go:64] FLAG: --max-pods="110"
Apr 22 15:08:16.811469 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808090 2569 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 22 15:08:16.811469 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808093 2569 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 22 15:08:16.811469 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808096 2569 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 22 15:08:16.811469 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808101 2569 flags.go:64] FLAG:
--minimum-container-ttl-duration="6m0s"
Apr 22 15:08:16.811469 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808104 2569 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 22 15:08:16.811469 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808107 2569 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 22 15:08:16.811469 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808110 2569 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 22 15:08:16.811469 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808117 2569 flags.go:64] FLAG: --node-status-max-images="50"
Apr 22 15:08:16.811469 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808121 2569 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 22 15:08:16.811469 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808124 2569 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 22 15:08:16.811469 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808127 2569 flags.go:64] FLAG: --pod-cidr=""
Apr 22 15:08:16.812058 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808131 2569 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 22 15:08:16.812058 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808137 2569 flags.go:64] FLAG: --pod-manifest-path=""
Apr 22 15:08:16.812058 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808140 2569 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 22 15:08:16.812058 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808143 2569 flags.go:64] FLAG: --pods-per-core="0"
Apr 22 15:08:16.812058 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808146 2569 flags.go:64] FLAG: --port="10250"
Apr 22 15:08:16.812058 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808149 2569 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 22 15:08:16.812058 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808152
2569 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0ee13e0e2e8f84a3f"
Apr 22 15:08:16.812058 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808155 2569 flags.go:64] FLAG: --qos-reserved=""
Apr 22 15:08:16.812058 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808158 2569 flags.go:64] FLAG: --read-only-port="10255"
Apr 22 15:08:16.812058 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808161 2569 flags.go:64] FLAG: --register-node="true"
Apr 22 15:08:16.812058 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808164 2569 flags.go:64] FLAG: --register-schedulable="true"
Apr 22 15:08:16.812058 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808167 2569 flags.go:64] FLAG: --register-with-taints=""
Apr 22 15:08:16.812058 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808176 2569 flags.go:64] FLAG: --registry-burst="10"
Apr 22 15:08:16.812058 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808179 2569 flags.go:64] FLAG: --registry-qps="5"
Apr 22 15:08:16.812058 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808181 2569 flags.go:64] FLAG: --reserved-cpus=""
Apr 22 15:08:16.812058 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808184 2569 flags.go:64] FLAG: --reserved-memory=""
Apr 22 15:08:16.812058 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808188 2569 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 22 15:08:16.812058 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808192 2569 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 22 15:08:16.812058 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808195 2569 flags.go:64] FLAG: --rotate-certificates="false"
Apr 22 15:08:16.812058 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808198 2569 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 22 15:08:16.812058 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808201 2569 flags.go:64] FLAG: --runonce="false"
Apr 22 15:08:16.812058 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808204 2569
flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 22 15:08:16.812058 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808207 2569 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 22 15:08:16.812058 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808210 2569 flags.go:64] FLAG: --seccomp-default="false"
Apr 22 15:08:16.812058 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808213 2569 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 22 15:08:16.812639 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808217 2569 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 22 15:08:16.812639 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808220 2569 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 22 15:08:16.812639 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808224 2569 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 22 15:08:16.812639 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808227 2569 flags.go:64] FLAG: --storage-driver-password="root"
Apr 22 15:08:16.812639 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808230 2569 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 22 15:08:16.812639 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808233 2569 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 22 15:08:16.812639 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808236 2569 flags.go:64] FLAG: --storage-driver-user="root"
Apr 22 15:08:16.812639 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808239 2569 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 22 15:08:16.812639 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808243 2569 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 22 15:08:16.812639 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808245 2569 flags.go:64] FLAG: --system-cgroups=""
Apr 22 15:08:16.812639 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808248 2569 flags.go:64] FLAG:
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 22 15:08:16.812639 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808253 2569 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 22 15:08:16.812639 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808256 2569 flags.go:64] FLAG: --tls-cert-file=""
Apr 22 15:08:16.812639 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808259 2569 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 22 15:08:16.812639 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808263 2569 flags.go:64] FLAG: --tls-min-version=""
Apr 22 15:08:16.812639 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808265 2569 flags.go:64] FLAG: --tls-private-key-file=""
Apr 22 15:08:16.812639 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808268 2569 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 22 15:08:16.812639 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808271 2569 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 22 15:08:16.812639 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808274 2569 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 22 15:08:16.812639 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808277 2569 flags.go:64] FLAG: --v="2"
Apr 22 15:08:16.812639 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808282 2569 flags.go:64] FLAG: --version="false"
Apr 22 15:08:16.812639 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808286 2569 flags.go:64] FLAG: --vmodule=""
Apr 22 15:08:16.812639 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808290 2569 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 22 15:08:16.812639 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.808293 2569 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 22 15:08:16.812639 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808392 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 15:08:16.813264
ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808396 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 15:08:16.813264 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808399 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 15:08:16.813264 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808402 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 15:08:16.813264 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808405 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 15:08:16.813264 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808407 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 15:08:16.813264 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808410 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 15:08:16.813264 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808412 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 15:08:16.813264 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808417 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 15:08:16.813264 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808420 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 15:08:16.813264 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808423 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 15:08:16.813264 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808425 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 15:08:16.813264 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808428 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 15:08:16.813264 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808430 2569 feature_gate.go:328] unrecognized feature gate:
OVNObservability
Apr 22 15:08:16.813264 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808433 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 15:08:16.813264 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808436 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 15:08:16.813264 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808438 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 15:08:16.813264 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808441 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 15:08:16.813264 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808444 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 15:08:16.813264 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808446 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 15:08:16.813264 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808449 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 15:08:16.813769 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808451 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 15:08:16.813769 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808454 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 15:08:16.813769 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808456 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 15:08:16.813769 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808459 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 15:08:16.813769 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808461 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 15:08:16.813769 ip-10-0-141-246 kubenswrapper[2569]: W0422
15:08:16.808464 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 15:08:16.813769 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808466 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 15:08:16.813769 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808469 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 15:08:16.813769 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808472 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 15:08:16.813769 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808474 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 15:08:16.813769 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808477 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 15:08:16.813769 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808479 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 15:08:16.813769 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808499 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 15:08:16.813769 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808503 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 15:08:16.813769 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808506 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 15:08:16.813769 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808509 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 15:08:16.813769 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808512 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 15:08:16.813769 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808515 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22
15:08:16.813769 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808518 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 15:08:16.813769 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808522 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 15:08:16.814269 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808525 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 15:08:16.814269 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808527 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 15:08:16.814269 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808530 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 15:08:16.814269 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808532 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 15:08:16.814269 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808535 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 15:08:16.814269 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808538 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 15:08:16.814269 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808540 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 15:08:16.814269 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808543 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 15:08:16.814269 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808546 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 15:08:16.814269 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808548 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 15:08:16.814269 ip-10-0-141-246 kubenswrapper[2569]:
W0422 15:08:16.808551 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 15:08:16.814269 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808554 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 15:08:16.814269 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808558 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 15:08:16.814269 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808561 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 15:08:16.814269 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808564 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 15:08:16.814269 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808567 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 15:08:16.814269 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808569 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 15:08:16.814269 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808572 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 15:08:16.814269 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808575 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 15:08:16.814859 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808577 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 15:08:16.814859 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808580 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 15:08:16.814859 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808582 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 15:08:16.814859 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808585 2569 feature_gate.go:328]
unrecognized feature gate: BootImageSkewEnforcement
Apr 22 15:08:16.814859 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808587 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 15:08:16.814859 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808590 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 15:08:16.814859 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808592 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 15:08:16.814859 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808595 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 15:08:16.814859 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808597 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 15:08:16.814859 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808599 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 15:08:16.814859 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808602 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 15:08:16.814859 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808605 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 15:08:16.814859 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808608 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 15:08:16.814859 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808611 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 15:08:16.814859 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808614 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 15:08:16.814859 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808616 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 15:08:16.814859 ip-10-0-141-246 kubenswrapper[2569]: W0422
15:08:16.808619 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 15:08:16.814859 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808622 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 15:08:16.814859 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808624 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 15:08:16.814859 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808627 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 15:08:16.815660 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808629 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 15:08:16.815660 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808632 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 15:08:16.815660 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808635 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 15:08:16.815660 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808639 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
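The long run of `feature_gate.go:328` warnings above is the kubelet rejecting OpenShift-specific gate names that the upstream feature-gate parser does not know. When triaging such a log, a first step is usually to collect the distinct gate names. A minimal sketch of that (hypothetical helper, not part of kubelet or journalctl; `sample` is abridged from the entries above):

```python
import re

def unrecognized_gates(log_text):
    """Collect the unique gate names from 'unrecognized feature gate: <Name>' warnings."""
    return sorted(set(re.findall(r"unrecognized feature gate: (\S+)", log_text)))

# Abridged excerpt from the journal above.
sample = (
    "W0422 15:08:16.808627 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS "
    "W0422 15:08:16.808629 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets"
)
print(unrecognized_gates(sample))  # ['GCPClusterHostedDNS', 'NutanixMultiSubnets']
```

Deduplicating matters here because, as the log shows, the same gate names are warned about again on the second parsing pass.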
Apr 22 15:08:16.815660 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808642 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 15:08:16.815660 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.808645 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 15:08:16.815660 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.809367 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 15:08:16.817217 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.817094 2569 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 15:08:16.817217 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.817217 2569 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 15:08:16.817380 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817296 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 15:08:16.817380 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817304 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 15:08:16.817380 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817311 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
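The `feature_gate.go:384` entry above prints the effective gate set in Go's `map[Name:bool ...]` notation. A minimal sketch of turning that summary line into a name-to-bool mapping for comparison against expected cluster configuration (hypothetical helper; the `line` below is an abridged version of the entry above):

```python
import re

def parse_feature_gate_map(line):
    """Parse kubelet's 'feature gates: {map[Name:bool ...]}' summary into a dict."""
    m = re.search(r"feature gates: \{map\[(.*)\]\}", line)
    if not m:
        return {}
    gates = {}
    for pair in m.group(1).split():
        name, _, value = pair.partition(":")
        gates[name] = value == "true"
    return gates

# Abridged version of the feature_gate.go:384 entry above.
line = ("I0422 15:08:16.809367 2569 feature_gate.go:384] "
        "feature gates: {map[ImageVolume:true KMSv1:true NodeSwap:false]}")
print(parse_feature_gate_map(line))  # {'ImageVolume': True, 'KMSv1': True, 'NodeSwap': False}
```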
Apr 22 15:08:16.817380 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817318 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 15:08:16.817380 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817323 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 15:08:16.817380 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817327 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 15:08:16.817380 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817331 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 15:08:16.817380 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817335 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 15:08:16.817380 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817341 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 15:08:16.817380 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817347 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 15:08:16.817380 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817352 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 15:08:16.817380 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817356 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 15:08:16.817380 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817360 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 15:08:16.817380 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817365 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 15:08:16.817380 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817369 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 15:08:16.817380 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817374 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 15:08:16.817380 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817377 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 15:08:16.817380 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817381 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 15:08:16.817380 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817386 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 15:08:16.818235 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817390 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 15:08:16.818235 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817394 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 15:08:16.818235 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817398 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 15:08:16.818235 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817402 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 15:08:16.818235 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817407 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 15:08:16.818235 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817411 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 15:08:16.818235 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817415 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 15:08:16.818235 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817420 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 15:08:16.818235 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817424 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 15:08:16.818235 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817429 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 15:08:16.818235 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817434 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 15:08:16.818235 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817438 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 15:08:16.818235 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817442 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 15:08:16.818235 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817447 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 15:08:16.818235 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817452 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 15:08:16.818235 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817456 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 15:08:16.818235 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817460 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 15:08:16.818235 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817464 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 15:08:16.818235 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817468 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 15:08:16.818235 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817472 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 15:08:16.818950 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817476 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 15:08:16.818950 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817480 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 15:08:16.818950 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817484 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 15:08:16.818950 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817488 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 15:08:16.818950 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817492 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 15:08:16.818950 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817496 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 15:08:16.818950 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817500 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 15:08:16.818950 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817504 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 15:08:16.818950 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817508 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 15:08:16.818950 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817512 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 15:08:16.818950 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817516 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 15:08:16.818950 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817520 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 15:08:16.818950 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817525 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 15:08:16.818950 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817530 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 15:08:16.818950 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817535 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 15:08:16.818950 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817539 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 15:08:16.818950 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817544 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 15:08:16.818950 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817548 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 15:08:16.818950 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817552 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 15:08:16.818950 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817556 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 15:08:16.819513 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817561 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 15:08:16.819513 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817565 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 15:08:16.819513 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817569 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 15:08:16.819513 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817573 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 15:08:16.819513 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817577 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 15:08:16.819513 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817581 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 15:08:16.819513 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817586 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 15:08:16.819513 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817591 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 15:08:16.819513 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817595 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 15:08:16.819513 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817599 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 15:08:16.819513 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817603 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 15:08:16.819513 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817607 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 15:08:16.819513 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817611 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 15:08:16.819513 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817616 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 15:08:16.819513 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817620 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 15:08:16.819513 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817623 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 15:08:16.819513 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817627 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 15:08:16.819513 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817633 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 15:08:16.819513 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817637 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 15:08:16.819513 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817642 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 15:08:16.820205 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817646 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 15:08:16.820205 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817650 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 15:08:16.820205 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817654 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 15:08:16.820205 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817658 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 15:08:16.820205 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817662 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 15:08:16.820205 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817667 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 15:08:16.820205 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817671 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 15:08:16.820205 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.817698 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 15:08:16.820205 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817858 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 15:08:16.820205 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817868 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 15:08:16.820205 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817873 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 15:08:16.820205 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817879 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 15:08:16.820205 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817884 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 15:08:16.820205 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817889 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 15:08:16.820205 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817893 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 15:08:16.820205 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817898 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 15:08:16.820905 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817902 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 15:08:16.820905 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817906 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 15:08:16.820905 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817911 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 15:08:16.820905 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817915 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 15:08:16.820905 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817919 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 15:08:16.820905 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817923 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 15:08:16.820905 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817928 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 15:08:16.820905 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817932 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 15:08:16.820905 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817936 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 15:08:16.820905 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817940 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 15:08:16.820905 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817944 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 15:08:16.820905 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817949 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 15:08:16.820905 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817953 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 15:08:16.820905 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817957 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 15:08:16.820905 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817963 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 15:08:16.820905 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817967 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 15:08:16.820905 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817971 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 15:08:16.820905 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817975 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 15:08:16.820905 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817979 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 15:08:16.820905 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817983 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 15:08:16.821638 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817988 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 15:08:16.821638 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817992 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 15:08:16.821638 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.817996 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 15:08:16.821638 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818001 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 15:08:16.821638 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818005 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 15:08:16.821638 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818009 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 15:08:16.821638 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818013 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 15:08:16.821638 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818017 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 15:08:16.821638 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818022 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 15:08:16.821638 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818026 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 15:08:16.821638 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818030 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 15:08:16.821638 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818034 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 15:08:16.821638 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818038 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 15:08:16.821638 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818042 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 15:08:16.821638 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818046 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 15:08:16.821638 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818050 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 15:08:16.821638 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818054 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 15:08:16.821638 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818058 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 15:08:16.821638 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818062 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 15:08:16.821638 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818066 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 15:08:16.822275 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818071 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 15:08:16.822275 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818075 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 15:08:16.822275 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818112 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 15:08:16.822275 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818121 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 15:08:16.822275 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818127 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 15:08:16.822275 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818132 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 15:08:16.822275 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818137 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 15:08:16.822275 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818143 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 15:08:16.822275 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818147 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 15:08:16.822275 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818152 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 15:08:16.822275 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818156 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 15:08:16.822275 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818160 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 15:08:16.822275 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818164 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 15:08:16.822275 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818169 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 15:08:16.822275 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818173 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 15:08:16.822275 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818178 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 15:08:16.822275 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818182 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 15:08:16.822275 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818186 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 15:08:16.822275 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818191 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 15:08:16.822770 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818195 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 15:08:16.822770 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818200 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 15:08:16.822770 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818204 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 15:08:16.822770 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818208 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 15:08:16.822770 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818212 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 15:08:16.822770 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818216 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 15:08:16.822770 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818221 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 15:08:16.822770 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818225 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 15:08:16.822770 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818229 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 15:08:16.822770 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818233 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 15:08:16.822770 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818236 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 15:08:16.822770 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818241 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 15:08:16.822770 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818245 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 15:08:16.822770 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818249 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 15:08:16.822770 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818253 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 15:08:16.822770 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818257 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 15:08:16.822770 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818263 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 15:08:16.822770 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818268 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 15:08:16.822770 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:16.818272 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 15:08:16.823284 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.818280 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 15:08:16.823284 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.819119 2569 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 15:08:16.823284 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.822041 2569 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 15:08:16.823284 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.823056 2569 server.go:1019] "Starting client certificate rotation"
Apr 22 15:08:16.823284 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.823155 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 15:08:16.823284 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.823192 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 15:08:16.848789 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.848771 2569 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 15:08:16.852884 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.852868 2569 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 15:08:16.869960 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.869940 2569 log.go:25] "Validated CRI v1 runtime API"
Apr 22 15:08:16.876881 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.876863 2569 log.go:25] "Validated CRI v1 image API"
Apr 22 15:08:16.878206 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.878182 2569 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 15:08:16.882210 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.882191 2569 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 a80c6552-c592-4a81-9919-fb939f01cf4b:/dev/nvme0n1p4 b54c7e02-ad7c-4257-af1e-48e63ffd0ce9:/dev/nvme0n1p3]
Apr 22 15:08:16.882280 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.882210 2569 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 15:08:16.885933 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.885916 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 15:08:16.887725 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.887607 2569 manager.go:217] Machine: {Timestamp:2026-04-22 15:08:16.885826642 +0000 UTC m=+0.422998346 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3104532 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec238dc2f9acd0823b1ca6813865b7d7 SystemUUID:ec238dc2-f9ac-d082-3b1c-a6813865b7d7 BootID:284f66d2-d0f3-413d-af6e-aa163f624a84 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:8b:dd:0e:2b:f3 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:8b:dd:0e:2b:f3 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:46:f0:db:1f:c6:88 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 15:08:16.887725 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.887718 2569 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 15:08:16.887829 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.887797 2569 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 15:08:16.888759 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.888735 2569 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 15:08:16.888916 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.888777 2569 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-141-246.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 15:08:16.888967 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.888926 2569 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 15:08:16.888967 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.888935 2569 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 15:08:16.888967 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.888948
2569 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 15:08:16.889588 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.889578 2569 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 15:08:16.891147 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.891136 2569 state_mem.go:36] "Initialized new in-memory state store" Apr 22 15:08:16.891259 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.891250 2569 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 15:08:16.893864 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.893855 2569 kubelet.go:491] "Attempting to sync node with API server" Apr 22 15:08:16.893906 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.893867 2569 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 15:08:16.893906 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.893879 2569 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 15:08:16.893906 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.893888 2569 kubelet.go:397] "Adding apiserver pod source" Apr 22 15:08:16.893906 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.893896 2569 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 15:08:16.895082 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.895070 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 15:08:16.895129 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.895088 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 15:08:16.897746 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.897728 2569 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 15:08:16.899273 ip-10-0-141-246 
kubenswrapper[2569]: I0422 15:08:16.899260 2569 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 15:08:16.901319 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.901300 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 15:08:16.901319 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.901317 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 15:08:16.901319 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.901323 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 15:08:16.901489 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.901328 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 15:08:16.901489 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.901334 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 15:08:16.901489 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.901341 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 15:08:16.901489 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.901348 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 15:08:16.901489 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.901353 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 15:08:16.901489 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.901359 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 15:08:16.901489 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.901365 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 15:08:16.901489 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.901382 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 
15:08:16.901489 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.901391 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 15:08:16.902237 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.902227 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 15:08:16.902237 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.902238 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 15:08:16.905748 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.905732 2569 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 15:08:16.905825 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.905767 2569 server.go:1295] "Started kubelet" Apr 22 15:08:16.905879 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.905829 2569 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 15:08:16.905936 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.905894 2569 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 15:08:16.905995 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.905950 2569 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 15:08:16.906514 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.906466 2569 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-141-246.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 15:08:16.906555 ip-10-0-141-246 systemd[1]: Started Kubernetes Kubelet. 
Apr 22 15:08:16.907176 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:16.906957 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-141-246.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 15:08:16.907176 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:16.907095 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 15:08:16.907468 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.907336 2569 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 15:08:16.909058 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.909045 2569 server.go:317] "Adding debug handlers to kubelet server" Apr 22 15:08:16.912432 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:16.911572 2569 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-246.ec2.internal.18a8b64d7bf05123 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-246.ec2.internal,UID:ip-10-0-141-246.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-141-246.ec2.internal,},FirstTimestamp:2026-04-22 15:08:16.905744675 +0000 UTC m=+0.442916380,LastTimestamp:2026-04-22 15:08:16.905744675 +0000 UTC m=+0.442916380,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 
+0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-246.ec2.internal,}" Apr 22 15:08:16.913880 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.913858 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 15:08:16.914409 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.914393 2569 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 15:08:16.915182 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.915164 2569 factory.go:55] Registering systemd factory Apr 22 15:08:16.915276 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.915227 2569 factory.go:223] Registration of the systemd container factory successfully Apr 22 15:08:16.915405 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:16.915384 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-246.ec2.internal\" not found" Apr 22 15:08:16.915541 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.915523 2569 factory.go:153] Registering CRI-O factory Apr 22 15:08:16.915610 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.915545 2569 factory.go:223] Registration of the crio container factory successfully Apr 22 15:08:16.915610 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.915494 2569 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 15:08:16.915723 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.915616 2569 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 15:08:16.915723 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.915471 2569 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 15:08:16.915723 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.915593 2569 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix 
/run/containerd/containerd.sock: connect: no such file or directory Apr 22 15:08:16.915723 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.915723 2569 factory.go:103] Registering Raw factory Apr 22 15:08:16.915924 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.915731 2569 reconstruct.go:97] "Volume reconstruction finished" Apr 22 15:08:16.915924 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.915737 2569 manager.go:1196] Started watching for new ooms in manager Apr 22 15:08:16.915924 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.915739 2569 reconciler.go:26] "Reconciler: start to sync state" Apr 22 15:08:16.916967 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.916947 2569 manager.go:319] Starting recovery of all containers Apr 22 15:08:16.918705 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:16.918661 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 15:08:16.919955 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:16.919746 2569 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 15:08:16.928533 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.928515 2569 manager.go:324] Recovery completed Apr 22 15:08:16.929607 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:16.929583 2569 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-141-246.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 22 15:08:16.929891 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:16.929865 2569 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 22 15:08:16.932709 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.932697 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:08:16.934154 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.934139 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-hxx47" Apr 22 15:08:16.936047 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.936029 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-246.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:08:16.936127 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.936063 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-246.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:08:16.936127 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.936077 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-246.ec2.internal" event="NodeHasSufficientPID" Apr 22 
15:08:16.936602 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.936586 2569 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 15:08:16.936602 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.936600 2569 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 15:08:16.936720 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.936620 2569 state_mem.go:36] "Initialized new in-memory state store" Apr 22 15:08:16.939040 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.939028 2569 policy_none.go:49] "None policy: Start" Apr 22 15:08:16.939094 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.939044 2569 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 15:08:16.939094 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.939054 2569 state_mem.go:35] "Initializing new in-memory state store" Apr 22 15:08:16.945003 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.944989 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-hxx47" Apr 22 15:08:16.948386 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:16.948321 2569 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-246.ec2.internal.18a8b64d7dbeb3b2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-246.ec2.internal,UID:ip-10-0-141-246.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-141-246.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-141-246.ec2.internal,},FirstTimestamp:2026-04-22 15:08:16.936047538 +0000 UTC m=+0.473219250,LastTimestamp:2026-04-22 15:08:16.936047538 +0000 UTC m=+0.473219250,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 
+0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-246.ec2.internal,}" Apr 22 15:08:16.975274 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.975257 2569 manager.go:341] "Starting Device Plugin manager" Apr 22 15:08:16.983048 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:16.975291 2569 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 15:08:16.983048 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.975302 2569 server.go:85] "Starting device plugin registration server" Apr 22 15:08:16.983048 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.975522 2569 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 15:08:16.983048 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.975535 2569 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 15:08:16.983048 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.975630 2569 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 15:08:16.983048 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.975714 2569 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 15:08:16.983048 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:16.975720 2569 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 15:08:16.983048 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:16.976475 2569 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 22 15:08:16.983048 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:16.976506 2569 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-246.ec2.internal\" not found" Apr 22 15:08:17.051450 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.051377 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 15:08:17.052598 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.052582 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 15:08:17.052654 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.052609 2569 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 15:08:17.052654 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.052626 2569 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 15:08:17.052654 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.052632 2569 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 15:08:17.052802 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:17.052721 2569 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 15:08:17.055610 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.055591 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:08:17.075979 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.075953 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:08:17.076795 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.076780 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-246.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:08:17.076868 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.076809 2569 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-246.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:08:17.076868 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.076819 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-246.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:08:17.076868 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.076839 2569 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-246.ec2.internal" Apr 22 15:08:17.083831 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.083811 2569 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-246.ec2.internal" Apr 22 15:08:17.083910 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:17.083833 2569 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-141-246.ec2.internal\": node \"ip-10-0-141-246.ec2.internal\" not found" Apr 22 15:08:17.104331 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:17.104311 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-246.ec2.internal\" not found" Apr 22 15:08:17.153354 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.153323 2569 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-246.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-141-246.ec2.internal"] Apr 22 15:08:17.153489 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.153406 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:08:17.154269 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.154250 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-246.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:08:17.154368 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.154289 2569 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-141-246.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:08:17.154368 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.154304 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-246.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:08:17.155655 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.155640 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:08:17.155813 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.155797 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-246.ec2.internal" Apr 22 15:08:17.155870 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.155827 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:08:17.156300 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.156286 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-246.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:08:17.156368 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.156310 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-246.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:08:17.156368 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.156324 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-246.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:08:17.156368 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.156331 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-246.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:08:17.156368 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.156348 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-246.ec2.internal" 
event="NodeHasNoDiskPressure" Apr 22 15:08:17.156368 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.156359 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-246.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:08:17.157557 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.157544 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-246.ec2.internal" Apr 22 15:08:17.157640 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.157569 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:08:17.158155 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.158137 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-246.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:08:17.158227 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.158161 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-246.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:08:17.158227 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.158171 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-246.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:08:17.181287 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:17.181271 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-246.ec2.internal\" not found" node="ip-10-0-141-246.ec2.internal" Apr 22 15:08:17.185543 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:17.185528 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-246.ec2.internal\" not found" node="ip-10-0-141-246.ec2.internal" Apr 22 15:08:17.204568 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:17.204552 2569 kubelet_node_status.go:515] "Error getting the current node 
from lister" err="node \"ip-10-0-141-246.ec2.internal\" not found" Apr 22 15:08:17.217200 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.217174 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d3eeb1bf222524c1163d0f72b3f74e11-config\") pod \"kube-apiserver-proxy-ip-10-0-141-246.ec2.internal\" (UID: \"d3eeb1bf222524c1163d0f72b3f74e11\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-246.ec2.internal" Apr 22 15:08:17.217300 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.217208 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/99a1765eee2014b16836b13b3ff3a7f3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-246.ec2.internal\" (UID: \"99a1765eee2014b16836b13b3ff3a7f3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-246.ec2.internal" Apr 22 15:08:17.217300 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.217225 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99a1765eee2014b16836b13b3ff3a7f3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-246.ec2.internal\" (UID: \"99a1765eee2014b16836b13b3ff3a7f3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-246.ec2.internal" Apr 22 15:08:17.305347 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:17.305284 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-246.ec2.internal\" not found" Apr 22 15:08:17.318243 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.318220 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d3eeb1bf222524c1163d0f72b3f74e11-config\") pod 
\"kube-apiserver-proxy-ip-10-0-141-246.ec2.internal\" (UID: \"d3eeb1bf222524c1163d0f72b3f74e11\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-246.ec2.internal" Apr 22 15:08:17.318306 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.318247 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/99a1765eee2014b16836b13b3ff3a7f3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-246.ec2.internal\" (UID: \"99a1765eee2014b16836b13b3ff3a7f3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-246.ec2.internal" Apr 22 15:08:17.318306 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.318266 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99a1765eee2014b16836b13b3ff3a7f3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-246.ec2.internal\" (UID: \"99a1765eee2014b16836b13b3ff3a7f3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-246.ec2.internal" Apr 22 15:08:17.318373 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.318314 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/99a1765eee2014b16836b13b3ff3a7f3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-246.ec2.internal\" (UID: \"99a1765eee2014b16836b13b3ff3a7f3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-246.ec2.internal" Apr 22 15:08:17.318373 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.318322 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d3eeb1bf222524c1163d0f72b3f74e11-config\") pod \"kube-apiserver-proxy-ip-10-0-141-246.ec2.internal\" (UID: \"d3eeb1bf222524c1163d0f72b3f74e11\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-246.ec2.internal" Apr 22 15:08:17.318373 ip-10-0-141-246 
kubenswrapper[2569]: I0422 15:08:17.318357 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99a1765eee2014b16836b13b3ff3a7f3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-246.ec2.internal\" (UID: \"99a1765eee2014b16836b13b3ff3a7f3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-246.ec2.internal" Apr 22 15:08:17.405523 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:17.405486 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-246.ec2.internal\" not found" Apr 22 15:08:17.483058 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.483029 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-246.ec2.internal" Apr 22 15:08:17.488046 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.488025 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-246.ec2.internal" Apr 22 15:08:17.505726 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:17.505703 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-246.ec2.internal\" not found" Apr 22 15:08:17.606297 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:17.606274 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-246.ec2.internal\" not found" Apr 22 15:08:17.706864 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:17.706837 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-246.ec2.internal\" not found" Apr 22 15:08:17.773567 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.773545 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:08:17.807263 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:17.807242 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-246.ec2.internal\" not found" Apr 22 15:08:17.814604 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.814586 2569 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:08:17.823198 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.823184 2569 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 15:08:17.823306 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.823288 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 15:08:17.823344 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.823332 2569 
reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 15:08:17.823378 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.823339 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 15:08:17.907348 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:17.907318 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-246.ec2.internal\" not found" Apr 22 15:08:17.914159 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.914139 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 15:08:17.924838 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.924816 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 15:08:17.948070 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.948029 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 15:03:16 +0000 UTC" deadline="2027-12-13 11:01:15.339810913 +0000 UTC" Apr 22 15:08:17.948171 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.948072 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14395h52m57.391744474s" Apr 22 15:08:17.954536 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.954517 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" 
logger="kubernetes.io/kubelet-serving" csr="csr-tt59f" Apr 22 15:08:17.962451 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:17.962429 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-tt59f" Apr 22 15:08:17.999049 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:17.998878 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3eeb1bf222524c1163d0f72b3f74e11.slice/crio-4c9ecd0f13625053c758a685eb4c25a7a21f5528753842bf043be3a89d854634 WatchSource:0}: Error finding container 4c9ecd0f13625053c758a685eb4c25a7a21f5528753842bf043be3a89d854634: Status 404 returned error can't find the container with id 4c9ecd0f13625053c758a685eb4c25a7a21f5528753842bf043be3a89d854634 Apr 22 15:08:17.999352 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:17.999334 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99a1765eee2014b16836b13b3ff3a7f3.slice/crio-a736ab8f97c1118d07fa3b64c6b31fc81ae4f9221412b108414cc36757430426 WatchSource:0}: Error finding container a736ab8f97c1118d07fa3b64c6b31fc81ae4f9221412b108414cc36757430426: Status 404 returned error can't find the container with id a736ab8f97c1118d07fa3b64c6b31fc81ae4f9221412b108414cc36757430426 Apr 22 15:08:18.003482 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.003469 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 15:08:18.007981 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:18.007966 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-246.ec2.internal\" not found" Apr 22 15:08:18.031955 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.031935 2569 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:08:18.055774 ip-10-0-141-246 
kubenswrapper[2569]: I0422 15:08:18.055733 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-246.ec2.internal" event={"ID":"99a1765eee2014b16836b13b3ff3a7f3","Type":"ContainerStarted","Data":"a736ab8f97c1118d07fa3b64c6b31fc81ae4f9221412b108414cc36757430426"} Apr 22 15:08:18.056722 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.056701 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-246.ec2.internal" event={"ID":"d3eeb1bf222524c1163d0f72b3f74e11","Type":"ContainerStarted","Data":"4c9ecd0f13625053c758a685eb4c25a7a21f5528753842bf043be3a89d854634"} Apr 22 15:08:18.114708 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.114677 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-246.ec2.internal" Apr 22 15:08:18.124174 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.124157 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 15:08:18.124994 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.124980 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-246.ec2.internal" Apr 22 15:08:18.134690 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.134669 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 15:08:18.894803 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.894776 2569 apiserver.go:52] "Watching apiserver" Apr 22 15:08:18.903418 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.903393 2569 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 15:08:18.906311 
ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.906282 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-hzm72","openshift-network-diagnostics/network-check-target-wfpb4","openshift-network-operator/iptables-alerter-5qlng","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvchr","openshift-dns/node-resolver-2bfnk","openshift-multus/multus-additional-cni-plugins-qpxv8","openshift-multus/multus-dgt4k","openshift-ovn-kubernetes/ovnkube-node-cq6md","kube-system/konnectivity-agent-z8m2f","kube-system/kube-apiserver-proxy-ip-10-0-141-246.ec2.internal","openshift-cluster-node-tuning-operator/tuned-ckl4r","openshift-image-registry/node-ca-9tmhk","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-246.ec2.internal"] Apr 22 15:08:18.909603 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.909578 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dgt4k" Apr 22 15:08:18.910853 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.910810 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wfpb4" Apr 22 15:08:18.911014 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:18.910892 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wfpb4" podUID="987b673f-d105-40ae-8ec9-b8dab14f068f" Apr 22 15:08:18.911014 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.910907 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-5qlng" Apr 22 15:08:18.912092 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.912072 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvchr" Apr 22 15:08:18.912339 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.912285 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 15:08:18.912426 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.912347 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 15:08:18.912478 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.912438 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 15:08:18.912650 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.912630 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 15:08:18.912809 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.912785 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-k5vfw\"" Apr 22 15:08:18.913367 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.913341 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-2bfnk" Apr 22 15:08:18.913453 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.913390 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 15:08:18.913734 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.913714 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 15:08:18.913830 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.913758 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 15:08:18.914361 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.914331 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-kzl22\"" Apr 22 15:08:18.914468 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.914453 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 15:08:18.915347 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.915323 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 15:08:18.915447 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.915431 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 15:08:18.915503 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.915471 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-6mf7c\"" Apr 22 15:08:18.916383 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.915978 2569 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 15:08:18.916383 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.916278 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 15:08:18.916729 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.916663 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-vrq7v\"" Apr 22 15:08:18.917646 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.917624 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qpxv8" Apr 22 15:08:18.917752 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.917697 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzm72" Apr 22 15:08:18.917814 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:18.917779 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hzm72" podUID="9d868504-055f-463c-b932-801175d669c7" Apr 22 15:08:18.920043 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.920025 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" Apr 22 15:08:18.920352 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.920335 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 15:08:18.920529 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.920510 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 15:08:18.920593 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.920521 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-qjp2l\"" Apr 22 15:08:18.921641 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.921395 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-z8m2f" Apr 22 15:08:18.922649 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.922631 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 15:08:18.922760 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.922633 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-4dwgj\"" Apr 22 15:08:18.922760 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.922697 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 15:08:18.923152 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.923135 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ckl4r" Apr 22 15:08:18.924212 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.924194 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 15:08:18.924326 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.924219 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 15:08:18.924326 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.924255 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 15:08:18.924326 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.924322 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 15:08:18.924529 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.924462 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-7qqm6\"" Apr 22 15:08:18.924529 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.924482 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 15:08:18.924653 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.924537 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 15:08:18.924766 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.924748 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-9tmhk" Apr 22 15:08:18.925470 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.925445 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-zdh5s\"" Apr 22 15:08:18.925552 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.925494 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 15:08:18.925901 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.925884 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 15:08:18.926642 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.926621 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-host-var-lib-cni-multus\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k" Apr 22 15:08:18.926748 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.926691 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmkpt\" (UniqueName: \"kubernetes.io/projected/987b673f-d105-40ae-8ec9-b8dab14f068f-kube-api-access-xmkpt\") pod \"network-check-target-wfpb4\" (UID: \"987b673f-d105-40ae-8ec9-b8dab14f068f\") " pod="openshift-network-diagnostics/network-check-target-wfpb4" Apr 22 15:08:18.926793 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.926740 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppw58\" (UniqueName: \"kubernetes.io/projected/a59df43f-31be-4ae9-be3b-b10d9d017c59-kube-api-access-ppw58\") pod \"iptables-alerter-5qlng\" (UID: 
\"a59df43f-31be-4ae9-be3b-b10d9d017c59\") " pod="openshift-network-operator/iptables-alerter-5qlng" Apr 22 15:08:18.926793 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.926772 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-multus-conf-dir\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k" Apr 22 15:08:18.926924 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.926798 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d9dcdab4-58d9-45fa-a9c1-d04589ab8abe-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pvchr\" (UID: \"d9dcdab4-58d9-45fa-a9c1-d04589ab8abe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvchr" Apr 22 15:08:18.926924 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.926819 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d9dcdab4-58d9-45fa-a9c1-d04589ab8abe-sys-fs\") pod \"aws-ebs-csi-driver-node-pvchr\" (UID: \"d9dcdab4-58d9-45fa-a9c1-d04589ab8abe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvchr" Apr 22 15:08:18.926924 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.926864 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-system-cni-dir\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k" Apr 22 15:08:18.926924 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.926897 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-multus-cni-dir\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k" Apr 22 15:08:18.926924 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.926920 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-os-release\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k" Apr 22 15:08:18.927113 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.926952 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b42b90f9-6d5b-4342-979c-184c4620abb7-multus-daemon-config\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k" Apr 22 15:08:18.927113 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.926976 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwkbj\" (UniqueName: \"kubernetes.io/projected/b42b90f9-6d5b-4342-979c-184c4620abb7-kube-api-access-gwkbj\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k" Apr 22 15:08:18.927113 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.927001 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gkpn\" (UniqueName: \"kubernetes.io/projected/d9dcdab4-58d9-45fa-a9c1-d04589ab8abe-kube-api-access-9gkpn\") pod \"aws-ebs-csi-driver-node-pvchr\" (UID: \"d9dcdab4-58d9-45fa-a9c1-d04589ab8abe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvchr" Apr 22 15:08:18.927113 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.927025 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgcz9\" (UniqueName: \"kubernetes.io/projected/9d868504-055f-463c-b932-801175d669c7-kube-api-access-wgcz9\") pod \"network-metrics-daemon-hzm72\" (UID: \"9d868504-055f-463c-b932-801175d669c7\") " pod="openshift-multus/network-metrics-daemon-hzm72" Apr 22 15:08:18.927113 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.927049 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/49c61d54-9288-4695-8a61-293644b9038e-cnibin\") pod \"multus-additional-cni-plugins-qpxv8\" (UID: \"49c61d54-9288-4695-8a61-293644b9038e\") " pod="openshift-multus/multus-additional-cni-plugins-qpxv8" Apr 22 15:08:18.927113 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.927074 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/49c61d54-9288-4695-8a61-293644b9038e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qpxv8\" (UID: \"49c61d54-9288-4695-8a61-293644b9038e\") " pod="openshift-multus/multus-additional-cni-plugins-qpxv8" Apr 22 15:08:18.927392 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.927114 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt66j\" (UniqueName: \"kubernetes.io/projected/49c61d54-9288-4695-8a61-293644b9038e-kube-api-access-xt66j\") pod \"multus-additional-cni-plugins-qpxv8\" (UID: \"49c61d54-9288-4695-8a61-293644b9038e\") " pod="openshift-multus/multus-additional-cni-plugins-qpxv8" Apr 22 15:08:18.927392 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.927161 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b42b90f9-6d5b-4342-979c-184c4620abb7-cni-binary-copy\") pod 
\"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k" Apr 22 15:08:18.927392 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.927218 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-host-run-netns\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k" Apr 22 15:08:18.927392 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.927245 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-hostroot\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k" Apr 22 15:08:18.927392 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.927269 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj42t\" (UniqueName: \"kubernetes.io/projected/543bfa0b-0351-4992-892b-fe4be8b7eb4c-kube-api-access-lj42t\") pod \"node-resolver-2bfnk\" (UID: \"543bfa0b-0351-4992-892b-fe4be8b7eb4c\") " pod="openshift-dns/node-resolver-2bfnk" Apr 22 15:08:18.927392 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.927302 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-host-run-multus-certs\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k" Apr 22 15:08:18.927392 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.927325 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-etc-kubernetes\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k" Apr 22 15:08:18.927392 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.927350 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/49c61d54-9288-4695-8a61-293644b9038e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qpxv8\" (UID: \"49c61d54-9288-4695-8a61-293644b9038e\") " pod="openshift-multus/multus-additional-cni-plugins-qpxv8" Apr 22 15:08:18.927392 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.927374 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-cnibin\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k" Apr 22 15:08:18.927916 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.927398 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-multus-socket-dir-parent\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k" Apr 22 15:08:18.927916 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.927434 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-host-var-lib-cni-bin\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k" Apr 22 15:08:18.927916 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.927523 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a59df43f-31be-4ae9-be3b-b10d9d017c59-host-slash\") pod \"iptables-alerter-5qlng\" (UID: \"a59df43f-31be-4ae9-be3b-b10d9d017c59\") " pod="openshift-network-operator/iptables-alerter-5qlng" Apr 22 15:08:18.927916 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.927550 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d9dcdab4-58d9-45fa-a9c1-d04589ab8abe-socket-dir\") pod \"aws-ebs-csi-driver-node-pvchr\" (UID: \"d9dcdab4-58d9-45fa-a9c1-d04589ab8abe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvchr" Apr 22 15:08:18.927916 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.927574 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/543bfa0b-0351-4992-892b-fe4be8b7eb4c-hosts-file\") pod \"node-resolver-2bfnk\" (UID: \"543bfa0b-0351-4992-892b-fe4be8b7eb4c\") " pod="openshift-dns/node-resolver-2bfnk" Apr 22 15:08:18.927916 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.927599 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-host-run-k8s-cni-cncf-io\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k" Apr 22 15:08:18.927916 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.927647 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-host-var-lib-kubelet\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " 
pod="openshift-multus/multus-dgt4k" Apr 22 15:08:18.927916 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.927705 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d9dcdab4-58d9-45fa-a9c1-d04589ab8abe-registration-dir\") pod \"aws-ebs-csi-driver-node-pvchr\" (UID: \"d9dcdab4-58d9-45fa-a9c1-d04589ab8abe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvchr" Apr 22 15:08:18.927916 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.927738 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d9dcdab4-58d9-45fa-a9c1-d04589ab8abe-etc-selinux\") pod \"aws-ebs-csi-driver-node-pvchr\" (UID: \"d9dcdab4-58d9-45fa-a9c1-d04589ab8abe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvchr" Apr 22 15:08:18.927916 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.927762 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d868504-055f-463c-b932-801175d669c7-metrics-certs\") pod \"network-metrics-daemon-hzm72\" (UID: \"9d868504-055f-463c-b932-801175d669c7\") " pod="openshift-multus/network-metrics-daemon-hzm72" Apr 22 15:08:18.927916 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.927788 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/49c61d54-9288-4695-8a61-293644b9038e-os-release\") pod \"multus-additional-cni-plugins-qpxv8\" (UID: \"49c61d54-9288-4695-8a61-293644b9038e\") " pod="openshift-multus/multus-additional-cni-plugins-qpxv8" Apr 22 15:08:18.927916 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.927811 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/49c61d54-9288-4695-8a61-293644b9038e-cni-binary-copy\") pod \"multus-additional-cni-plugins-qpxv8\" (UID: \"49c61d54-9288-4695-8a61-293644b9038e\") " pod="openshift-multus/multus-additional-cni-plugins-qpxv8" Apr 22 15:08:18.927916 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.927861 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/49c61d54-9288-4695-8a61-293644b9038e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qpxv8\" (UID: \"49c61d54-9288-4695-8a61-293644b9038e\") " pod="openshift-multus/multus-additional-cni-plugins-qpxv8" Apr 22 15:08:18.927916 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.927888 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a59df43f-31be-4ae9-be3b-b10d9d017c59-iptables-alerter-script\") pod \"iptables-alerter-5qlng\" (UID: \"a59df43f-31be-4ae9-be3b-b10d9d017c59\") " pod="openshift-network-operator/iptables-alerter-5qlng" Apr 22 15:08:18.927916 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.927909 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d9dcdab4-58d9-45fa-a9c1-d04589ab8abe-device-dir\") pod \"aws-ebs-csi-driver-node-pvchr\" (UID: \"d9dcdab4-58d9-45fa-a9c1-d04589ab8abe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvchr" Apr 22 15:08:18.928501 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.927932 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/543bfa0b-0351-4992-892b-fe4be8b7eb4c-tmp-dir\") pod \"node-resolver-2bfnk\" (UID: \"543bfa0b-0351-4992-892b-fe4be8b7eb4c\") " 
pod="openshift-dns/node-resolver-2bfnk" Apr 22 15:08:18.928501 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.927957 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/49c61d54-9288-4695-8a61-293644b9038e-system-cni-dir\") pod \"multus-additional-cni-plugins-qpxv8\" (UID: \"49c61d54-9288-4695-8a61-293644b9038e\") " pod="openshift-multus/multus-additional-cni-plugins-qpxv8" Apr 22 15:08:18.928501 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.928026 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 15:08:18.928501 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.928035 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 15:08:18.928501 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.928105 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-rrf8w\"" Apr 22 15:08:18.928501 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.928107 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 15:08:18.963423 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.963391 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 15:03:17 +0000 UTC" deadline="2027-10-14 11:21:52.984029385 +0000 UTC" Apr 22 15:08:18.963529 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:18.963423 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12956h13m34.020610487s" Apr 22 15:08:19.017320 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.017203 2569 
desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 15:08:19.028429 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.028394 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-log-socket\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" Apr 22 15:08:19.028575 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.028452 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/49c61d54-9288-4695-8a61-293644b9038e-os-release\") pod \"multus-additional-cni-plugins-qpxv8\" (UID: \"49c61d54-9288-4695-8a61-293644b9038e\") " pod="openshift-multus/multus-additional-cni-plugins-qpxv8" Apr 22 15:08:19.028575 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.028508 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/49c61d54-9288-4695-8a61-293644b9038e-cni-binary-copy\") pod \"multus-additional-cni-plugins-qpxv8\" (UID: \"49c61d54-9288-4695-8a61-293644b9038e\") " pod="openshift-multus/multus-additional-cni-plugins-qpxv8" Apr 22 15:08:19.028575 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.028545 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/49c61d54-9288-4695-8a61-293644b9038e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qpxv8\" (UID: \"49c61d54-9288-4695-8a61-293644b9038e\") " pod="openshift-multus/multus-additional-cni-plugins-qpxv8" Apr 22 15:08:19.028742 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.028576 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/a59df43f-31be-4ae9-be3b-b10d9d017c59-iptables-alerter-script\") pod \"iptables-alerter-5qlng\" (UID: \"a59df43f-31be-4ae9-be3b-b10d9d017c59\") " pod="openshift-network-operator/iptables-alerter-5qlng" Apr 22 15:08:19.028742 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.028607 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e6729570-df4a-4419-b11f-1b1f782967bd-lib-modules\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r" Apr 22 15:08:19.028742 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.028621 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/49c61d54-9288-4695-8a61-293644b9038e-os-release\") pod \"multus-additional-cni-plugins-qpxv8\" (UID: \"49c61d54-9288-4695-8a61-293644b9038e\") " pod="openshift-multus/multus-additional-cni-plugins-qpxv8" Apr 22 15:08:19.028742 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.028662 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/49c61d54-9288-4695-8a61-293644b9038e-system-cni-dir\") pod \"multus-additional-cni-plugins-qpxv8\" (UID: \"49c61d54-9288-4695-8a61-293644b9038e\") " pod="openshift-multus/multus-additional-cni-plugins-qpxv8" Apr 22 15:08:19.028742 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.028706 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xmkpt\" (UniqueName: \"kubernetes.io/projected/987b673f-d105-40ae-8ec9-b8dab14f068f-kube-api-access-xmkpt\") pod \"network-check-target-wfpb4\" (UID: \"987b673f-d105-40ae-8ec9-b8dab14f068f\") " pod="openshift-network-diagnostics/network-check-target-wfpb4" Apr 22 15:08:19.028742 ip-10-0-141-246 kubenswrapper[2569]: I0422 
15:08:19.028733 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ppw58\" (UniqueName: \"kubernetes.io/projected/a59df43f-31be-4ae9-be3b-b10d9d017c59-kube-api-access-ppw58\") pod \"iptables-alerter-5qlng\" (UID: \"a59df43f-31be-4ae9-be3b-b10d9d017c59\") " pod="openshift-network-operator/iptables-alerter-5qlng" Apr 22 15:08:19.029022 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.028760 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d9dcdab4-58d9-45fa-a9c1-d04589ab8abe-sys-fs\") pod \"aws-ebs-csi-driver-node-pvchr\" (UID: \"d9dcdab4-58d9-45fa-a9c1-d04589ab8abe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvchr" Apr 22 15:08:19.029022 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.028772 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/49c61d54-9288-4695-8a61-293644b9038e-system-cni-dir\") pod \"multus-additional-cni-plugins-qpxv8\" (UID: \"49c61d54-9288-4695-8a61-293644b9038e\") " pod="openshift-multus/multus-additional-cni-plugins-qpxv8" Apr 22 15:08:19.029022 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.028787 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e6729570-df4a-4419-b11f-1b1f782967bd-etc-sysctl-conf\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r" Apr 22 15:08:19.029022 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.028792 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/49c61d54-9288-4695-8a61-293644b9038e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qpxv8\" (UID: \"49c61d54-9288-4695-8a61-293644b9038e\") " 
pod="openshift-multus/multus-additional-cni-plugins-qpxv8" Apr 22 15:08:19.029022 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.028814 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-system-cni-dir\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k" Apr 22 15:08:19.029022 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.028837 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-os-release\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k" Apr 22 15:08:19.029022 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.028862 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwkbj\" (UniqueName: \"kubernetes.io/projected/b42b90f9-6d5b-4342-979c-184c4620abb7-kube-api-access-gwkbj\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k" Apr 22 15:08:19.029022 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.028870 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d9dcdab4-58d9-45fa-a9c1-d04589ab8abe-sys-fs\") pod \"aws-ebs-csi-driver-node-pvchr\" (UID: \"d9dcdab4-58d9-45fa-a9c1-d04589ab8abe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvchr" Apr 22 15:08:19.029022 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.028887 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9gkpn\" (UniqueName: \"kubernetes.io/projected/d9dcdab4-58d9-45fa-a9c1-d04589ab8abe-kube-api-access-9gkpn\") pod \"aws-ebs-csi-driver-node-pvchr\" (UID: \"d9dcdab4-58d9-45fa-a9c1-d04589ab8abe\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvchr" Apr 22 15:08:19.029022 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.028914 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e6729570-df4a-4419-b11f-1b1f782967bd-etc-tuned\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r" Apr 22 15:08:19.029022 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.028930 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-system-cni-dir\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k" Apr 22 15:08:19.029022 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.028936 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gvgn\" (UniqueName: \"kubernetes.io/projected/c25715bc-fd04-4b79-b6da-892713f85b6c-kube-api-access-7gvgn\") pod \"node-ca-9tmhk\" (UID: \"c25715bc-fd04-4b79-b6da-892713f85b6c\") " pod="openshift-image-registry/node-ca-9tmhk" Apr 22 15:08:19.029022 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.028974 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-host-cni-bin\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" Apr 22 15:08:19.029022 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029010 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-os-release\") pod \"multus-dgt4k\" (UID: 
\"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k" Apr 22 15:08:19.029022 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029027 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/49c61d54-9288-4695-8a61-293644b9038e-cnibin\") pod \"multus-additional-cni-plugins-qpxv8\" (UID: \"49c61d54-9288-4695-8a61-293644b9038e\") " pod="openshift-multus/multus-additional-cni-plugins-qpxv8" Apr 22 15:08:19.029735 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029055 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/49c61d54-9288-4695-8a61-293644b9038e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qpxv8\" (UID: \"49c61d54-9288-4695-8a61-293644b9038e\") " pod="openshift-multus/multus-additional-cni-plugins-qpxv8" Apr 22 15:08:19.029735 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029082 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b42b90f9-6d5b-4342-979c-184c4620abb7-cni-binary-copy\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k" Apr 22 15:08:19.029735 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029099 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a59df43f-31be-4ae9-be3b-b10d9d017c59-iptables-alerter-script\") pod \"iptables-alerter-5qlng\" (UID: \"a59df43f-31be-4ae9-be3b-b10d9d017c59\") " pod="openshift-network-operator/iptables-alerter-5qlng" Apr 22 15:08:19.029735 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029107 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-host-run-netns\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k" Apr 22 15:08:19.029735 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029129 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/49c61d54-9288-4695-8a61-293644b9038e-cni-binary-copy\") pod \"multus-additional-cni-plugins-qpxv8\" (UID: \"49c61d54-9288-4695-8a61-293644b9038e\") " pod="openshift-multus/multus-additional-cni-plugins-qpxv8" Apr 22 15:08:19.029735 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029135 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e6729570-df4a-4419-b11f-1b1f782967bd-run\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r" Apr 22 15:08:19.029735 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029145 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/49c61d54-9288-4695-8a61-293644b9038e-cnibin\") pod \"multus-additional-cni-plugins-qpxv8\" (UID: \"49c61d54-9288-4695-8a61-293644b9038e\") " pod="openshift-multus/multus-additional-cni-plugins-qpxv8" Apr 22 15:08:19.029735 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029178 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-run-ovn\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" Apr 22 15:08:19.029735 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029206 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e6729570-df4a-4419-b11f-1b1f782967bd-etc-kubernetes\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r" Apr 22 15:08:19.029735 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029233 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e6729570-df4a-4419-b11f-1b1f782967bd-sys\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r" Apr 22 15:08:19.029735 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029242 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-host-run-netns\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k" Apr 22 15:08:19.029735 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029275 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e6729570-df4a-4419-b11f-1b1f782967bd-var-lib-kubelet\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r" Apr 22 15:08:19.029735 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029334 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c25715bc-fd04-4b79-b6da-892713f85b6c-serviceca\") pod \"node-ca-9tmhk\" (UID: \"c25715bc-fd04-4b79-b6da-892713f85b6c\") " pod="openshift-image-registry/node-ca-9tmhk" Apr 22 15:08:19.029735 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029358 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-host-kubelet\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" Apr 22 15:08:19.029735 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029384 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-host-cni-netd\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" Apr 22 15:08:19.029735 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029411 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-host-var-lib-cni-bin\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k" Apr 22 15:08:19.029735 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029436 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a59df43f-31be-4ae9-be3b-b10d9d017c59-host-slash\") pod \"iptables-alerter-5qlng\" (UID: \"a59df43f-31be-4ae9-be3b-b10d9d017c59\") " pod="openshift-network-operator/iptables-alerter-5qlng" Apr 22 15:08:19.030456 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029461 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/543bfa0b-0351-4992-892b-fe4be8b7eb4c-hosts-file\") pod \"node-resolver-2bfnk\" (UID: \"543bfa0b-0351-4992-892b-fe4be8b7eb4c\") " pod="openshift-dns/node-resolver-2bfnk" Apr 22 15:08:19.030456 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029487 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-host-var-lib-cni-bin\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k" Apr 22 15:08:19.030456 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029487 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-host-slash\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" Apr 22 15:08:19.030456 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029527 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-host-run-k8s-cni-cncf-io\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k" Apr 22 15:08:19.030456 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029517 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a59df43f-31be-4ae9-be3b-b10d9d017c59-host-slash\") pod \"iptables-alerter-5qlng\" (UID: \"a59df43f-31be-4ae9-be3b-b10d9d017c59\") " pod="openshift-network-operator/iptables-alerter-5qlng" Apr 22 15:08:19.030456 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029550 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-host-var-lib-kubelet\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k" Apr 22 15:08:19.030456 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029558 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/543bfa0b-0351-4992-892b-fe4be8b7eb4c-hosts-file\") pod \"node-resolver-2bfnk\" (UID: \"543bfa0b-0351-4992-892b-fe4be8b7eb4c\") " pod="openshift-dns/node-resolver-2bfnk" Apr 22 15:08:19.030456 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029550 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/49c61d54-9288-4695-8a61-293644b9038e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qpxv8\" (UID: \"49c61d54-9288-4695-8a61-293644b9038e\") " pod="openshift-multus/multus-additional-cni-plugins-qpxv8" Apr 22 15:08:19.030456 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029593 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-host-run-k8s-cni-cncf-io\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k" Apr 22 15:08:19.030456 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029597 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-host-var-lib-kubelet\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k" Apr 22 15:08:19.030456 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029638 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d9dcdab4-58d9-45fa-a9c1-d04589ab8abe-registration-dir\") pod \"aws-ebs-csi-driver-node-pvchr\" (UID: \"d9dcdab4-58d9-45fa-a9c1-d04589ab8abe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvchr" Apr 22 15:08:19.030456 ip-10-0-141-246 kubenswrapper[2569]: 
I0422 15:08:19.029670 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d9dcdab4-58d9-45fa-a9c1-d04589ab8abe-etc-selinux\") pod \"aws-ebs-csi-driver-node-pvchr\" (UID: \"d9dcdab4-58d9-45fa-a9c1-d04589ab8abe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvchr" Apr 22 15:08:19.030456 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029692 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d9dcdab4-58d9-45fa-a9c1-d04589ab8abe-registration-dir\") pod \"aws-ebs-csi-driver-node-pvchr\" (UID: \"d9dcdab4-58d9-45fa-a9c1-d04589ab8abe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvchr" Apr 22 15:08:19.030456 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029713 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d868504-055f-463c-b932-801175d669c7-metrics-certs\") pod \"network-metrics-daemon-hzm72\" (UID: \"9d868504-055f-463c-b932-801175d669c7\") " pod="openshift-multus/network-metrics-daemon-hzm72" Apr 22 15:08:19.030456 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029722 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b42b90f9-6d5b-4342-979c-184c4620abb7-cni-binary-copy\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k" Apr 22 15:08:19.030456 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029737 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d9dcdab4-58d9-45fa-a9c1-d04589ab8abe-device-dir\") pod \"aws-ebs-csi-driver-node-pvchr\" (UID: \"d9dcdab4-58d9-45fa-a9c1-d04589ab8abe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvchr" 
Apr 22 15:08:19.030456 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029739 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d9dcdab4-58d9-45fa-a9c1-d04589ab8abe-etc-selinux\") pod \"aws-ebs-csi-driver-node-pvchr\" (UID: \"d9dcdab4-58d9-45fa-a9c1-d04589ab8abe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvchr"
Apr 22 15:08:19.031136 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029774 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d9dcdab4-58d9-45fa-a9c1-d04589ab8abe-device-dir\") pod \"aws-ebs-csi-driver-node-pvchr\" (UID: \"d9dcdab4-58d9-45fa-a9c1-d04589ab8abe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvchr"
Apr 22 15:08:19.031136 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029799 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/543bfa0b-0351-4992-892b-fe4be8b7eb4c-tmp-dir\") pod \"node-resolver-2bfnk\" (UID: \"543bfa0b-0351-4992-892b-fe4be8b7eb4c\") " pod="openshift-dns/node-resolver-2bfnk"
Apr 22 15:08:19.031136 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029821 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-host-run-netns\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md"
Apr 22 15:08:19.031136 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:19.029800 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 15:08:19.031136 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.029853 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-host-var-lib-cni-multus\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k"
Apr 22 15:08:19.031136 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.030029 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-host-var-lib-cni-multus\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k"
Apr 22 15:08:19.031136 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.030093 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/543bfa0b-0351-4992-892b-fe4be8b7eb4c-tmp-dir\") pod \"node-resolver-2bfnk\" (UID: \"543bfa0b-0351-4992-892b-fe4be8b7eb4c\") " pod="openshift-dns/node-resolver-2bfnk"
Apr 22 15:08:19.031136 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:19.030114 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d868504-055f-463c-b932-801175d669c7-metrics-certs podName:9d868504-055f-463c-b932-801175d669c7 nodeName:}" failed. No retries permitted until 2026-04-22 15:08:19.530085166 +0000 UTC m=+3.067256857 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d868504-055f-463c-b932-801175d669c7-metrics-certs") pod "network-metrics-daemon-hzm72" (UID: "9d868504-055f-463c-b932-801175d669c7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 15:08:19.031136 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.030139 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e6729570-df4a-4419-b11f-1b1f782967bd-tmp\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r"
Apr 22 15:08:19.031136 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.030171 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dnk4\" (UniqueName: \"kubernetes.io/projected/e6729570-df4a-4419-b11f-1b1f782967bd-kube-api-access-6dnk4\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r"
Apr 22 15:08:19.031136 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.030197 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-var-lib-openvswitch\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md"
Apr 22 15:08:19.031136 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.030233 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-run-openvswitch\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md"
Apr 22 15:08:19.031136 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.030261 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-multus-conf-dir\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k"
Apr 22 15:08:19.031136 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.030286 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d9dcdab4-58d9-45fa-a9c1-d04589ab8abe-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pvchr\" (UID: \"d9dcdab4-58d9-45fa-a9c1-d04589ab8abe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvchr"
Apr 22 15:08:19.031136 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.030315 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md"
Apr 22 15:08:19.031136 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.030335 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-multus-conf-dir\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k"
Apr 22 15:08:19.031136 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.030342 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-multus-cni-dir\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k"
Apr 22 15:08:19.031886 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.030367 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d9dcdab4-58d9-45fa-a9c1-d04589ab8abe-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pvchr\" (UID: \"d9dcdab4-58d9-45fa-a9c1-d04589ab8abe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvchr"
Apr 22 15:08:19.031886 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.030367 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b42b90f9-6d5b-4342-979c-184c4620abb7-multus-daemon-config\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k"
Apr 22 15:08:19.031886 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.030408 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-etc-openvswitch\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md"
Apr 22 15:08:19.031886 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.030417 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-multus-cni-dir\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k"
Apr 22 15:08:19.031886 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.030435 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-host-run-ovn-kubernetes\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md"
Apr 22 15:08:19.031886 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.030460 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a686392c-d33c-438a-ba47-40397c16e97c-ovnkube-config\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md"
Apr 22 15:08:19.031886 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.030479 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a686392c-d33c-438a-ba47-40397c16e97c-env-overrides\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md"
Apr 22 15:08:19.031886 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.030494 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a686392c-d33c-438a-ba47-40397c16e97c-ovn-node-metrics-cert\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md"
Apr 22 15:08:19.031886 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.030508 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a686392c-d33c-438a-ba47-40397c16e97c-ovnkube-script-lib\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md"
Apr 22 15:08:19.031886 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.030532 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgcz9\" (UniqueName: \"kubernetes.io/projected/9d868504-055f-463c-b932-801175d669c7-kube-api-access-wgcz9\") pod \"network-metrics-daemon-hzm72\" (UID: \"9d868504-055f-463c-b932-801175d669c7\") " pod="openshift-multus/network-metrics-daemon-hzm72"
Apr 22 15:08:19.031886 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.030558 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xt66j\" (UniqueName: \"kubernetes.io/projected/49c61d54-9288-4695-8a61-293644b9038e-kube-api-access-xt66j\") pod \"multus-additional-cni-plugins-qpxv8\" (UID: \"49c61d54-9288-4695-8a61-293644b9038e\") " pod="openshift-multus/multus-additional-cni-plugins-qpxv8"
Apr 22 15:08:19.031886 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.030586 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-hostroot\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k"
Apr 22 15:08:19.031886 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.030612 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lj42t\" (UniqueName: \"kubernetes.io/projected/543bfa0b-0351-4992-892b-fe4be8b7eb4c-kube-api-access-lj42t\") pod \"node-resolver-2bfnk\" (UID: \"543bfa0b-0351-4992-892b-fe4be8b7eb4c\") " pod="openshift-dns/node-resolver-2bfnk"
Apr 22 15:08:19.031886 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.030635 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e6729570-df4a-4419-b11f-1b1f782967bd-etc-systemd\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r"
Apr 22 15:08:19.031886 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.030653 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-systemd-units\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md"
Apr 22 15:08:19.031886 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.030693 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t46hv\" (UniqueName: \"kubernetes.io/projected/a686392c-d33c-438a-ba47-40397c16e97c-kube-api-access-t46hv\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md"
Apr 22 15:08:19.031886 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.030721 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-host-run-multus-certs\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k"
Apr 22 15:08:19.032589 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.030820 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b42b90f9-6d5b-4342-979c-184c4620abb7-multus-daemon-config\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k"
Apr 22 15:08:19.032589 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.030833 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-host-run-multus-certs\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k"
Apr 22 15:08:19.032589 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.030837 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-hostroot\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k"
Apr 22 15:08:19.032589 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.030862 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-etc-kubernetes\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k"
Apr 22 15:08:19.032589 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.030895 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0af3cc76-ae15-4ad4-b096-3b3dacd3e370-konnectivity-ca\") pod \"konnectivity-agent-z8m2f\" (UID: \"0af3cc76-ae15-4ad4-b096-3b3dacd3e370\") " pod="kube-system/konnectivity-agent-z8m2f"
Apr 22 15:08:19.032589 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.030919 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e6729570-df4a-4419-b11f-1b1f782967bd-host\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r"
Apr 22 15:08:19.032589 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.030949 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-run-systemd\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md"
Apr 22 15:08:19.032589 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.030968 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-etc-kubernetes\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k"
Apr 22 15:08:19.032589 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.030977 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/49c61d54-9288-4695-8a61-293644b9038e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qpxv8\" (UID: \"49c61d54-9288-4695-8a61-293644b9038e\") " pod="openshift-multus/multus-additional-cni-plugins-qpxv8"
Apr 22 15:08:19.032589 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.031013 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-cnibin\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k"
Apr 22 15:08:19.032589 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.031037 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-multus-socket-dir-parent\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k"
Apr 22 15:08:19.032589 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.031065 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d9dcdab4-58d9-45fa-a9c1-d04589ab8abe-socket-dir\") pod \"aws-ebs-csi-driver-node-pvchr\" (UID: \"d9dcdab4-58d9-45fa-a9c1-d04589ab8abe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvchr"
Apr 22 15:08:19.032589 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.031101 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-cnibin\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k"
Apr 22 15:08:19.032589 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.031131 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b42b90f9-6d5b-4342-979c-184c4620abb7-multus-socket-dir-parent\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k"
Apr 22 15:08:19.032589 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.031170 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0af3cc76-ae15-4ad4-b096-3b3dacd3e370-agent-certs\") pod \"konnectivity-agent-z8m2f\" (UID: \"0af3cc76-ae15-4ad4-b096-3b3dacd3e370\") " pod="kube-system/konnectivity-agent-z8m2f"
Apr 22 15:08:19.032589 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.031202 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c25715bc-fd04-4b79-b6da-892713f85b6c-host\") pod \"node-ca-9tmhk\" (UID: \"c25715bc-fd04-4b79-b6da-892713f85b6c\") " pod="openshift-image-registry/node-ca-9tmhk"
Apr 22 15:08:19.032589 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.031170 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d9dcdab4-58d9-45fa-a9c1-d04589ab8abe-socket-dir\") pod \"aws-ebs-csi-driver-node-pvchr\" (UID: \"d9dcdab4-58d9-45fa-a9c1-d04589ab8abe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvchr"
Apr 22 15:08:19.032589 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.031224 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-node-log\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md"
Apr 22 15:08:19.033318 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.031277 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e6729570-df4a-4419-b11f-1b1f782967bd-etc-modprobe-d\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r"
Apr 22 15:08:19.033318 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.031306 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e6729570-df4a-4419-b11f-1b1f782967bd-etc-sysconfig\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r"
Apr 22 15:08:19.033318 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.031332 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e6729570-df4a-4419-b11f-1b1f782967bd-etc-sysctl-d\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r"
Apr 22 15:08:19.033318 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.031394 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/49c61d54-9288-4695-8a61-293644b9038e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qpxv8\" (UID: \"49c61d54-9288-4695-8a61-293644b9038e\") " pod="openshift-multus/multus-additional-cni-plugins-qpxv8"
Apr 22 15:08:19.036795 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.036774 2569 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 22 15:08:19.039477 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:19.039456 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 15:08:19.039477 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:19.039477 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 15:08:19.039621 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:19.039490 2569 projected.go:194] Error preparing data for projected volume kube-api-access-xmkpt for pod openshift-network-diagnostics/network-check-target-wfpb4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 15:08:19.039621 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:19.039580 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/987b673f-d105-40ae-8ec9-b8dab14f068f-kube-api-access-xmkpt podName:987b673f-d105-40ae-8ec9-b8dab14f068f nodeName:}" failed. No retries permitted until 2026-04-22 15:08:19.539561987 +0000 UTC m=+3.076733679 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-xmkpt" (UniqueName: "kubernetes.io/projected/987b673f-d105-40ae-8ec9-b8dab14f068f-kube-api-access-xmkpt") pod "network-check-target-wfpb4" (UID: "987b673f-d105-40ae-8ec9-b8dab14f068f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 15:08:19.040332 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.040311 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwkbj\" (UniqueName: \"kubernetes.io/projected/b42b90f9-6d5b-4342-979c-184c4620abb7-kube-api-access-gwkbj\") pod \"multus-dgt4k\" (UID: \"b42b90f9-6d5b-4342-979c-184c4620abb7\") " pod="openshift-multus/multus-dgt4k"
Apr 22 15:08:19.041591 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.041567 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppw58\" (UniqueName: \"kubernetes.io/projected/a59df43f-31be-4ae9-be3b-b10d9d017c59-kube-api-access-ppw58\") pod \"iptables-alerter-5qlng\" (UID: \"a59df43f-31be-4ae9-be3b-b10d9d017c59\") " pod="openshift-network-operator/iptables-alerter-5qlng"
Apr 22 15:08:19.043853 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.043829 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgcz9\" (UniqueName: \"kubernetes.io/projected/9d868504-055f-463c-b932-801175d669c7-kube-api-access-wgcz9\") pod \"network-metrics-daemon-hzm72\" (UID: \"9d868504-055f-463c-b932-801175d669c7\") " pod="openshift-multus/network-metrics-daemon-hzm72"
Apr 22 15:08:19.043933 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.043891 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt66j\" (UniqueName: \"kubernetes.io/projected/49c61d54-9288-4695-8a61-293644b9038e-kube-api-access-xt66j\") pod \"multus-additional-cni-plugins-qpxv8\" (UID: \"49c61d54-9288-4695-8a61-293644b9038e\") " pod="openshift-multus/multus-additional-cni-plugins-qpxv8"
Apr 22 15:08:19.044521 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.044500 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gkpn\" (UniqueName: \"kubernetes.io/projected/d9dcdab4-58d9-45fa-a9c1-d04589ab8abe-kube-api-access-9gkpn\") pod \"aws-ebs-csi-driver-node-pvchr\" (UID: \"d9dcdab4-58d9-45fa-a9c1-d04589ab8abe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvchr"
Apr 22 15:08:19.048620 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.048600 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj42t\" (UniqueName: \"kubernetes.io/projected/543bfa0b-0351-4992-892b-fe4be8b7eb4c-kube-api-access-lj42t\") pod \"node-resolver-2bfnk\" (UID: \"543bfa0b-0351-4992-892b-fe4be8b7eb4c\") " pod="openshift-dns/node-resolver-2bfnk"
Apr 22 15:08:19.131863 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.131830 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e6729570-df4a-4419-b11f-1b1f782967bd-etc-sysctl-conf\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r"
Apr 22 15:08:19.131863 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.131868 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e6729570-df4a-4419-b11f-1b1f782967bd-etc-tuned\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r"
Apr 22 15:08:19.132097 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.131887 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7gvgn\" (UniqueName: \"kubernetes.io/projected/c25715bc-fd04-4b79-b6da-892713f85b6c-kube-api-access-7gvgn\") pod \"node-ca-9tmhk\" (UID: \"c25715bc-fd04-4b79-b6da-892713f85b6c\") " pod="openshift-image-registry/node-ca-9tmhk"
Apr 22 15:08:19.132097 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.131902 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-host-cni-bin\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md"
Apr 22 15:08:19.132097 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.131950 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e6729570-df4a-4419-b11f-1b1f782967bd-run\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r"
Apr 22 15:08:19.132097 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.131972 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-run-ovn\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md"
Apr 22 15:08:19.132097 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.131997 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e6729570-df4a-4419-b11f-1b1f782967bd-etc-kubernetes\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r"
Apr 22 15:08:19.132097 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.131994 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-host-cni-bin\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md"
Apr 22 15:08:19.132097 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132016 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e6729570-df4a-4419-b11f-1b1f782967bd-run\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r"
Apr 22 15:08:19.132097 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132024 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e6729570-df4a-4419-b11f-1b1f782967bd-etc-sysctl-conf\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r"
Apr 22 15:08:19.132097 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132020 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e6729570-df4a-4419-b11f-1b1f782967bd-sys\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r"
Apr 22 15:08:19.132097 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132068 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e6729570-df4a-4419-b11f-1b1f782967bd-etc-kubernetes\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r"
Apr 22 15:08:19.132097 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132077 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e6729570-df4a-4419-b11f-1b1f782967bd-var-lib-kubelet\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r"
Apr 22 15:08:19.132097 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132069 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e6729570-df4a-4419-b11f-1b1f782967bd-sys\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r"
Apr 22 15:08:19.132097 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132099 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c25715bc-fd04-4b79-b6da-892713f85b6c-serviceca\") pod \"node-ca-9tmhk\" (UID: \"c25715bc-fd04-4b79-b6da-892713f85b6c\") " pod="openshift-image-registry/node-ca-9tmhk"
Apr 22 15:08:19.132700 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132069 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-run-ovn\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md"
Apr 22 15:08:19.132700 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132123 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-host-kubelet\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md"
Apr 22 15:08:19.132700 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132168 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-host-kubelet\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md"
Apr 22 15:08:19.132700 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132174 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-host-cni-netd\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md"
Apr 22 15:08:19.132700 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132124 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e6729570-df4a-4419-b11f-1b1f782967bd-var-lib-kubelet\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r"
Apr 22 15:08:19.132700 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132211 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-host-cni-netd\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md"
Apr 22 15:08:19.132700 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132218 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-host-slash\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md"
Apr 22 15:08:19.132700 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132248 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-host-slash\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md"
Apr 22 15:08:19.132700 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132276 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-host-run-netns\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md"
Apr 22 15:08:19.132700 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132296 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e6729570-df4a-4419-b11f-1b1f782967bd-tmp\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r"
Apr 22 15:08:19.132700 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132312 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6dnk4\" (UniqueName: \"kubernetes.io/projected/e6729570-df4a-4419-b11f-1b1f782967bd-kube-api-access-6dnk4\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r"
Apr 22 15:08:19.132700 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132327 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-var-lib-openvswitch\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md"
Apr 22 15:08:19.132700 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132345 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-run-openvswitch\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md"
Apr 22 15:08:19.132700 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132378 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-run-openvswitch\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md"
Apr 22 15:08:19.132700 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132409 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-var-lib-openvswitch\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md"
Apr 22 15:08:19.132700 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132426 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-host-run-netns\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md"
Apr 22 15:08:19.132700 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132453 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md"
Apr 22 15:08:19.133230 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132480 2569
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-etc-openvswitch\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" Apr 22 15:08:19.133230 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132508 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c25715bc-fd04-4b79-b6da-892713f85b6c-serviceca\") pod \"node-ca-9tmhk\" (UID: \"c25715bc-fd04-4b79-b6da-892713f85b6c\") " pod="openshift-image-registry/node-ca-9tmhk" Apr 22 15:08:19.133230 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132506 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" Apr 22 15:08:19.133230 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132540 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-etc-openvswitch\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" Apr 22 15:08:19.133230 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132545 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-host-run-ovn-kubernetes\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" Apr 22 15:08:19.133230 ip-10-0-141-246 
kubenswrapper[2569]: I0422 15:08:19.132513 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-host-run-ovn-kubernetes\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" Apr 22 15:08:19.133230 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132584 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a686392c-d33c-438a-ba47-40397c16e97c-ovnkube-config\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" Apr 22 15:08:19.133230 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132600 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a686392c-d33c-438a-ba47-40397c16e97c-env-overrides\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" Apr 22 15:08:19.133230 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132622 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a686392c-d33c-438a-ba47-40397c16e97c-ovn-node-metrics-cert\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" Apr 22 15:08:19.133230 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132663 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a686392c-d33c-438a-ba47-40397c16e97c-ovnkube-script-lib\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" Apr 22 15:08:19.133230 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132707 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e6729570-df4a-4419-b11f-1b1f782967bd-etc-systemd\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r" Apr 22 15:08:19.133230 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132731 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-systemd-units\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" Apr 22 15:08:19.133230 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132755 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t46hv\" (UniqueName: \"kubernetes.io/projected/a686392c-d33c-438a-ba47-40397c16e97c-kube-api-access-t46hv\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" Apr 22 15:08:19.133230 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132794 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0af3cc76-ae15-4ad4-b096-3b3dacd3e370-konnectivity-ca\") pod \"konnectivity-agent-z8m2f\" (UID: \"0af3cc76-ae15-4ad4-b096-3b3dacd3e370\") " pod="kube-system/konnectivity-agent-z8m2f" Apr 22 15:08:19.133230 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132818 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e6729570-df4a-4419-b11f-1b1f782967bd-host\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " 
pod="openshift-cluster-node-tuning-operator/tuned-ckl4r" Apr 22 15:08:19.133230 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132875 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e6729570-df4a-4419-b11f-1b1f782967bd-etc-systemd\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r" Apr 22 15:08:19.133230 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132873 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e6729570-df4a-4419-b11f-1b1f782967bd-host\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r" Apr 22 15:08:19.133882 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132918 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-run-systemd\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" Apr 22 15:08:19.133882 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132944 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-systemd-units\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" Apr 22 15:08:19.133882 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132954 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0af3cc76-ae15-4ad4-b096-3b3dacd3e370-agent-certs\") pod \"konnectivity-agent-z8m2f\" (UID: \"0af3cc76-ae15-4ad4-b096-3b3dacd3e370\") " 
pod="kube-system/konnectivity-agent-z8m2f" Apr 22 15:08:19.133882 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.132981 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c25715bc-fd04-4b79-b6da-892713f85b6c-host\") pod \"node-ca-9tmhk\" (UID: \"c25715bc-fd04-4b79-b6da-892713f85b6c\") " pod="openshift-image-registry/node-ca-9tmhk" Apr 22 15:08:19.133882 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.133005 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-node-log\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" Apr 22 15:08:19.133882 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.133048 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e6729570-df4a-4419-b11f-1b1f782967bd-etc-modprobe-d\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r" Apr 22 15:08:19.133882 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.133062 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-run-systemd\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" Apr 22 15:08:19.133882 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.133084 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a686392c-d33c-438a-ba47-40397c16e97c-env-overrides\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" Apr 22 
15:08:19.133882 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.133113 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e6729570-df4a-4419-b11f-1b1f782967bd-etc-sysconfig\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r" Apr 22 15:08:19.133882 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.133140 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e6729570-df4a-4419-b11f-1b1f782967bd-etc-sysctl-d\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r" Apr 22 15:08:19.133882 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.133165 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-log-socket\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" Apr 22 15:08:19.133882 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.133197 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e6729570-df4a-4419-b11f-1b1f782967bd-lib-modules\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r" Apr 22 15:08:19.133882 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.133384 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e6729570-df4a-4419-b11f-1b1f782967bd-lib-modules\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r" Apr 22 15:08:19.133882 
ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.133444 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e6729570-df4a-4419-b11f-1b1f782967bd-etc-sysconfig\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r" Apr 22 15:08:19.133882 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.133467 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e6729570-df4a-4419-b11f-1b1f782967bd-etc-modprobe-d\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r" Apr 22 15:08:19.133882 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.133492 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0af3cc76-ae15-4ad4-b096-3b3dacd3e370-konnectivity-ca\") pod \"konnectivity-agent-z8m2f\" (UID: \"0af3cc76-ae15-4ad4-b096-3b3dacd3e370\") " pod="kube-system/konnectivity-agent-z8m2f" Apr 22 15:08:19.133882 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.133536 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e6729570-df4a-4419-b11f-1b1f782967bd-etc-sysctl-d\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r" Apr 22 15:08:19.133882 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.133559 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-log-socket\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" Apr 22 15:08:19.134497 ip-10-0-141-246 kubenswrapper[2569]: 
I0422 15:08:19.133580 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a686392c-d33c-438a-ba47-40397c16e97c-ovnkube-script-lib\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" Apr 22 15:08:19.134497 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.133583 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c25715bc-fd04-4b79-b6da-892713f85b6c-host\") pod \"node-ca-9tmhk\" (UID: \"c25715bc-fd04-4b79-b6da-892713f85b6c\") " pod="openshift-image-registry/node-ca-9tmhk" Apr 22 15:08:19.134497 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.133619 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a686392c-d33c-438a-ba47-40397c16e97c-node-log\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" Apr 22 15:08:19.134497 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.133996 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a686392c-d33c-438a-ba47-40397c16e97c-ovnkube-config\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" Apr 22 15:08:19.134737 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.134716 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e6729570-df4a-4419-b11f-1b1f782967bd-etc-tuned\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r" Apr 22 15:08:19.135128 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.135101 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e6729570-df4a-4419-b11f-1b1f782967bd-tmp\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r" Apr 22 15:08:19.135864 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.135832 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0af3cc76-ae15-4ad4-b096-3b3dacd3e370-agent-certs\") pod \"konnectivity-agent-z8m2f\" (UID: \"0af3cc76-ae15-4ad4-b096-3b3dacd3e370\") " pod="kube-system/konnectivity-agent-z8m2f" Apr 22 15:08:19.136237 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.136217 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a686392c-d33c-438a-ba47-40397c16e97c-ovn-node-metrics-cert\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" Apr 22 15:08:19.145183 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.145132 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gvgn\" (UniqueName: \"kubernetes.io/projected/c25715bc-fd04-4b79-b6da-892713f85b6c-kube-api-access-7gvgn\") pod \"node-ca-9tmhk\" (UID: \"c25715bc-fd04-4b79-b6da-892713f85b6c\") " pod="openshift-image-registry/node-ca-9tmhk" Apr 22 15:08:19.145255 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.145231 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t46hv\" (UniqueName: \"kubernetes.io/projected/a686392c-d33c-438a-ba47-40397c16e97c-kube-api-access-t46hv\") pod \"ovnkube-node-cq6md\" (UID: \"a686392c-d33c-438a-ba47-40397c16e97c\") " pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" Apr 22 15:08:19.145378 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.145351 2569 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-6dnk4\" (UniqueName: \"kubernetes.io/projected/e6729570-df4a-4419-b11f-1b1f782967bd-kube-api-access-6dnk4\") pod \"tuned-ckl4r\" (UID: \"e6729570-df4a-4419-b11f-1b1f782967bd\") " pod="openshift-cluster-node-tuning-operator/tuned-ckl4r" Apr 22 15:08:19.224181 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.224140 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dgt4k" Apr 22 15:08:19.231671 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.231650 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5qlng" Apr 22 15:08:19.239472 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.239447 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvchr" Apr 22 15:08:19.244089 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.244063 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2bfnk" Apr 22 15:08:19.250548 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.250530 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qpxv8" Apr 22 15:08:19.258173 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.258148 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" Apr 22 15:08:19.265724 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.265700 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-z8m2f" Apr 22 15:08:19.271289 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.271268 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ckl4r" Apr 22 15:08:19.276752 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.276733 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9tmhk" Apr 22 15:08:19.304165 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.304142 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:08:19.501960 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.501882 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:08:19.537297 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.537265 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d868504-055f-463c-b932-801175d669c7-metrics-certs\") pod \"network-metrics-daemon-hzm72\" (UID: \"9d868504-055f-463c-b932-801175d669c7\") " pod="openshift-multus/network-metrics-daemon-hzm72" Apr 22 15:08:19.537441 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:19.537399 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:19.537491 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:19.537459 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d868504-055f-463c-b932-801175d669c7-metrics-certs podName:9d868504-055f-463c-b932-801175d669c7 nodeName:}" failed. No retries permitted until 2026-04-22 15:08:20.53744486 +0000 UTC m=+4.074616569 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d868504-055f-463c-b932-801175d669c7-metrics-certs") pod "network-metrics-daemon-hzm72" (UID: "9d868504-055f-463c-b932-801175d669c7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:19.637610 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.637576 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xmkpt\" (UniqueName: \"kubernetes.io/projected/987b673f-d105-40ae-8ec9-b8dab14f068f-kube-api-access-xmkpt\") pod \"network-check-target-wfpb4\" (UID: \"987b673f-d105-40ae-8ec9-b8dab14f068f\") " pod="openshift-network-diagnostics/network-check-target-wfpb4" Apr 22 15:08:19.637799 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:19.637775 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:08:19.637862 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:19.637802 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:08:19.637862 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:19.637814 2569 projected.go:194] Error preparing data for projected volume kube-api-access-xmkpt for pod openshift-network-diagnostics/network-check-target-wfpb4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:19.637964 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:19.637873 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/987b673f-d105-40ae-8ec9-b8dab14f068f-kube-api-access-xmkpt podName:987b673f-d105-40ae-8ec9-b8dab14f068f nodeName:}" failed. 
No retries permitted until 2026-04-22 15:08:20.63785339 +0000 UTC m=+4.175025096 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-xmkpt" (UniqueName: "kubernetes.io/projected/987b673f-d105-40ae-8ec9-b8dab14f068f-kube-api-access-xmkpt") pod "network-check-target-wfpb4" (UID: "987b673f-d105-40ae-8ec9-b8dab14f068f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:19.675322 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:19.675305 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda686392c_d33c_438a_ba47_40397c16e97c.slice/crio-941da21c7334997a42c8bc8f710f4efc3d55cd44f8a1d57342753093347cf65a WatchSource:0}: Error finding container 941da21c7334997a42c8bc8f710f4efc3d55cd44f8a1d57342753093347cf65a: Status 404 returned error can't find the container with id 941da21c7334997a42c8bc8f710f4efc3d55cd44f8a1d57342753093347cf65a Apr 22 15:08:19.676604 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:19.676581 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda59df43f_31be_4ae9_be3b_b10d9d017c59.slice/crio-6428d1672030707dcf3a06f79d3fb7727f554a8fcfb2ea29c0ec3c0b014bd94b WatchSource:0}: Error finding container 6428d1672030707dcf3a06f79d3fb7727f554a8fcfb2ea29c0ec3c0b014bd94b: Status 404 returned error can't find the container with id 6428d1672030707dcf3a06f79d3fb7727f554a8fcfb2ea29c0ec3c0b014bd94b Apr 22 15:08:19.678430 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:19.678405 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0af3cc76_ae15_4ad4_b096_3b3dacd3e370.slice/crio-48fa9a3ab1e65f23ea8e6d20dae59e3a053bcc58126c2821eeef910515600113 WatchSource:0}: Error finding container 
48fa9a3ab1e65f23ea8e6d20dae59e3a053bcc58126c2821eeef910515600113: Status 404 returned error can't find the container with id 48fa9a3ab1e65f23ea8e6d20dae59e3a053bcc58126c2821eeef910515600113 Apr 22 15:08:19.681005 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:19.680977 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49c61d54_9288_4695_8a61_293644b9038e.slice/crio-7adb7805a6848233b30cac3acf8e0eada9dbafc25dd9c86b8d71962de7fdf913 WatchSource:0}: Error finding container 7adb7805a6848233b30cac3acf8e0eada9dbafc25dd9c86b8d71962de7fdf913: Status 404 returned error can't find the container with id 7adb7805a6848233b30cac3acf8e0eada9dbafc25dd9c86b8d71962de7fdf913 Apr 22 15:08:19.681923 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:19.681899 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod543bfa0b_0351_4992_892b_fe4be8b7eb4c.slice/crio-26ba3102af68b9752eea3c7c692803e5648da75f6badc1a8177d64779cc4ae76 WatchSource:0}: Error finding container 26ba3102af68b9752eea3c7c692803e5648da75f6badc1a8177d64779cc4ae76: Status 404 returned error can't find the container with id 26ba3102af68b9752eea3c7c692803e5648da75f6badc1a8177d64779cc4ae76 Apr 22 15:08:19.687276 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:19.686955 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9dcdab4_58d9_45fa_a9c1_d04589ab8abe.slice/crio-8fdfcee1d45c5306c141e07e2f71b9052f96fd4c7236432c0ffcf3f824b1c3dc WatchSource:0}: Error finding container 8fdfcee1d45c5306c141e07e2f71b9052f96fd4c7236432c0ffcf3f824b1c3dc: Status 404 returned error can't find the container with id 8fdfcee1d45c5306c141e07e2f71b9052f96fd4c7236432c0ffcf3f824b1c3dc Apr 22 15:08:19.809721 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.809696 2569 reflector.go:430] "Caches populated" 
type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:08:19.963728 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.963670 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 15:03:17 +0000 UTC" deadline="2028-01-29 19:35:42.276518972 +0000 UTC" Apr 22 15:08:19.963728 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:19.963722 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15532h27m22.312800226s" Apr 22 15:08:20.053939 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:20.053855 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzm72" Apr 22 15:08:20.054088 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:20.053992 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hzm72" podUID="9d868504-055f-463c-b932-801175d669c7" Apr 22 15:08:20.061383 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:20.061215 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dgt4k" event={"ID":"b42b90f9-6d5b-4342-979c-184c4620abb7","Type":"ContainerStarted","Data":"9519b58d0a4c84e3fc10073c0a0b0ffebad3ccf71915044608d696c8f5638b11"} Apr 22 15:08:20.062901 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:20.062870 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5qlng" event={"ID":"a59df43f-31be-4ae9-be3b-b10d9d017c59","Type":"ContainerStarted","Data":"6428d1672030707dcf3a06f79d3fb7727f554a8fcfb2ea29c0ec3c0b014bd94b"} Apr 22 15:08:20.064351 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:20.064324 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" event={"ID":"a686392c-d33c-438a-ba47-40397c16e97c","Type":"ContainerStarted","Data":"941da21c7334997a42c8bc8f710f4efc3d55cd44f8a1d57342753093347cf65a"} Apr 22 15:08:20.065636 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:20.065612 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvchr" event={"ID":"d9dcdab4-58d9-45fa-a9c1-d04589ab8abe","Type":"ContainerStarted","Data":"8fdfcee1d45c5306c141e07e2f71b9052f96fd4c7236432c0ffcf3f824b1c3dc"} Apr 22 15:08:20.066810 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:20.066784 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9tmhk" event={"ID":"c25715bc-fd04-4b79-b6da-892713f85b6c","Type":"ContainerStarted","Data":"88fdf3961c66e60eada82903ebc834fb6d6726f4b18339e4394044b05374f2dc"} Apr 22 15:08:20.067990 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:20.067967 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2bfnk" 
event={"ID":"543bfa0b-0351-4992-892b-fe4be8b7eb4c","Type":"ContainerStarted","Data":"26ba3102af68b9752eea3c7c692803e5648da75f6badc1a8177d64779cc4ae76"} Apr 22 15:08:20.069811 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:20.069778 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qpxv8" event={"ID":"49c61d54-9288-4695-8a61-293644b9038e","Type":"ContainerStarted","Data":"7adb7805a6848233b30cac3acf8e0eada9dbafc25dd9c86b8d71962de7fdf913"} Apr 22 15:08:20.070911 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:20.070875 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-z8m2f" event={"ID":"0af3cc76-ae15-4ad4-b096-3b3dacd3e370","Type":"ContainerStarted","Data":"48fa9a3ab1e65f23ea8e6d20dae59e3a053bcc58126c2821eeef910515600113"} Apr 22 15:08:20.073766 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:20.073713 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-246.ec2.internal" event={"ID":"d3eeb1bf222524c1163d0f72b3f74e11","Type":"ContainerStarted","Data":"127f5ac5020eb7c6f281dc4bed6151652bf64258f8e083e6b15467e96c6e8bc5"} Apr 22 15:08:20.076748 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:20.076713 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ckl4r" event={"ID":"e6729570-df4a-4419-b11f-1b1f782967bd","Type":"ContainerStarted","Data":"99885ecaaad0f148719db7135b218ab58dcf7e75163b019fb9c1c83ac9cf3d15"} Apr 22 15:08:20.089191 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:20.089139 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-246.ec2.internal" podStartSLOduration=2.089123522 podStartE2EDuration="2.089123522s" podCreationTimestamp="2026-04-22 15:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-22 15:08:20.088923708 +0000 UTC m=+3.626095423" watchObservedRunningTime="2026-04-22 15:08:20.089123522 +0000 UTC m=+3.626295238" Apr 22 15:08:20.545941 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:20.545903 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d868504-055f-463c-b932-801175d669c7-metrics-certs\") pod \"network-metrics-daemon-hzm72\" (UID: \"9d868504-055f-463c-b932-801175d669c7\") " pod="openshift-multus/network-metrics-daemon-hzm72" Apr 22 15:08:20.546149 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:20.546094 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:20.546214 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:20.546162 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d868504-055f-463c-b932-801175d669c7-metrics-certs podName:9d868504-055f-463c-b932-801175d669c7 nodeName:}" failed. No retries permitted until 2026-04-22 15:08:22.546141327 +0000 UTC m=+6.083313036 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d868504-055f-463c-b932-801175d669c7-metrics-certs") pod "network-metrics-daemon-hzm72" (UID: "9d868504-055f-463c-b932-801175d669c7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:20.646947 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:20.646886 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xmkpt\" (UniqueName: \"kubernetes.io/projected/987b673f-d105-40ae-8ec9-b8dab14f068f-kube-api-access-xmkpt\") pod \"network-check-target-wfpb4\" (UID: \"987b673f-d105-40ae-8ec9-b8dab14f068f\") " pod="openshift-network-diagnostics/network-check-target-wfpb4" Apr 22 15:08:20.647120 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:20.647101 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:08:20.647200 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:20.647127 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:08:20.647200 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:20.647139 2569 projected.go:194] Error preparing data for projected volume kube-api-access-xmkpt for pod openshift-network-diagnostics/network-check-target-wfpb4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:20.647200 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:20.647198 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/987b673f-d105-40ae-8ec9-b8dab14f068f-kube-api-access-xmkpt podName:987b673f-d105-40ae-8ec9-b8dab14f068f nodeName:}" failed. 
No retries permitted until 2026-04-22 15:08:22.647180385 +0000 UTC m=+6.184352081 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-xmkpt" (UniqueName: "kubernetes.io/projected/987b673f-d105-40ae-8ec9-b8dab14f068f-kube-api-access-xmkpt") pod "network-check-target-wfpb4" (UID: "987b673f-d105-40ae-8ec9-b8dab14f068f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:21.056330 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:21.055836 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wfpb4" Apr 22 15:08:21.056330 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:21.055961 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wfpb4" podUID="987b673f-d105-40ae-8ec9-b8dab14f068f" Apr 22 15:08:21.090901 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:21.090868 2569 generic.go:358] "Generic (PLEG): container finished" podID="99a1765eee2014b16836b13b3ff3a7f3" containerID="fb7c18d08ace901c65a11ab7aa453cc7677eee6afb3e8c95e339d7cd48a2b178" exitCode=0 Apr 22 15:08:21.091872 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:21.091847 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-246.ec2.internal" event={"ID":"99a1765eee2014b16836b13b3ff3a7f3","Type":"ContainerDied","Data":"fb7c18d08ace901c65a11ab7aa453cc7677eee6afb3e8c95e339d7cd48a2b178"} Apr 22 15:08:22.053595 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:22.053560 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzm72" Apr 22 15:08:22.053837 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:22.053706 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hzm72" podUID="9d868504-055f-463c-b932-801175d669c7" Apr 22 15:08:22.110472 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:22.110437 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-246.ec2.internal" event={"ID":"99a1765eee2014b16836b13b3ff3a7f3","Type":"ContainerStarted","Data":"7eb5eba08a5021c752321d8e34ae84002bdd616c4ca4d672e4a4bbff61b591fc"} Apr 22 15:08:22.560738 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:22.560700 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d868504-055f-463c-b932-801175d669c7-metrics-certs\") pod \"network-metrics-daemon-hzm72\" (UID: \"9d868504-055f-463c-b932-801175d669c7\") " pod="openshift-multus/network-metrics-daemon-hzm72" Apr 22 15:08:22.560922 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:22.560858 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:22.560988 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:22.560927 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d868504-055f-463c-b932-801175d669c7-metrics-certs podName:9d868504-055f-463c-b932-801175d669c7 nodeName:}" failed. No retries permitted until 2026-04-22 15:08:26.560908332 +0000 UTC m=+10.098080026 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d868504-055f-463c-b932-801175d669c7-metrics-certs") pod "network-metrics-daemon-hzm72" (UID: "9d868504-055f-463c-b932-801175d669c7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:22.661695 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:22.660990 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xmkpt\" (UniqueName: \"kubernetes.io/projected/987b673f-d105-40ae-8ec9-b8dab14f068f-kube-api-access-xmkpt\") pod \"network-check-target-wfpb4\" (UID: \"987b673f-d105-40ae-8ec9-b8dab14f068f\") " pod="openshift-network-diagnostics/network-check-target-wfpb4" Apr 22 15:08:22.661695 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:22.661213 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:08:22.661695 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:22.661235 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:08:22.661695 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:22.661248 2569 projected.go:194] Error preparing data for projected volume kube-api-access-xmkpt for pod openshift-network-diagnostics/network-check-target-wfpb4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:22.661695 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:22.661311 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/987b673f-d105-40ae-8ec9-b8dab14f068f-kube-api-access-xmkpt podName:987b673f-d105-40ae-8ec9-b8dab14f068f nodeName:}" failed. 
No retries permitted until 2026-04-22 15:08:26.661289318 +0000 UTC m=+10.198461029 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-xmkpt" (UniqueName: "kubernetes.io/projected/987b673f-d105-40ae-8ec9-b8dab14f068f-kube-api-access-xmkpt") pod "network-check-target-wfpb4" (UID: "987b673f-d105-40ae-8ec9-b8dab14f068f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:23.055361 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:23.054857 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wfpb4" Apr 22 15:08:23.055361 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:23.055002 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wfpb4" podUID="987b673f-d105-40ae-8ec9-b8dab14f068f" Apr 22 15:08:24.053282 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:24.053247 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzm72" Apr 22 15:08:24.053766 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:24.053397 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hzm72" podUID="9d868504-055f-463c-b932-801175d669c7" Apr 22 15:08:25.053553 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:25.053520 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wfpb4" Apr 22 15:08:25.053970 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:25.053650 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wfpb4" podUID="987b673f-d105-40ae-8ec9-b8dab14f068f" Apr 22 15:08:26.053746 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:26.053701 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzm72" Apr 22 15:08:26.054172 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:26.053910 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hzm72" podUID="9d868504-055f-463c-b932-801175d669c7" Apr 22 15:08:26.594780 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:26.594739 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d868504-055f-463c-b932-801175d669c7-metrics-certs\") pod \"network-metrics-daemon-hzm72\" (UID: \"9d868504-055f-463c-b932-801175d669c7\") " pod="openshift-multus/network-metrics-daemon-hzm72" Apr 22 15:08:26.595037 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:26.594996 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:26.595115 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:26.595102 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d868504-055f-463c-b932-801175d669c7-metrics-certs podName:9d868504-055f-463c-b932-801175d669c7 nodeName:}" failed. No retries permitted until 2026-04-22 15:08:34.595080866 +0000 UTC m=+18.132252559 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d868504-055f-463c-b932-801175d669c7-metrics-certs") pod "network-metrics-daemon-hzm72" (UID: "9d868504-055f-463c-b932-801175d669c7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:26.695935 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:26.695868 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xmkpt\" (UniqueName: \"kubernetes.io/projected/987b673f-d105-40ae-8ec9-b8dab14f068f-kube-api-access-xmkpt\") pod \"network-check-target-wfpb4\" (UID: \"987b673f-d105-40ae-8ec9-b8dab14f068f\") " pod="openshift-network-diagnostics/network-check-target-wfpb4" Apr 22 15:08:26.696111 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:26.696064 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:08:26.696111 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:26.696091 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:08:26.696111 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:26.696105 2569 projected.go:194] Error preparing data for projected volume kube-api-access-xmkpt for pod openshift-network-diagnostics/network-check-target-wfpb4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:26.696270 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:26.696172 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/987b673f-d105-40ae-8ec9-b8dab14f068f-kube-api-access-xmkpt podName:987b673f-d105-40ae-8ec9-b8dab14f068f nodeName:}" failed. 
No retries permitted until 2026-04-22 15:08:34.696154391 +0000 UTC m=+18.233326097 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-xmkpt" (UniqueName: "kubernetes.io/projected/987b673f-d105-40ae-8ec9-b8dab14f068f-kube-api-access-xmkpt") pod "network-check-target-wfpb4" (UID: "987b673f-d105-40ae-8ec9-b8dab14f068f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:27.054752 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:27.054656 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wfpb4" Apr 22 15:08:27.055176 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:27.054791 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wfpb4" podUID="987b673f-d105-40ae-8ec9-b8dab14f068f" Apr 22 15:08:28.052881 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:28.052841 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzm72" Apr 22 15:08:28.053058 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:28.052976 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hzm72" podUID="9d868504-055f-463c-b932-801175d669c7" Apr 22 15:08:29.053914 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:29.053833 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wfpb4" Apr 22 15:08:29.054339 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:29.053950 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wfpb4" podUID="987b673f-d105-40ae-8ec9-b8dab14f068f" Apr 22 15:08:30.054015 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:30.053823 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzm72" Apr 22 15:08:30.054403 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:30.054111 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hzm72" podUID="9d868504-055f-463c-b932-801175d669c7" Apr 22 15:08:31.053921 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:31.053879 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wfpb4" Apr 22 15:08:31.054095 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:31.054004 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wfpb4" podUID="987b673f-d105-40ae-8ec9-b8dab14f068f" Apr 22 15:08:32.053345 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:32.053315 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzm72" Apr 22 15:08:32.053498 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:32.053423 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hzm72" podUID="9d868504-055f-463c-b932-801175d669c7" Apr 22 15:08:33.053386 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:33.053357 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wfpb4" Apr 22 15:08:33.053858 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:33.053472 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wfpb4" podUID="987b673f-d105-40ae-8ec9-b8dab14f068f" Apr 22 15:08:34.052925 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:34.052894 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzm72" Apr 22 15:08:34.053223 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:34.053031 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hzm72" podUID="9d868504-055f-463c-b932-801175d669c7" Apr 22 15:08:34.656736 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:34.656694 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d868504-055f-463c-b932-801175d669c7-metrics-certs\") pod \"network-metrics-daemon-hzm72\" (UID: \"9d868504-055f-463c-b932-801175d669c7\") " pod="openshift-multus/network-metrics-daemon-hzm72" Apr 22 15:08:34.657179 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:34.656855 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:34.657179 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:34.656936 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d868504-055f-463c-b932-801175d669c7-metrics-certs podName:9d868504-055f-463c-b932-801175d669c7 nodeName:}" failed. No retries permitted until 2026-04-22 15:08:50.656916571 +0000 UTC m=+34.194088267 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d868504-055f-463c-b932-801175d669c7-metrics-certs") pod "network-metrics-daemon-hzm72" (UID: "9d868504-055f-463c-b932-801175d669c7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:34.758004 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:34.757968 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xmkpt\" (UniqueName: \"kubernetes.io/projected/987b673f-d105-40ae-8ec9-b8dab14f068f-kube-api-access-xmkpt\") pod \"network-check-target-wfpb4\" (UID: \"987b673f-d105-40ae-8ec9-b8dab14f068f\") " pod="openshift-network-diagnostics/network-check-target-wfpb4" Apr 22 15:08:34.758167 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:34.758110 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:08:34.758167 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:34.758126 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:08:34.758167 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:34.758135 2569 projected.go:194] Error preparing data for projected volume kube-api-access-xmkpt for pod openshift-network-diagnostics/network-check-target-wfpb4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:34.758291 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:34.758190 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/987b673f-d105-40ae-8ec9-b8dab14f068f-kube-api-access-xmkpt podName:987b673f-d105-40ae-8ec9-b8dab14f068f nodeName:}" failed. 
No retries permitted until 2026-04-22 15:08:50.758169776 +0000 UTC m=+34.295341471 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-xmkpt" (UniqueName: "kubernetes.io/projected/987b673f-d105-40ae-8ec9-b8dab14f068f-kube-api-access-xmkpt") pod "network-check-target-wfpb4" (UID: "987b673f-d105-40ae-8ec9-b8dab14f068f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:35.053364 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:35.053283 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wfpb4" Apr 22 15:08:35.053506 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:35.053419 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wfpb4" podUID="987b673f-d105-40ae-8ec9-b8dab14f068f" Apr 22 15:08:36.053467 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:36.053429 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzm72" Apr 22 15:08:36.053949 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:36.053577 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hzm72" podUID="9d868504-055f-463c-b932-801175d669c7" Apr 22 15:08:37.054369 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:37.054186 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wfpb4" Apr 22 15:08:37.055335 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:37.054419 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wfpb4" podUID="987b673f-d105-40ae-8ec9-b8dab14f068f" Apr 22 15:08:37.137990 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:37.137958 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" event={"ID":"a686392c-d33c-438a-ba47-40397c16e97c","Type":"ContainerStarted","Data":"4f34c45d81ca9f46b1da390c277f291f8ab94fcbed70f51d88dd43f9bd8c35c7"} Apr 22 15:08:37.138092 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:37.137999 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" event={"ID":"a686392c-d33c-438a-ba47-40397c16e97c","Type":"ContainerStarted","Data":"1ee2d7fb816f50783212b02b5c42e675e23cea5b1d9e23c71c24353da0abce10"} Apr 22 15:08:37.138092 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:37.138012 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" event={"ID":"a686392c-d33c-438a-ba47-40397c16e97c","Type":"ContainerStarted","Data":"20c6624bb58eb712c40be01b0bc8264efce7f8b8201d60b6c828d091945006cc"} Apr 22 15:08:37.138092 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:37.138024 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" event={"ID":"a686392c-d33c-438a-ba47-40397c16e97c","Type":"ContainerStarted","Data":"ff5a0b139cd30a62bc84861411f8bb63bc6f7bd0a2fb242ffdfffd13b122b2d2"} Apr 22 15:08:37.138092 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:37.138036 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" event={"ID":"a686392c-d33c-438a-ba47-40397c16e97c","Type":"ContainerStarted","Data":"d2c2280feb38e511d0e19098c33874057aacfe2ca909c1239608051c386171d2"} Apr 22 15:08:37.139051 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:37.139028 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvchr" event={"ID":"d9dcdab4-58d9-45fa-a9c1-d04589ab8abe","Type":"ContainerStarted","Data":"c133e9afb96dfe961fc3a41c09c2a16559dfb59c98591c95c714eec73f5218e5"} Apr 22 15:08:37.140195 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:37.140173 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9tmhk" event={"ID":"c25715bc-fd04-4b79-b6da-892713f85b6c","Type":"ContainerStarted","Data":"0bebac0b566e62ca53c0aa77997875fb6626b076d218197b71c4a93f0ade0e29"} Apr 22 15:08:37.141310 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:37.141290 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2bfnk" event={"ID":"543bfa0b-0351-4992-892b-fe4be8b7eb4c","Type":"ContainerStarted","Data":"7f46766712947ba0953c6e0898794b39ac12fcf2298b561f00200031e103408f"} Apr 22 15:08:37.142599 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:37.142579 2569 generic.go:358] "Generic (PLEG): container finished" podID="49c61d54-9288-4695-8a61-293644b9038e" containerID="804b59bd8f778a3fe88900a1753d46dcabe6f34b897f8a98fb03f11f86a2b760" exitCode=0 Apr 22 15:08:37.142713 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:37.142635 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-qpxv8" event={"ID":"49c61d54-9288-4695-8a61-293644b9038e","Type":"ContainerDied","Data":"804b59bd8f778a3fe88900a1753d46dcabe6f34b897f8a98fb03f11f86a2b760"} Apr 22 15:08:37.143939 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:37.143877 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-z8m2f" event={"ID":"0af3cc76-ae15-4ad4-b096-3b3dacd3e370","Type":"ContainerStarted","Data":"86667aeb610db3e447695d8ae05fef27babebada4ae63e627710a5ef319cc5e3"} Apr 22 15:08:37.145235 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:37.145209 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ckl4r" event={"ID":"e6729570-df4a-4419-b11f-1b1f782967bd","Type":"ContainerStarted","Data":"4928b721c0955b89ae362e5ce71420d65d954e343b2b4713b84eb856929b317e"} Apr 22 15:08:37.146435 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:37.146416 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dgt4k" event={"ID":"b42b90f9-6d5b-4342-979c-184c4620abb7","Type":"ContainerStarted","Data":"0181e091475006e761a982828ef1df188305d7d91c96721fc5d5a5d3a9b2546d"} Apr 22 15:08:37.159349 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:37.159310 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-246.ec2.internal" podStartSLOduration=19.159279567 podStartE2EDuration="19.159279567s" podCreationTimestamp="2026-04-22 15:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:08:22.129770447 +0000 UTC m=+5.666942163" watchObservedRunningTime="2026-04-22 15:08:37.159279567 +0000 UTC m=+20.696451283" Apr 22 15:08:37.159976 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:37.159950 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/node-ca-9tmhk" podStartSLOduration=3.330292756 podStartE2EDuration="20.15994107s" podCreationTimestamp="2026-04-22 15:08:17 +0000 UTC" firstStartedPulling="2026-04-22 15:08:19.686076942 +0000 UTC m=+3.223248635" lastFinishedPulling="2026-04-22 15:08:36.515725251 +0000 UTC m=+20.052896949" observedRunningTime="2026-04-22 15:08:37.159279803 +0000 UTC m=+20.696451517" watchObservedRunningTime="2026-04-22 15:08:37.15994107 +0000 UTC m=+20.697112784" Apr 22 15:08:37.194383 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:37.194333 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-dgt4k" podStartSLOduration=3.350449828 podStartE2EDuration="20.194319482s" podCreationTimestamp="2026-04-22 15:08:17 +0000 UTC" firstStartedPulling="2026-04-22 15:08:19.685295246 +0000 UTC m=+3.222466945" lastFinishedPulling="2026-04-22 15:08:36.529164893 +0000 UTC m=+20.066336599" observedRunningTime="2026-04-22 15:08:37.178076222 +0000 UTC m=+20.715247938" watchObservedRunningTime="2026-04-22 15:08:37.194319482 +0000 UTC m=+20.731491196" Apr 22 15:08:37.194647 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:37.194619 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-z8m2f" podStartSLOduration=7.765948126 podStartE2EDuration="20.194611026s" podCreationTimestamp="2026-04-22 15:08:17 +0000 UTC" firstStartedPulling="2026-04-22 15:08:19.680564434 +0000 UTC m=+3.217736126" lastFinishedPulling="2026-04-22 15:08:32.109227327 +0000 UTC m=+15.646399026" observedRunningTime="2026-04-22 15:08:37.194297519 +0000 UTC m=+20.731469233" watchObservedRunningTime="2026-04-22 15:08:37.194611026 +0000 UTC m=+20.731782739" Apr 22 15:08:37.215559 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:37.215513 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-2bfnk" podStartSLOduration=3.383768321 
podStartE2EDuration="20.215498065s" podCreationTimestamp="2026-04-22 15:08:17 +0000 UTC" firstStartedPulling="2026-04-22 15:08:19.683888822 +0000 UTC m=+3.221060521" lastFinishedPulling="2026-04-22 15:08:36.51561857 +0000 UTC m=+20.052790265" observedRunningTime="2026-04-22 15:08:37.215311274 +0000 UTC m=+20.752482988" watchObservedRunningTime="2026-04-22 15:08:37.215498065 +0000 UTC m=+20.752669778" Apr 22 15:08:37.315603 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:37.315501 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-ckl4r" podStartSLOduration=3.486051828 podStartE2EDuration="20.315482145s" podCreationTimestamp="2026-04-22 15:08:17 +0000 UTC" firstStartedPulling="2026-04-22 15:08:19.686336269 +0000 UTC m=+3.223507962" lastFinishedPulling="2026-04-22 15:08:36.515766574 +0000 UTC m=+20.052938279" observedRunningTime="2026-04-22 15:08:37.294338048 +0000 UTC m=+20.831509764" watchObservedRunningTime="2026-04-22 15:08:37.315482145 +0000 UTC m=+20.852653864" Apr 22 15:08:37.318110 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:37.318084 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-z8m2f" Apr 22 15:08:37.318721 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:37.318702 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-z8m2f" Apr 22 15:08:37.687344 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:37.687319 2569 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 15:08:37.989000 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:37.988902 2569 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T15:08:37.687340595Z","UUID":"6e57af75-4bf3-4069-a19e-4fb5deed4416","Handler":null,"Name":"","Endpoint":""} Apr 22 15:08:37.991736 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:37.991715 2569 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 15:08:37.991736 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:37.991741 2569 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 15:08:38.053562 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:38.053534 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzm72" Apr 22 15:08:38.053714 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:38.053638 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hzm72" podUID="9d868504-055f-463c-b932-801175d669c7" Apr 22 15:08:38.149559 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:38.149527 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5qlng" event={"ID":"a59df43f-31be-4ae9-be3b-b10d9d017c59","Type":"ContainerStarted","Data":"857e5430a2eb599e6fe0c7e529aff0776c17edf3fc4ef13693c44fc181460691"} Apr 22 15:08:38.152715 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:38.152674 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" event={"ID":"a686392c-d33c-438a-ba47-40397c16e97c","Type":"ContainerStarted","Data":"fb4c067c7f446839d577e16f2b1f1d6a742dc3ef3f93368b7ea56b5af85a45c3"} Apr 22 15:08:38.159060 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:38.156513 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvchr" event={"ID":"d9dcdab4-58d9-45fa-a9c1-d04589ab8abe","Type":"ContainerStarted","Data":"6f39de175ae33e8b37db38191f09e4e3bddec1de2b74b3820befb6a47c8ea098"} Apr 22 15:08:38.159060 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:38.158181 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-z8m2f" Apr 22 15:08:38.159060 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:38.158285 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-z8m2f" Apr 22 15:08:38.194818 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:38.194765 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-5qlng" podStartSLOduration=4.358895158 podStartE2EDuration="21.194750666s" podCreationTimestamp="2026-04-22 15:08:17 +0000 UTC" firstStartedPulling="2026-04-22 15:08:19.679870195 +0000 UTC m=+3.217041891" lastFinishedPulling="2026-04-22 15:08:36.515725695 
+0000 UTC m=+20.052897399" observedRunningTime="2026-04-22 15:08:38.173438725 +0000 UTC m=+21.710610442" watchObservedRunningTime="2026-04-22 15:08:38.194750666 +0000 UTC m=+21.731922381" Apr 22 15:08:39.053878 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:39.053847 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wfpb4" Apr 22 15:08:39.054092 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:39.053953 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wfpb4" podUID="987b673f-d105-40ae-8ec9-b8dab14f068f" Apr 22 15:08:39.159193 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:39.159161 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvchr" event={"ID":"d9dcdab4-58d9-45fa-a9c1-d04589ab8abe","Type":"ContainerStarted","Data":"e33c3d7c72dec3aca20dcb75cfb68be76b770fda8d128cf2c517cbea457c1246"} Apr 22 15:08:39.186506 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:39.186449 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvchr" podStartSLOduration=3.375644075 podStartE2EDuration="22.186431399s" podCreationTimestamp="2026-04-22 15:08:17 +0000 UTC" firstStartedPulling="2026-04-22 15:08:19.688460581 +0000 UTC m=+3.225632273" lastFinishedPulling="2026-04-22 15:08:38.49924789 +0000 UTC m=+22.036419597" observedRunningTime="2026-04-22 15:08:39.186208385 +0000 UTC m=+22.723380101" watchObservedRunningTime="2026-04-22 15:08:39.186431399 +0000 UTC m=+22.723603115" Apr 22 15:08:40.052871 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:40.052840 2569 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzm72" Apr 22 15:08:40.053024 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:40.052953 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hzm72" podUID="9d868504-055f-463c-b932-801175d669c7" Apr 22 15:08:40.164970 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:40.164922 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" event={"ID":"a686392c-d33c-438a-ba47-40397c16e97c","Type":"ContainerStarted","Data":"cbe59af0794c5d8ece8fb5137713d9166b22da1979f087b8f31a59a98c2d95f3"} Apr 22 15:08:41.053438 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:41.053408 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wfpb4" Apr 22 15:08:41.053624 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:41.053536 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wfpb4" podUID="987b673f-d105-40ae-8ec9-b8dab14f068f" Apr 22 15:08:42.053367 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:42.053339 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzm72" Apr 22 15:08:42.053760 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:42.053441 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hzm72" podUID="9d868504-055f-463c-b932-801175d669c7" Apr 22 15:08:42.169658 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:42.169584 2569 generic.go:358] "Generic (PLEG): container finished" podID="49c61d54-9288-4695-8a61-293644b9038e" containerID="6901a9c5f1ff4c511ca06104fc127dc6850196e7d49daee98cb51e50a84bac7b" exitCode=0 Apr 22 15:08:42.169789 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:42.169667 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qpxv8" event={"ID":"49c61d54-9288-4695-8a61-293644b9038e","Type":"ContainerDied","Data":"6901a9c5f1ff4c511ca06104fc127dc6850196e7d49daee98cb51e50a84bac7b"} Apr 22 15:08:42.172981 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:42.172961 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" event={"ID":"a686392c-d33c-438a-ba47-40397c16e97c","Type":"ContainerStarted","Data":"727b636121ef3ad71b85b48925154fd3f6d4d3933649d21bf1009ef4e53355c1"} Apr 22 15:08:42.173273 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:42.173255 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" Apr 22 15:08:42.187509 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:42.187493 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" Apr 22 15:08:43.053762 ip-10-0-141-246 kubenswrapper[2569]: I0422 
15:08:43.053731 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wfpb4" Apr 22 15:08:43.054248 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:43.053884 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wfpb4" podUID="987b673f-d105-40ae-8ec9-b8dab14f068f" Apr 22 15:08:43.174901 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:43.174872 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 15:08:43.175274 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:43.175257 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" Apr 22 15:08:43.188835 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:43.188815 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" Apr 22 15:08:43.229318 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:43.229279 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" podStartSLOduration=9.302926707 podStartE2EDuration="26.2292669s" podCreationTimestamp="2026-04-22 15:08:17 +0000 UTC" firstStartedPulling="2026-04-22 15:08:19.677293452 +0000 UTC m=+3.214465145" lastFinishedPulling="2026-04-22 15:08:36.603633639 +0000 UTC m=+20.140805338" observedRunningTime="2026-04-22 15:08:42.278745233 +0000 UTC m=+25.815916947" watchObservedRunningTime="2026-04-22 15:08:43.2292669 +0000 UTC m=+26.766438595" Apr 22 15:08:44.053625 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:44.053595 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzm72" Apr 22 15:08:44.053829 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:44.053703 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hzm72" podUID="9d868504-055f-463c-b932-801175d669c7" Apr 22 15:08:44.178459 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:44.178425 2569 generic.go:358] "Generic (PLEG): container finished" podID="49c61d54-9288-4695-8a61-293644b9038e" containerID="b73181ac7cd871a0b9aa4f299b123e0a3dd008a40a4598286af3c5933259474e" exitCode=0 Apr 22 15:08:44.178587 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:44.178512 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qpxv8" event={"ID":"49c61d54-9288-4695-8a61-293644b9038e","Type":"ContainerDied","Data":"b73181ac7cd871a0b9aa4f299b123e0a3dd008a40a4598286af3c5933259474e"} Apr 22 15:08:44.178781 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:44.178768 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 15:08:45.053779 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:45.053596 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wfpb4" Apr 22 15:08:45.053933 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:45.053847 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wfpb4" podUID="987b673f-d105-40ae-8ec9-b8dab14f068f" Apr 22 15:08:45.180196 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:45.180171 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 15:08:46.052898 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:46.052869 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzm72" Apr 22 15:08:46.053079 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:46.053024 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hzm72" podUID="9d868504-055f-463c-b932-801175d669c7" Apr 22 15:08:46.184195 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:46.184163 2569 generic.go:358] "Generic (PLEG): container finished" podID="49c61d54-9288-4695-8a61-293644b9038e" containerID="b0cd89ecb43d2713c8217128433be24b631ad199775c725e8871b87769eecc83" exitCode=0 Apr 22 15:08:46.184535 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:46.184219 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qpxv8" event={"ID":"49c61d54-9288-4695-8a61-293644b9038e","Type":"ContainerDied","Data":"b0cd89ecb43d2713c8217128433be24b631ad199775c725e8871b87769eecc83"} Apr 22 15:08:47.054465 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:47.054432 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wfpb4" Apr 22 15:08:47.054652 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:47.054527 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wfpb4" podUID="987b673f-d105-40ae-8ec9-b8dab14f068f" Apr 22 15:08:48.052964 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:48.052921 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzm72" Apr 22 15:08:48.053378 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:48.053069 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hzm72" podUID="9d868504-055f-463c-b932-801175d669c7" Apr 22 15:08:49.016921 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:49.016889 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" Apr 22 15:08:49.017200 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:49.017173 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 15:08:49.031609 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:49.031551 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" podUID="a686392c-d33c-438a-ba47-40397c16e97c" containerName="ovnkube-controller" probeResult="failure" output="" Apr 22 15:08:49.041027 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:49.040995 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-cq6md" podUID="a686392c-d33c-438a-ba47-40397c16e97c" containerName="ovnkube-controller" probeResult="failure" output="" Apr 22 15:08:49.053758 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:49.053737 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wfpb4" Apr 22 15:08:49.054073 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:49.053830 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wfpb4" podUID="987b673f-d105-40ae-8ec9-b8dab14f068f" Apr 22 15:08:50.052872 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:50.052842 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzm72" Apr 22 15:08:50.053148 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:50.052977 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hzm72" podUID="9d868504-055f-463c-b932-801175d669c7" Apr 22 15:08:50.673235 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:50.673200 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d868504-055f-463c-b932-801175d669c7-metrics-certs\") pod \"network-metrics-daemon-hzm72\" (UID: \"9d868504-055f-463c-b932-801175d669c7\") " pod="openshift-multus/network-metrics-daemon-hzm72" Apr 22 15:08:50.673949 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:50.673396 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:50.673949 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:50.673459 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d868504-055f-463c-b932-801175d669c7-metrics-certs podName:9d868504-055f-463c-b932-801175d669c7 nodeName:}" failed. No retries permitted until 2026-04-22 15:09:22.673439368 +0000 UTC m=+66.210611081 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d868504-055f-463c-b932-801175d669c7-metrics-certs") pod "network-metrics-daemon-hzm72" (UID: "9d868504-055f-463c-b932-801175d669c7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:50.774521 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:50.774486 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xmkpt\" (UniqueName: \"kubernetes.io/projected/987b673f-d105-40ae-8ec9-b8dab14f068f-kube-api-access-xmkpt\") pod \"network-check-target-wfpb4\" (UID: \"987b673f-d105-40ae-8ec9-b8dab14f068f\") " pod="openshift-network-diagnostics/network-check-target-wfpb4" Apr 22 15:08:50.774699 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:50.774652 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:08:50.774699 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:50.774669 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:08:50.774813 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:50.774701 2569 projected.go:194] Error preparing data for projected volume kube-api-access-xmkpt for pod openshift-network-diagnostics/network-check-target-wfpb4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:50.774813 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:50.774753 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/987b673f-d105-40ae-8ec9-b8dab14f068f-kube-api-access-xmkpt podName:987b673f-d105-40ae-8ec9-b8dab14f068f nodeName:}" failed. 
No retries permitted until 2026-04-22 15:09:22.774740164 +0000 UTC m=+66.311911856 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-xmkpt" (UniqueName: "kubernetes.io/projected/987b673f-d105-40ae-8ec9-b8dab14f068f-kube-api-access-xmkpt") pod "network-check-target-wfpb4" (UID: "987b673f-d105-40ae-8ec9-b8dab14f068f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 15:08:50.975612 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:50.975514 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-wfpb4"]
Apr 22 15:08:50.975790 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:50.975707 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wfpb4"
Apr 22 15:08:50.975850 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:50.975812 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wfpb4" podUID="987b673f-d105-40ae-8ec9-b8dab14f068f"
Apr 22 15:08:50.976253 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:50.976230 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hzm72"]
Apr 22 15:08:50.976364 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:50.976349 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzm72"
Apr 22 15:08:50.976472 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:50.976447 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hzm72" podUID="9d868504-055f-463c-b932-801175d669c7"
Apr 22 15:08:52.052866 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:52.052837 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wfpb4"
Apr 22 15:08:52.053453 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:52.052961 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wfpb4" podUID="987b673f-d105-40ae-8ec9-b8dab14f068f"
Apr 22 15:08:52.214463 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:52.214113 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qpxv8" event={"ID":"49c61d54-9288-4695-8a61-293644b9038e","Type":"ContainerStarted","Data":"1664fcf3ce8eef9493c9c19d1b77e698fe19d6094f40510cf11ee04a56c4dda5"}
Apr 22 15:08:53.052933 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:53.052856 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzm72"
Apr 22 15:08:53.053265 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:53.052972 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hzm72" podUID="9d868504-055f-463c-b932-801175d669c7"
Apr 22 15:08:53.218002 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:53.217972 2569 generic.go:358] "Generic (PLEG): container finished" podID="49c61d54-9288-4695-8a61-293644b9038e" containerID="1664fcf3ce8eef9493c9c19d1b77e698fe19d6094f40510cf11ee04a56c4dda5" exitCode=0
Apr 22 15:08:53.218153 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:53.218038 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qpxv8" event={"ID":"49c61d54-9288-4695-8a61-293644b9038e","Type":"ContainerDied","Data":"1664fcf3ce8eef9493c9c19d1b77e698fe19d6094f40510cf11ee04a56c4dda5"}
Apr 22 15:08:54.052919 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.052885 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wfpb4"
Apr 22 15:08:54.053069 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:54.052986 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wfpb4" podUID="987b673f-d105-40ae-8ec9-b8dab14f068f"
Apr 22 15:08:54.222717 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.222673 2569 generic.go:358] "Generic (PLEG): container finished" podID="49c61d54-9288-4695-8a61-293644b9038e" containerID="1171bccd74ff721845b97bc3ce55fd436b4f975a53362374568a58445f2a02a1" exitCode=0
Apr 22 15:08:54.222864 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.222720 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qpxv8" event={"ID":"49c61d54-9288-4695-8a61-293644b9038e","Type":"ContainerDied","Data":"1171bccd74ff721845b97bc3ce55fd436b4f975a53362374568a58445f2a02a1"}
Apr 22 15:08:54.271903 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.271879 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-246.ec2.internal" event="NodeReady"
Apr 22 15:08:54.272109 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.272036 2569 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 22 15:08:54.312588 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.312565 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5cdb9c7cd9-75b2w"]
Apr 22 15:08:54.322808 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.322790 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5cdb9c7cd9-75b2w"
Apr 22 15:08:54.327079 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.327052 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 22 15:08:54.327079 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.327075 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 22 15:08:54.327569 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.327433 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 22 15:08:54.327569 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.327456 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-jdzns\""
Apr 22 15:08:54.330172 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.330150 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5cdb9c7cd9-75b2w"]
Apr 22 15:08:54.353382 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.353360 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 22 15:08:54.356959 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.356940 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-76xbb"]
Apr 22 15:08:54.376676 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.376653 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-nrz6v"]
Apr 22 15:08:54.376830 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.376805 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-76xbb"
Apr 22 15:08:54.379697 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.379654 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 22 15:08:54.379956 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.379941 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 22 15:08:54.380846 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.380830 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-62nkc\""
Apr 22 15:08:54.398080 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.398061 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-76xbb"]
Apr 22 15:08:54.398168 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.398086 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-nrz6v"]
Apr 22 15:08:54.398168 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.398166 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-nrz6v"
Apr 22 15:08:54.401459 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.401440 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xqxh9\""
Apr 22 15:08:54.401561 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.401489 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9de5e468-cf8f-4904-9728-9f97bf669789-installation-pull-secrets\") pod \"image-registry-5cdb9c7cd9-75b2w\" (UID: \"9de5e468-cf8f-4904-9728-9f97bf669789\") " pod="openshift-image-registry/image-registry-5cdb9c7cd9-75b2w"
Apr 22 15:08:54.401561 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.401535 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9de5e468-cf8f-4904-9728-9f97bf669789-registry-tls\") pod \"image-registry-5cdb9c7cd9-75b2w\" (UID: \"9de5e468-cf8f-4904-9728-9f97bf669789\") " pod="openshift-image-registry/image-registry-5cdb9c7cd9-75b2w"
Apr 22 15:08:54.401668 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.401571 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9de5e468-cf8f-4904-9728-9f97bf669789-image-registry-private-configuration\") pod \"image-registry-5cdb9c7cd9-75b2w\" (UID: \"9de5e468-cf8f-4904-9728-9f97bf669789\") " pod="openshift-image-registry/image-registry-5cdb9c7cd9-75b2w"
Apr 22 15:08:54.401668 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.401597 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9de5e468-cf8f-4904-9728-9f97bf669789-registry-certificates\") pod \"image-registry-5cdb9c7cd9-75b2w\" (UID: \"9de5e468-cf8f-4904-9728-9f97bf669789\") " pod="openshift-image-registry/image-registry-5cdb9c7cd9-75b2w"
Apr 22 15:08:54.401668 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.401623 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9de5e468-cf8f-4904-9728-9f97bf669789-trusted-ca\") pod \"image-registry-5cdb9c7cd9-75b2w\" (UID: \"9de5e468-cf8f-4904-9728-9f97bf669789\") " pod="openshift-image-registry/image-registry-5cdb9c7cd9-75b2w"
Apr 22 15:08:54.401858 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.401670 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9de5e468-cf8f-4904-9728-9f97bf669789-bound-sa-token\") pod \"image-registry-5cdb9c7cd9-75b2w\" (UID: \"9de5e468-cf8f-4904-9728-9f97bf669789\") " pod="openshift-image-registry/image-registry-5cdb9c7cd9-75b2w"
Apr 22 15:08:54.401858 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.401745 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwkj7\" (UniqueName: \"kubernetes.io/projected/9de5e468-cf8f-4904-9728-9f97bf669789-kube-api-access-wwkj7\") pod \"image-registry-5cdb9c7cd9-75b2w\" (UID: \"9de5e468-cf8f-4904-9728-9f97bf669789\") " pod="openshift-image-registry/image-registry-5cdb9c7cd9-75b2w"
Apr 22 15:08:54.401858 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.401795 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9de5e468-cf8f-4904-9728-9f97bf669789-ca-trust-extracted\") pod \"image-registry-5cdb9c7cd9-75b2w\" (UID: \"9de5e468-cf8f-4904-9728-9f97bf669789\") " pod="openshift-image-registry/image-registry-5cdb9c7cd9-75b2w"
Apr 22 15:08:54.401858 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.401809 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 22 15:08:54.401994 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.401893 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 22 15:08:54.402322 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.402305 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 22 15:08:54.431464 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.431443 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-hll6j"]
Apr 22 15:08:54.453225 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.453206 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-hll6j"
Apr 22 15:08:54.456916 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.456896 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 22 15:08:54.457209 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.457193 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 22 15:08:54.457257 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.457211 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 22 15:08:54.457421 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.457406 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-dmx2q\""
Apr 22 15:08:54.457500 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.457438 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 22 15:08:54.459455 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.459439 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-hll6j"]
Apr 22 15:08:54.502282 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.502255 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/31b91dc6-451c-4694-8db8-6ef3dcecbf4c-tmp-dir\") pod \"dns-default-76xbb\" (UID: \"31b91dc6-451c-4694-8db8-6ef3dcecbf4c\") " pod="openshift-dns/dns-default-76xbb"
Apr 22 15:08:54.502402 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.502298 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9de5e468-cf8f-4904-9728-9f97bf669789-ca-trust-extracted\") pod \"image-registry-5cdb9c7cd9-75b2w\" (UID: \"9de5e468-cf8f-4904-9728-9f97bf669789\") " pod="openshift-image-registry/image-registry-5cdb9c7cd9-75b2w"
Apr 22 15:08:54.502402 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.502326 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8e31d3f-d753-47a0-b9ff-ad56f2deb44a-cert\") pod \"ingress-canary-nrz6v\" (UID: \"f8e31d3f-d753-47a0-b9ff-ad56f2deb44a\") " pod="openshift-ingress-canary/ingress-canary-nrz6v"
Apr 22 15:08:54.502402 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.502354 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9de5e468-cf8f-4904-9728-9f97bf669789-installation-pull-secrets\") pod \"image-registry-5cdb9c7cd9-75b2w\" (UID: \"9de5e468-cf8f-4904-9728-9f97bf669789\") " pod="openshift-image-registry/image-registry-5cdb9c7cd9-75b2w"
Apr 22 15:08:54.502402 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.502389 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k4zc\" (UniqueName: \"kubernetes.io/projected/31b91dc6-451c-4694-8db8-6ef3dcecbf4c-kube-api-access-9k4zc\") pod \"dns-default-76xbb\" (UID: \"31b91dc6-451c-4694-8db8-6ef3dcecbf4c\") " pod="openshift-dns/dns-default-76xbb"
Apr 22 15:08:54.502612 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.502432 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/31b91dc6-451c-4694-8db8-6ef3dcecbf4c-metrics-tls\") pod \"dns-default-76xbb\" (UID: \"31b91dc6-451c-4694-8db8-6ef3dcecbf4c\") " pod="openshift-dns/dns-default-76xbb"
Apr 22 15:08:54.502612 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.502458 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9de5e468-cf8f-4904-9728-9f97bf669789-registry-tls\") pod \"image-registry-5cdb9c7cd9-75b2w\" (UID: \"9de5e468-cf8f-4904-9728-9f97bf669789\") " pod="openshift-image-registry/image-registry-5cdb9c7cd9-75b2w"
Apr 22 15:08:54.502612 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.502493 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9de5e468-cf8f-4904-9728-9f97bf669789-image-registry-private-configuration\") pod \"image-registry-5cdb9c7cd9-75b2w\" (UID: \"9de5e468-cf8f-4904-9728-9f97bf669789\") " pod="openshift-image-registry/image-registry-5cdb9c7cd9-75b2w"
Apr 22 15:08:54.502612 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.502592 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9de5e468-cf8f-4904-9728-9f97bf669789-registry-certificates\") pod \"image-registry-5cdb9c7cd9-75b2w\" (UID: \"9de5e468-cf8f-4904-9728-9f97bf669789\") " pod="openshift-image-registry/image-registry-5cdb9c7cd9-75b2w"
Apr 22 15:08:54.502810 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.502616 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9de5e468-cf8f-4904-9728-9f97bf669789-ca-trust-extracted\") pod \"image-registry-5cdb9c7cd9-75b2w\" (UID: \"9de5e468-cf8f-4904-9728-9f97bf669789\") " pod="openshift-image-registry/image-registry-5cdb9c7cd9-75b2w"
Apr 22 15:08:54.502810 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.502631 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9de5e468-cf8f-4904-9728-9f97bf669789-trusted-ca\") pod \"image-registry-5cdb9c7cd9-75b2w\" (UID: \"9de5e468-cf8f-4904-9728-9f97bf669789\") " pod="openshift-image-registry/image-registry-5cdb9c7cd9-75b2w"
Apr 22 15:08:54.502810 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.502662 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km7sr\" (UniqueName: \"kubernetes.io/projected/f8e31d3f-d753-47a0-b9ff-ad56f2deb44a-kube-api-access-km7sr\") pod \"ingress-canary-nrz6v\" (UID: \"f8e31d3f-d753-47a0-b9ff-ad56f2deb44a\") " pod="openshift-ingress-canary/ingress-canary-nrz6v"
Apr 22 15:08:54.502810 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.502704 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/31b91dc6-451c-4694-8db8-6ef3dcecbf4c-config-volume\") pod \"dns-default-76xbb\" (UID: \"31b91dc6-451c-4694-8db8-6ef3dcecbf4c\") " pod="openshift-dns/dns-default-76xbb"
Apr 22 15:08:54.502810 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.502760 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9de5e468-cf8f-4904-9728-9f97bf669789-bound-sa-token\") pod \"image-registry-5cdb9c7cd9-75b2w\" (UID: \"9de5e468-cf8f-4904-9728-9f97bf669789\") " pod="openshift-image-registry/image-registry-5cdb9c7cd9-75b2w"
Apr 22 15:08:54.503082 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.502821 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wwkj7\" (UniqueName: \"kubernetes.io/projected/9de5e468-cf8f-4904-9728-9f97bf669789-kube-api-access-wwkj7\") pod \"image-registry-5cdb9c7cd9-75b2w\" (UID: \"9de5e468-cf8f-4904-9728-9f97bf669789\") " pod="openshift-image-registry/image-registry-5cdb9c7cd9-75b2w"
Apr 22 15:08:54.503319 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.503300 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9de5e468-cf8f-4904-9728-9f97bf669789-registry-certificates\") pod \"image-registry-5cdb9c7cd9-75b2w\" (UID: \"9de5e468-cf8f-4904-9728-9f97bf669789\") " pod="openshift-image-registry/image-registry-5cdb9c7cd9-75b2w"
Apr 22 15:08:54.503755 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.503727 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9de5e468-cf8f-4904-9728-9f97bf669789-trusted-ca\") pod \"image-registry-5cdb9c7cd9-75b2w\" (UID: \"9de5e468-cf8f-4904-9728-9f97bf669789\") " pod="openshift-image-registry/image-registry-5cdb9c7cd9-75b2w"
Apr 22 15:08:54.506552 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.506533 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9de5e468-cf8f-4904-9728-9f97bf669789-registry-tls\") pod \"image-registry-5cdb9c7cd9-75b2w\" (UID: \"9de5e468-cf8f-4904-9728-9f97bf669789\") " pod="openshift-image-registry/image-registry-5cdb9c7cd9-75b2w"
Apr 22 15:08:54.506692 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.506535 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9de5e468-cf8f-4904-9728-9f97bf669789-installation-pull-secrets\") pod \"image-registry-5cdb9c7cd9-75b2w\" (UID: \"9de5e468-cf8f-4904-9728-9f97bf669789\") " pod="openshift-image-registry/image-registry-5cdb9c7cd9-75b2w"
Apr 22 15:08:54.506692 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.506572 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9de5e468-cf8f-4904-9728-9f97bf669789-image-registry-private-configuration\") pod \"image-registry-5cdb9c7cd9-75b2w\" (UID: \"9de5e468-cf8f-4904-9728-9f97bf669789\") " pod="openshift-image-registry/image-registry-5cdb9c7cd9-75b2w"
Apr 22 15:08:54.512942 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.512915 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9de5e468-cf8f-4904-9728-9f97bf669789-bound-sa-token\") pod \"image-registry-5cdb9c7cd9-75b2w\" (UID: \"9de5e468-cf8f-4904-9728-9f97bf669789\") " pod="openshift-image-registry/image-registry-5cdb9c7cd9-75b2w"
Apr 22 15:08:54.513039 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.513024 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwkj7\" (UniqueName: \"kubernetes.io/projected/9de5e468-cf8f-4904-9728-9f97bf669789-kube-api-access-wwkj7\") pod \"image-registry-5cdb9c7cd9-75b2w\" (UID: \"9de5e468-cf8f-4904-9728-9f97bf669789\") " pod="openshift-image-registry/image-registry-5cdb9c7cd9-75b2w"
Apr 22 15:08:54.603763 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.603723 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-km7sr\" (UniqueName: \"kubernetes.io/projected/f8e31d3f-d753-47a0-b9ff-ad56f2deb44a-kube-api-access-km7sr\") pod \"ingress-canary-nrz6v\" (UID: \"f8e31d3f-d753-47a0-b9ff-ad56f2deb44a\") " pod="openshift-ingress-canary/ingress-canary-nrz6v"
Apr 22 15:08:54.603919 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.603770 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/31b91dc6-451c-4694-8db8-6ef3dcecbf4c-config-volume\") pod \"dns-default-76xbb\" (UID: \"31b91dc6-451c-4694-8db8-6ef3dcecbf4c\") " pod="openshift-dns/dns-default-76xbb"
Apr 22 15:08:54.603919 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.603801 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4f36f114-7050-456f-a137-b827c10f5102-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hll6j\" (UID: \"4f36f114-7050-456f-a137-b827c10f5102\") " pod="openshift-insights/insights-runtime-extractor-hll6j"
Apr 22 15:08:54.603919 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.603841 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4f36f114-7050-456f-a137-b827c10f5102-crio-socket\") pod \"insights-runtime-extractor-hll6j\" (UID: \"4f36f114-7050-456f-a137-b827c10f5102\") " pod="openshift-insights/insights-runtime-extractor-hll6j"
Apr 22 15:08:54.603919 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.603886 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/31b91dc6-451c-4694-8db8-6ef3dcecbf4c-tmp-dir\") pod \"dns-default-76xbb\" (UID: \"31b91dc6-451c-4694-8db8-6ef3dcecbf4c\") " pod="openshift-dns/dns-default-76xbb"
Apr 22 15:08:54.604102 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.603956 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8e31d3f-d753-47a0-b9ff-ad56f2deb44a-cert\") pod \"ingress-canary-nrz6v\" (UID: \"f8e31d3f-d753-47a0-b9ff-ad56f2deb44a\") " pod="openshift-ingress-canary/ingress-canary-nrz6v"
Apr 22 15:08:54.604102 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.603993 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9k4zc\" (UniqueName: \"kubernetes.io/projected/31b91dc6-451c-4694-8db8-6ef3dcecbf4c-kube-api-access-9k4zc\") pod \"dns-default-76xbb\" (UID: \"31b91dc6-451c-4694-8db8-6ef3dcecbf4c\") " pod="openshift-dns/dns-default-76xbb"
Apr 22 15:08:54.604189 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.604117 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vctm8\" (UniqueName: \"kubernetes.io/projected/4f36f114-7050-456f-a137-b827c10f5102-kube-api-access-vctm8\") pod \"insights-runtime-extractor-hll6j\" (UID: \"4f36f114-7050-456f-a137-b827c10f5102\") " pod="openshift-insights/insights-runtime-extractor-hll6j"
Apr 22 15:08:54.604189 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.604158 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4f36f114-7050-456f-a137-b827c10f5102-data-volume\") pod \"insights-runtime-extractor-hll6j\" (UID: \"4f36f114-7050-456f-a137-b827c10f5102\") " pod="openshift-insights/insights-runtime-extractor-hll6j"
Apr 22 15:08:54.604189 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.604186 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/31b91dc6-451c-4694-8db8-6ef3dcecbf4c-metrics-tls\") pod \"dns-default-76xbb\" (UID: \"31b91dc6-451c-4694-8db8-6ef3dcecbf4c\") " pod="openshift-dns/dns-default-76xbb"
Apr 22 15:08:54.604320 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.604223 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4f36f114-7050-456f-a137-b827c10f5102-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hll6j\" (UID: \"4f36f114-7050-456f-a137-b827c10f5102\") " pod="openshift-insights/insights-runtime-extractor-hll6j"
Apr 22 15:08:54.604320 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.604257 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/31b91dc6-451c-4694-8db8-6ef3dcecbf4c-tmp-dir\") pod \"dns-default-76xbb\" (UID: \"31b91dc6-451c-4694-8db8-6ef3dcecbf4c\") " pod="openshift-dns/dns-default-76xbb"
Apr 22 15:08:54.604454 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.604432 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/31b91dc6-451c-4694-8db8-6ef3dcecbf4c-config-volume\") pod \"dns-default-76xbb\" (UID: \"31b91dc6-451c-4694-8db8-6ef3dcecbf4c\") " pod="openshift-dns/dns-default-76xbb"
Apr 22 15:08:54.606191 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.606168 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/31b91dc6-451c-4694-8db8-6ef3dcecbf4c-metrics-tls\") pod \"dns-default-76xbb\" (UID: \"31b91dc6-451c-4694-8db8-6ef3dcecbf4c\") " pod="openshift-dns/dns-default-76xbb"
Apr 22 15:08:54.606280 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.606216 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8e31d3f-d753-47a0-b9ff-ad56f2deb44a-cert\") pod \"ingress-canary-nrz6v\" (UID: \"f8e31d3f-d753-47a0-b9ff-ad56f2deb44a\") " pod="openshift-ingress-canary/ingress-canary-nrz6v"
Apr 22 15:08:54.613113 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.613089 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-km7sr\" (UniqueName: \"kubernetes.io/projected/f8e31d3f-d753-47a0-b9ff-ad56f2deb44a-kube-api-access-km7sr\") pod \"ingress-canary-nrz6v\" (UID: \"f8e31d3f-d753-47a0-b9ff-ad56f2deb44a\") " pod="openshift-ingress-canary/ingress-canary-nrz6v"
Apr 22 15:08:54.613113 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.613107 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k4zc\" (UniqueName: \"kubernetes.io/projected/31b91dc6-451c-4694-8db8-6ef3dcecbf4c-kube-api-access-9k4zc\") pod \"dns-default-76xbb\" (UID: \"31b91dc6-451c-4694-8db8-6ef3dcecbf4c\") " pod="openshift-dns/dns-default-76xbb"
Apr 22 15:08:54.635125 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.635110 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5cdb9c7cd9-75b2w"
Apr 22 15:08:54.686186 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.686160 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-76xbb"
Apr 22 15:08:54.705075 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.705040 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vctm8\" (UniqueName: \"kubernetes.io/projected/4f36f114-7050-456f-a137-b827c10f5102-kube-api-access-vctm8\") pod \"insights-runtime-extractor-hll6j\" (UID: \"4f36f114-7050-456f-a137-b827c10f5102\") " pod="openshift-insights/insights-runtime-extractor-hll6j"
Apr 22 15:08:54.705208 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.705095 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4f36f114-7050-456f-a137-b827c10f5102-data-volume\") pod \"insights-runtime-extractor-hll6j\" (UID: \"4f36f114-7050-456f-a137-b827c10f5102\") " pod="openshift-insights/insights-runtime-extractor-hll6j"
Apr 22 15:08:54.705208 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.705123 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4f36f114-7050-456f-a137-b827c10f5102-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hll6j\" (UID: \"4f36f114-7050-456f-a137-b827c10f5102\") " pod="openshift-insights/insights-runtime-extractor-hll6j"
Apr 22 15:08:54.705208 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.705146 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4f36f114-7050-456f-a137-b827c10f5102-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hll6j\" (UID: \"4f36f114-7050-456f-a137-b827c10f5102\") " pod="openshift-insights/insights-runtime-extractor-hll6j"
Apr 22 15:08:54.705361 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.705284 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4f36f114-7050-456f-a137-b827c10f5102-crio-socket\") pod \"insights-runtime-extractor-hll6j\" (UID: \"4f36f114-7050-456f-a137-b827c10f5102\") " pod="openshift-insights/insights-runtime-extractor-hll6j"
Apr 22 15:08:54.705570 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.705550 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4f36f114-7050-456f-a137-b827c10f5102-crio-socket\") pod \"insights-runtime-extractor-hll6j\" (UID: \"4f36f114-7050-456f-a137-b827c10f5102\") " pod="openshift-insights/insights-runtime-extractor-hll6j"
Apr 22 15:08:54.705624 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.705555 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4f36f114-7050-456f-a137-b827c10f5102-data-volume\") pod \"insights-runtime-extractor-hll6j\" (UID: \"4f36f114-7050-456f-a137-b827c10f5102\") " pod="openshift-insights/insights-runtime-extractor-hll6j"
Apr 22 15:08:54.705964 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.705949 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4f36f114-7050-456f-a137-b827c10f5102-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hll6j\" (UID: \"4f36f114-7050-456f-a137-b827c10f5102\") " pod="openshift-insights/insights-runtime-extractor-hll6j"
Apr 22 15:08:54.706633 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.706621 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-nrz6v"
Apr 22 15:08:54.707847 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.707830 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4f36f114-7050-456f-a137-b827c10f5102-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hll6j\" (UID: \"4f36f114-7050-456f-a137-b827c10f5102\") " pod="openshift-insights/insights-runtime-extractor-hll6j"
Apr 22 15:08:54.720867 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.720589 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vctm8\" (UniqueName: \"kubernetes.io/projected/4f36f114-7050-456f-a137-b827c10f5102-kube-api-access-vctm8\") pod \"insights-runtime-extractor-hll6j\" (UID: \"4f36f114-7050-456f-a137-b827c10f5102\") " pod="openshift-insights/insights-runtime-extractor-hll6j"
Apr 22 15:08:54.762258 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.761247 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-hll6j"
Apr 22 15:08:54.836947 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.836887 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5cdb9c7cd9-75b2w"]
Apr 22 15:08:54.843729 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:54.843666 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9de5e468_cf8f_4904_9728_9f97bf669789.slice/crio-92a534468b1e902e4822c20155b1792b689d4237b8b1b9218000f9b4cb9cc615 WatchSource:0}: Error finding container 92a534468b1e902e4822c20155b1792b689d4237b8b1b9218000f9b4cb9cc615: Status 404 returned error can't find the container with id 92a534468b1e902e4822c20155b1792b689d4237b8b1b9218000f9b4cb9cc615
Apr 22 15:08:54.856100 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.856051 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-76xbb"]
Apr 22 15:08:54.860088 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:54.860041 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31b91dc6_451c_4694_8db8_6ef3dcecbf4c.slice/crio-10678956b24a0300ea21b1fafa823afb612b1c8c434e7dcaa013a1cf0c44c489 WatchSource:0}: Error finding container 10678956b24a0300ea21b1fafa823afb612b1c8c434e7dcaa013a1cf0c44c489: Status 404 returned error can't find the container with id 10678956b24a0300ea21b1fafa823afb612b1c8c434e7dcaa013a1cf0c44c489
Apr 22 15:08:54.873637 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.873569 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-nrz6v"]
Apr 22 15:08:54.876063 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:54.876039 2569 manager.go:1169] Failed to process watch event {EventType:0
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8e31d3f_d753_47a0_b9ff_ad56f2deb44a.slice/crio-e1b0ef729160ce3bced58325d735c9eaba8161a72045c2f6b68287f40989e2b1 WatchSource:0}: Error finding container e1b0ef729160ce3bced58325d735c9eaba8161a72045c2f6b68287f40989e2b1: Status 404 returned error can't find the container with id e1b0ef729160ce3bced58325d735c9eaba8161a72045c2f6b68287f40989e2b1 Apr 22 15:08:54.922860 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:54.922828 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-hll6j"] Apr 22 15:08:54.925661 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:54.925640 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f36f114_7050_456f_a137_b827c10f5102.slice/crio-d5e11d33dc190c330142b0bfb2155e9e2a068533eb47332f537c0c283ce84e51 WatchSource:0}: Error finding container d5e11d33dc190c330142b0bfb2155e9e2a068533eb47332f537c0c283ce84e51: Status 404 returned error can't find the container with id d5e11d33dc190c330142b0bfb2155e9e2a068533eb47332f537c0c283ce84e51 Apr 22 15:08:55.059191 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:55.057049 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzm72" Apr 22 15:08:55.067070 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:55.067049 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 15:08:55.067070 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:55.067050 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-pdgcv\"" Apr 22 15:08:55.111949 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:55.111875 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-v2crv"] Apr 22 15:08:55.124507 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:55.124484 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-v2crv" Apr 22 15:08:55.131735 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:55.131662 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 15:08:55.131829 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:55.131749 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 22 15:08:55.135631 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:55.135377 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-v2crv"] Apr 22 15:08:55.139695 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:55.139661 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 22 15:08:55.139926 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:55.139909 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 15:08:55.140012 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:55.139922 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 15:08:55.140012 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:55.139972 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-h8mnx\"" Apr 22 15:08:55.209855 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:55.209812 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/07e8ba1b-86c7-46e8-a1d0-8821e0eb4c29-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-v2crv\" (UID: \"07e8ba1b-86c7-46e8-a1d0-8821e0eb4c29\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v2crv" Apr 22 15:08:55.210035 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:55.209866 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhsgg\" (UniqueName: \"kubernetes.io/projected/07e8ba1b-86c7-46e8-a1d0-8821e0eb4c29-kube-api-access-vhsgg\") pod \"prometheus-operator-5676c8c784-v2crv\" (UID: \"07e8ba1b-86c7-46e8-a1d0-8821e0eb4c29\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v2crv" Apr 22 15:08:55.210035 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:55.209901 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/07e8ba1b-86c7-46e8-a1d0-8821e0eb4c29-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-v2crv\" (UID: \"07e8ba1b-86c7-46e8-a1d0-8821e0eb4c29\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v2crv" Apr 22 15:08:55.210035 ip-10-0-141-246 
kubenswrapper[2569]: I0422 15:08:55.209972 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/07e8ba1b-86c7-46e8-a1d0-8821e0eb4c29-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-v2crv\" (UID: \"07e8ba1b-86c7-46e8-a1d0-8821e0eb4c29\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v2crv" Apr 22 15:08:55.226591 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:55.226555 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-76xbb" event={"ID":"31b91dc6-451c-4694-8db8-6ef3dcecbf4c","Type":"ContainerStarted","Data":"10678956b24a0300ea21b1fafa823afb612b1c8c434e7dcaa013a1cf0c44c489"} Apr 22 15:08:55.227921 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:55.227897 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5cdb9c7cd9-75b2w" event={"ID":"9de5e468-cf8f-4904-9728-9f97bf669789","Type":"ContainerStarted","Data":"cfee1b62cab7882ff8eed4b356435bce80086283ce6d6b3012c746942d48c6e3"} Apr 22 15:08:55.228038 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:55.227929 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5cdb9c7cd9-75b2w" event={"ID":"9de5e468-cf8f-4904-9728-9f97bf669789","Type":"ContainerStarted","Data":"92a534468b1e902e4822c20155b1792b689d4237b8b1b9218000f9b4cb9cc615"} Apr 22 15:08:55.228116 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:55.228098 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5cdb9c7cd9-75b2w" Apr 22 15:08:55.229257 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:55.229237 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hll6j" 
event={"ID":"4f36f114-7050-456f-a137-b827c10f5102","Type":"ContainerStarted","Data":"63920b877341fd61f353aeb4491f316099b0b167f0cbda0be536f49c686e509d"} Apr 22 15:08:55.229355 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:55.229261 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hll6j" event={"ID":"4f36f114-7050-456f-a137-b827c10f5102","Type":"ContainerStarted","Data":"d5e11d33dc190c330142b0bfb2155e9e2a068533eb47332f537c0c283ce84e51"} Apr 22 15:08:55.232289 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:55.232270 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qpxv8" event={"ID":"49c61d54-9288-4695-8a61-293644b9038e","Type":"ContainerStarted","Data":"92864d9ebc376c78df3712279344c8f89b9bf4250f246b6ed366dfff41de846c"} Apr 22 15:08:55.233243 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:55.233219 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-nrz6v" event={"ID":"f8e31d3f-d753-47a0-b9ff-ad56f2deb44a","Type":"ContainerStarted","Data":"e1b0ef729160ce3bced58325d735c9eaba8161a72045c2f6b68287f40989e2b1"} Apr 22 15:08:55.276373 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:55.276330 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5cdb9c7cd9-75b2w" podStartSLOduration=6.27631744 podStartE2EDuration="6.27631744s" podCreationTimestamp="2026-04-22 15:08:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:08:55.275294203 +0000 UTC m=+38.812465917" watchObservedRunningTime="2026-04-22 15:08:55.27631744 +0000 UTC m=+38.813489181" Apr 22 15:08:55.310912 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:55.310873 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/07e8ba1b-86c7-46e8-a1d0-8821e0eb4c29-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-v2crv\" (UID: \"07e8ba1b-86c7-46e8-a1d0-8821e0eb4c29\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v2crv" Apr 22 15:08:55.311153 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:55.311131 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/07e8ba1b-86c7-46e8-a1d0-8821e0eb4c29-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-v2crv\" (UID: \"07e8ba1b-86c7-46e8-a1d0-8821e0eb4c29\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v2crv" Apr 22 15:08:55.311230 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:55.311204 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhsgg\" (UniqueName: \"kubernetes.io/projected/07e8ba1b-86c7-46e8-a1d0-8821e0eb4c29-kube-api-access-vhsgg\") pod \"prometheus-operator-5676c8c784-v2crv\" (UID: \"07e8ba1b-86c7-46e8-a1d0-8821e0eb4c29\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v2crv" Apr 22 15:08:55.311285 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:55.311248 2569 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 22 15:08:55.311327 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:08:55.311319 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07e8ba1b-86c7-46e8-a1d0-8821e0eb4c29-prometheus-operator-tls podName:07e8ba1b-86c7-46e8-a1d0-8821e0eb4c29 nodeName:}" failed. No retries permitted until 2026-04-22 15:08:55.811298738 +0000 UTC m=+39.348470450 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/07e8ba1b-86c7-46e8-a1d0-8821e0eb4c29-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-v2crv" (UID: "07e8ba1b-86c7-46e8-a1d0-8821e0eb4c29") : secret "prometheus-operator-tls" not found Apr 22 15:08:55.311391 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:55.311254 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/07e8ba1b-86c7-46e8-a1d0-8821e0eb4c29-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-v2crv\" (UID: \"07e8ba1b-86c7-46e8-a1d0-8821e0eb4c29\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v2crv" Apr 22 15:08:55.311730 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:55.311709 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/07e8ba1b-86c7-46e8-a1d0-8821e0eb4c29-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-v2crv\" (UID: \"07e8ba1b-86c7-46e8-a1d0-8821e0eb4c29\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v2crv" Apr 22 15:08:55.314237 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:55.314216 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/07e8ba1b-86c7-46e8-a1d0-8821e0eb4c29-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-v2crv\" (UID: \"07e8ba1b-86c7-46e8-a1d0-8821e0eb4c29\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v2crv" Apr 22 15:08:55.327997 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:55.327972 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhsgg\" (UniqueName: \"kubernetes.io/projected/07e8ba1b-86c7-46e8-a1d0-8821e0eb4c29-kube-api-access-vhsgg\") pod 
\"prometheus-operator-5676c8c784-v2crv\" (UID: \"07e8ba1b-86c7-46e8-a1d0-8821e0eb4c29\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v2crv" Apr 22 15:08:55.360724 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:55.360654 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-qpxv8" podStartSLOduration=6.089483288 podStartE2EDuration="38.360635545s" podCreationTimestamp="2026-04-22 15:08:17 +0000 UTC" firstStartedPulling="2026-04-22 15:08:19.683136386 +0000 UTC m=+3.220308078" lastFinishedPulling="2026-04-22 15:08:51.954288644 +0000 UTC m=+35.491460335" observedRunningTime="2026-04-22 15:08:55.357876251 +0000 UTC m=+38.895047968" watchObservedRunningTime="2026-04-22 15:08:55.360635545 +0000 UTC m=+38.897807264" Apr 22 15:08:55.815511 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:55.815472 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/07e8ba1b-86c7-46e8-a1d0-8821e0eb4c29-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-v2crv\" (UID: \"07e8ba1b-86c7-46e8-a1d0-8821e0eb4c29\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v2crv" Apr 22 15:08:55.817910 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:55.817891 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/07e8ba1b-86c7-46e8-a1d0-8821e0eb4c29-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-v2crv\" (UID: \"07e8ba1b-86c7-46e8-a1d0-8821e0eb4c29\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v2crv" Apr 22 15:08:56.032883 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:56.032498 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-v2crv" Apr 22 15:08:56.053794 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:56.053760 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wfpb4" Apr 22 15:08:56.057358 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:56.056864 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-lqzj2\"" Apr 22 15:08:56.058208 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:56.057818 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 15:08:56.058208 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:56.058040 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 15:08:56.612138 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:56.612106 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-v2crv"] Apr 22 15:08:56.616580 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:08:56.616540 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07e8ba1b_86c7_46e8_a1d0_8821e0eb4c29.slice/crio-f887f6b69d4c84fb7ca79457e709ade177ac908d1239e69702013a1a0bfd19f2 WatchSource:0}: Error finding container f887f6b69d4c84fb7ca79457e709ade177ac908d1239e69702013a1a0bfd19f2: Status 404 returned error can't find the container with id f887f6b69d4c84fb7ca79457e709ade177ac908d1239e69702013a1a0bfd19f2 Apr 22 15:08:57.239141 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:57.239101 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-v2crv" 
event={"ID":"07e8ba1b-86c7-46e8-a1d0-8821e0eb4c29","Type":"ContainerStarted","Data":"f887f6b69d4c84fb7ca79457e709ade177ac908d1239e69702013a1a0bfd19f2"} Apr 22 15:08:57.240944 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:57.240885 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hll6j" event={"ID":"4f36f114-7050-456f-a137-b827c10f5102","Type":"ContainerStarted","Data":"4e5e25c7b7486aaf2958526d5e64319c08d51dd95762069ea6c3b379e24c3fb1"} Apr 22 15:08:58.246041 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:58.245997 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-nrz6v" event={"ID":"f8e31d3f-d753-47a0-b9ff-ad56f2deb44a","Type":"ContainerStarted","Data":"aa534ce5d72ee840971a9da933c38cacb659d8cadabb51940dc8b1e192bbb354"} Apr 22 15:08:58.247932 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:58.247899 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-76xbb" event={"ID":"31b91dc6-451c-4694-8db8-6ef3dcecbf4c","Type":"ContainerStarted","Data":"cf7f9ff5be1ba21249e5fd66f7f9cd255b972fd2c4021e78e136738c0bba41bc"} Apr 22 15:08:58.248080 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:58.247940 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-76xbb" event={"ID":"31b91dc6-451c-4694-8db8-6ef3dcecbf4c","Type":"ContainerStarted","Data":"83279cb449620d6a007d78fbcc20e6a7e0d0207f5cf575474e096116cf351cb3"} Apr 22 15:08:58.248080 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:58.248045 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-76xbb" Apr 22 15:08:58.267248 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:58.267192 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-nrz6v" podStartSLOduration=1.769072749 podStartE2EDuration="4.267169778s" podCreationTimestamp="2026-04-22 
15:08:54 +0000 UTC" firstStartedPulling="2026-04-22 15:08:54.877949391 +0000 UTC m=+38.415121092" lastFinishedPulling="2026-04-22 15:08:57.376046414 +0000 UTC m=+40.913218121" observedRunningTime="2026-04-22 15:08:58.266410264 +0000 UTC m=+41.803581989" watchObservedRunningTime="2026-04-22 15:08:58.267169778 +0000 UTC m=+41.804341492" Apr 22 15:08:58.286509 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:58.286444 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-76xbb" podStartSLOduration=1.816658414 podStartE2EDuration="4.28642654s" podCreationTimestamp="2026-04-22 15:08:54 +0000 UTC" firstStartedPulling="2026-04-22 15:08:54.862111313 +0000 UTC m=+38.399283012" lastFinishedPulling="2026-04-22 15:08:57.331879445 +0000 UTC m=+40.869051138" observedRunningTime="2026-04-22 15:08:58.286142613 +0000 UTC m=+41.823314329" watchObservedRunningTime="2026-04-22 15:08:58.28642654 +0000 UTC m=+41.823598254" Apr 22 15:08:59.252816 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:59.252776 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-v2crv" event={"ID":"07e8ba1b-86c7-46e8-a1d0-8821e0eb4c29","Type":"ContainerStarted","Data":"dfdc70f6c92fa2e1e88f7cb1b04304982c2f796ae959ed97af1fad81db889d87"} Apr 22 15:08:59.252816 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:59.252816 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-v2crv" event={"ID":"07e8ba1b-86c7-46e8-a1d0-8821e0eb4c29","Type":"ContainerStarted","Data":"d198f95c304ac46bf2d07936d311a0a9d4d798b12536a088ab48e0b7f66e8e2d"} Apr 22 15:08:59.254651 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:59.254626 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hll6j" 
event={"ID":"4f36f114-7050-456f-a137-b827c10f5102","Type":"ContainerStarted","Data":"59821a1738a10bb3191b24d4421eb13dbcbc83dd8070765d8abf339372ab6352"} Apr 22 15:08:59.288976 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:59.288922 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-v2crv" podStartSLOduration=2.089820712 podStartE2EDuration="4.288905877s" podCreationTimestamp="2026-04-22 15:08:55 +0000 UTC" firstStartedPulling="2026-04-22 15:08:56.618568428 +0000 UTC m=+40.155740135" lastFinishedPulling="2026-04-22 15:08:58.817653606 +0000 UTC m=+42.354825300" observedRunningTime="2026-04-22 15:08:59.287353315 +0000 UTC m=+42.824525030" watchObservedRunningTime="2026-04-22 15:08:59.288905877 +0000 UTC m=+42.826077582" Apr 22 15:08:59.333825 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:08:59.333772 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-hll6j" podStartSLOduration=1.5034405400000002 podStartE2EDuration="5.333756657s" podCreationTimestamp="2026-04-22 15:08:54 +0000 UTC" firstStartedPulling="2026-04-22 15:08:54.986471261 +0000 UTC m=+38.523642954" lastFinishedPulling="2026-04-22 15:08:58.816787378 +0000 UTC m=+42.353959071" observedRunningTime="2026-04-22 15:08:59.333577873 +0000 UTC m=+42.870749584" watchObservedRunningTime="2026-04-22 15:08:59.333756657 +0000 UTC m=+42.870928370" Apr 22 15:09:01.558992 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:01.558958 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-rzkn2"] Apr 22 15:09:01.565051 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:01.565022 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-rzkn2" Apr 22 15:09:01.568334 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:01.568306 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-2fn2s\"" Apr 22 15:09:01.568475 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:01.568306 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 15:09:01.568522 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:01.568474 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 15:09:01.568741 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:01.568722 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 15:09:01.664161 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:01.664126 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/74e35d6d-6b97-4484-8bf5-c855372fc51b-metrics-client-ca\") pod \"node-exporter-rzkn2\" (UID: \"74e35d6d-6b97-4484-8bf5-c855372fc51b\") " pod="openshift-monitoring/node-exporter-rzkn2" Apr 22 15:09:01.664344 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:01.664174 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/74e35d6d-6b97-4484-8bf5-c855372fc51b-node-exporter-textfile\") pod \"node-exporter-rzkn2\" (UID: \"74e35d6d-6b97-4484-8bf5-c855372fc51b\") " pod="openshift-monitoring/node-exporter-rzkn2" Apr 22 15:09:01.664344 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:01.664238 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/74e35d6d-6b97-4484-8bf5-c855372fc51b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rzkn2\" (UID: \"74e35d6d-6b97-4484-8bf5-c855372fc51b\") " pod="openshift-monitoring/node-exporter-rzkn2" Apr 22 15:09:01.664344 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:01.664324 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/74e35d6d-6b97-4484-8bf5-c855372fc51b-node-exporter-accelerators-collector-config\") pod \"node-exporter-rzkn2\" (UID: \"74e35d6d-6b97-4484-8bf5-c855372fc51b\") " pod="openshift-monitoring/node-exporter-rzkn2" Apr 22 15:09:01.664472 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:01.664357 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/74e35d6d-6b97-4484-8bf5-c855372fc51b-root\") pod \"node-exporter-rzkn2\" (UID: \"74e35d6d-6b97-4484-8bf5-c855372fc51b\") " pod="openshift-monitoring/node-exporter-rzkn2" Apr 22 15:09:01.664472 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:01.664377 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/74e35d6d-6b97-4484-8bf5-c855372fc51b-sys\") pod \"node-exporter-rzkn2\" (UID: \"74e35d6d-6b97-4484-8bf5-c855372fc51b\") " pod="openshift-monitoring/node-exporter-rzkn2" Apr 22 15:09:01.664472 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:01.664394 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/74e35d6d-6b97-4484-8bf5-c855372fc51b-node-exporter-tls\") pod \"node-exporter-rzkn2\" (UID: \"74e35d6d-6b97-4484-8bf5-c855372fc51b\") " pod="openshift-monitoring/node-exporter-rzkn2" Apr 22 
15:09:01.664472 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:01.664409 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6jzn\" (UniqueName: \"kubernetes.io/projected/74e35d6d-6b97-4484-8bf5-c855372fc51b-kube-api-access-q6jzn\") pod \"node-exporter-rzkn2\" (UID: \"74e35d6d-6b97-4484-8bf5-c855372fc51b\") " pod="openshift-monitoring/node-exporter-rzkn2"
Apr 22 15:09:01.664472 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:01.664440 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/74e35d6d-6b97-4484-8bf5-c855372fc51b-node-exporter-wtmp\") pod \"node-exporter-rzkn2\" (UID: \"74e35d6d-6b97-4484-8bf5-c855372fc51b\") " pod="openshift-monitoring/node-exporter-rzkn2"
Apr 22 15:09:01.765125 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:01.765086 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/74e35d6d-6b97-4484-8bf5-c855372fc51b-node-exporter-wtmp\") pod \"node-exporter-rzkn2\" (UID: \"74e35d6d-6b97-4484-8bf5-c855372fc51b\") " pod="openshift-monitoring/node-exporter-rzkn2"
Apr 22 15:09:01.765322 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:01.765179 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/74e35d6d-6b97-4484-8bf5-c855372fc51b-metrics-client-ca\") pod \"node-exporter-rzkn2\" (UID: \"74e35d6d-6b97-4484-8bf5-c855372fc51b\") " pod="openshift-monitoring/node-exporter-rzkn2"
Apr 22 15:09:01.765322 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:01.765214 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/74e35d6d-6b97-4484-8bf5-c855372fc51b-node-exporter-textfile\") pod \"node-exporter-rzkn2\" (UID: \"74e35d6d-6b97-4484-8bf5-c855372fc51b\") " pod="openshift-monitoring/node-exporter-rzkn2"
Apr 22 15:09:01.765322 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:01.765237 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/74e35d6d-6b97-4484-8bf5-c855372fc51b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rzkn2\" (UID: \"74e35d6d-6b97-4484-8bf5-c855372fc51b\") " pod="openshift-monitoring/node-exporter-rzkn2"
Apr 22 15:09:01.765322 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:01.765266 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/74e35d6d-6b97-4484-8bf5-c855372fc51b-node-exporter-accelerators-collector-config\") pod \"node-exporter-rzkn2\" (UID: \"74e35d6d-6b97-4484-8bf5-c855372fc51b\") " pod="openshift-monitoring/node-exporter-rzkn2"
Apr 22 15:09:01.765322 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:01.765268 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/74e35d6d-6b97-4484-8bf5-c855372fc51b-node-exporter-wtmp\") pod \"node-exporter-rzkn2\" (UID: \"74e35d6d-6b97-4484-8bf5-c855372fc51b\") " pod="openshift-monitoring/node-exporter-rzkn2"
Apr 22 15:09:01.765572 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:01.765376 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/74e35d6d-6b97-4484-8bf5-c855372fc51b-root\") pod \"node-exporter-rzkn2\" (UID: \"74e35d6d-6b97-4484-8bf5-c855372fc51b\") " pod="openshift-monitoring/node-exporter-rzkn2"
Apr 22 15:09:01.765572 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:01.765407 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/74e35d6d-6b97-4484-8bf5-c855372fc51b-sys\") pod \"node-exporter-rzkn2\" (UID: \"74e35d6d-6b97-4484-8bf5-c855372fc51b\") " pod="openshift-monitoring/node-exporter-rzkn2"
Apr 22 15:09:01.765572 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:01.765423 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/74e35d6d-6b97-4484-8bf5-c855372fc51b-node-exporter-tls\") pod \"node-exporter-rzkn2\" (UID: \"74e35d6d-6b97-4484-8bf5-c855372fc51b\") " pod="openshift-monitoring/node-exporter-rzkn2"
Apr 22 15:09:01.765572 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:01.765448 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q6jzn\" (UniqueName: \"kubernetes.io/projected/74e35d6d-6b97-4484-8bf5-c855372fc51b-kube-api-access-q6jzn\") pod \"node-exporter-rzkn2\" (UID: \"74e35d6d-6b97-4484-8bf5-c855372fc51b\") " pod="openshift-monitoring/node-exporter-rzkn2"
Apr 22 15:09:01.765572 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:01.765518 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/74e35d6d-6b97-4484-8bf5-c855372fc51b-sys\") pod \"node-exporter-rzkn2\" (UID: \"74e35d6d-6b97-4484-8bf5-c855372fc51b\") " pod="openshift-monitoring/node-exporter-rzkn2"
Apr 22 15:09:01.765841 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:01.765576 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/74e35d6d-6b97-4484-8bf5-c855372fc51b-node-exporter-textfile\") pod \"node-exporter-rzkn2\" (UID: \"74e35d6d-6b97-4484-8bf5-c855372fc51b\") " pod="openshift-monitoring/node-exporter-rzkn2"
Apr 22 15:09:01.765963 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:01.765869 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/74e35d6d-6b97-4484-8bf5-c855372fc51b-root\") pod \"node-exporter-rzkn2\" (UID: \"74e35d6d-6b97-4484-8bf5-c855372fc51b\") " pod="openshift-monitoring/node-exporter-rzkn2"
Apr 22 15:09:01.765963 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:01.765887 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/74e35d6d-6b97-4484-8bf5-c855372fc51b-metrics-client-ca\") pod \"node-exporter-rzkn2\" (UID: \"74e35d6d-6b97-4484-8bf5-c855372fc51b\") " pod="openshift-monitoring/node-exporter-rzkn2"
Apr 22 15:09:01.765963 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:01.765933 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/74e35d6d-6b97-4484-8bf5-c855372fc51b-node-exporter-accelerators-collector-config\") pod \"node-exporter-rzkn2\" (UID: \"74e35d6d-6b97-4484-8bf5-c855372fc51b\") " pod="openshift-monitoring/node-exporter-rzkn2"
Apr 22 15:09:01.768477 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:01.768448 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/74e35d6d-6b97-4484-8bf5-c855372fc51b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rzkn2\" (UID: \"74e35d6d-6b97-4484-8bf5-c855372fc51b\") " pod="openshift-monitoring/node-exporter-rzkn2"
Apr 22 15:09:01.768582 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:01.768484 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/74e35d6d-6b97-4484-8bf5-c855372fc51b-node-exporter-tls\") pod \"node-exporter-rzkn2\" (UID: \"74e35d6d-6b97-4484-8bf5-c855372fc51b\") " pod="openshift-monitoring/node-exporter-rzkn2"
Apr 22 15:09:01.776046 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:01.776020 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6jzn\" (UniqueName: \"kubernetes.io/projected/74e35d6d-6b97-4484-8bf5-c855372fc51b-kube-api-access-q6jzn\") pod \"node-exporter-rzkn2\" (UID: \"74e35d6d-6b97-4484-8bf5-c855372fc51b\") " pod="openshift-monitoring/node-exporter-rzkn2"
Apr 22 15:09:01.874621 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:01.874584 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-rzkn2"
Apr 22 15:09:01.882910 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:09:01.882872 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74e35d6d_6b97_4484_8bf5_c855372fc51b.slice/crio-9f5fd18c26eb7b5dcf2f4b078a2792217c29959213aa9769d89f16af537cd364 WatchSource:0}: Error finding container 9f5fd18c26eb7b5dcf2f4b078a2792217c29959213aa9769d89f16af537cd364: Status 404 returned error can't find the container with id 9f5fd18c26eb7b5dcf2f4b078a2792217c29959213aa9769d89f16af537cd364
Apr 22 15:09:02.264044 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:02.263958 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rzkn2" event={"ID":"74e35d6d-6b97-4484-8bf5-c855372fc51b","Type":"ContainerStarted","Data":"9f5fd18c26eb7b5dcf2f4b078a2792217c29959213aa9769d89f16af537cd364"}
Apr 22 15:09:03.267576 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:03.267540 2569 generic.go:358] "Generic (PLEG): container finished" podID="74e35d6d-6b97-4484-8bf5-c855372fc51b" containerID="e676d3f9fad00b6898ab48a404569191adbcb241582161987099ac02792bef8b" exitCode=0
Apr 22 15:09:03.268017 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:03.267623 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rzkn2" event={"ID":"74e35d6d-6b97-4484-8bf5-c855372fc51b","Type":"ContainerDied","Data":"e676d3f9fad00b6898ab48a404569191adbcb241582161987099ac02792bef8b"}
Apr 22 15:09:03.655397 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:03.655362 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-687bddbc7b-vr52v"]
Apr 22 15:09:03.658926 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:03.658900 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-687bddbc7b-vr52v"
Apr 22 15:09:03.663489 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:03.663468 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 22 15:09:03.663700 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:03.663659 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-43b6omg0ioa4s\""
Apr 22 15:09:03.663804 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:03.663785 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 22 15:09:03.663863 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:03.663793 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 22 15:09:03.664090 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:03.664069 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 22 15:09:03.664190 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:03.664092 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 22 15:09:03.664190 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:03.664115 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-blmj8\""
Apr 22 15:09:03.678309 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:03.678281 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-687bddbc7b-vr52v"]
Apr 22 15:09:03.682154 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:03.682126 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1938607c-447d-4b0d-996a-713970b3d71a-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-687bddbc7b-vr52v\" (UID: \"1938607c-447d-4b0d-996a-713970b3d71a\") " pod="openshift-monitoring/thanos-querier-687bddbc7b-vr52v"
Apr 22 15:09:03.682254 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:03.682223 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1938607c-447d-4b0d-996a-713970b3d71a-secret-grpc-tls\") pod \"thanos-querier-687bddbc7b-vr52v\" (UID: \"1938607c-447d-4b0d-996a-713970b3d71a\") " pod="openshift-monitoring/thanos-querier-687bddbc7b-vr52v"
Apr 22 15:09:03.682311 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:03.682253 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/1938607c-447d-4b0d-996a-713970b3d71a-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-687bddbc7b-vr52v\" (UID: \"1938607c-447d-4b0d-996a-713970b3d71a\") " pod="openshift-monitoring/thanos-querier-687bddbc7b-vr52v"
Apr 22 15:09:03.682368 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:03.682306 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1938607c-447d-4b0d-996a-713970b3d71a-metrics-client-ca\") pod \"thanos-querier-687bddbc7b-vr52v\" (UID: \"1938607c-447d-4b0d-996a-713970b3d71a\") " pod="openshift-monitoring/thanos-querier-687bddbc7b-vr52v"
Apr 22 15:09:03.682505 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:03.682375 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/1938607c-447d-4b0d-996a-713970b3d71a-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-687bddbc7b-vr52v\" (UID: \"1938607c-447d-4b0d-996a-713970b3d71a\") " pod="openshift-monitoring/thanos-querier-687bddbc7b-vr52v"
Apr 22 15:09:03.682505 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:03.682409 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x44sc\" (UniqueName: \"kubernetes.io/projected/1938607c-447d-4b0d-996a-713970b3d71a-kube-api-access-x44sc\") pod \"thanos-querier-687bddbc7b-vr52v\" (UID: \"1938607c-447d-4b0d-996a-713970b3d71a\") " pod="openshift-monitoring/thanos-querier-687bddbc7b-vr52v"
Apr 22 15:09:03.682505 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:03.682430 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1938607c-447d-4b0d-996a-713970b3d71a-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-687bddbc7b-vr52v\" (UID: \"1938607c-447d-4b0d-996a-713970b3d71a\") " pod="openshift-monitoring/thanos-querier-687bddbc7b-vr52v"
Apr 22 15:09:03.682505 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:03.682450 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/1938607c-447d-4b0d-996a-713970b3d71a-secret-thanos-querier-tls\") pod \"thanos-querier-687bddbc7b-vr52v\" (UID: \"1938607c-447d-4b0d-996a-713970b3d71a\") " pod="openshift-monitoring/thanos-querier-687bddbc7b-vr52v"
Apr 22 15:09:03.783578 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:03.783537 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/1938607c-447d-4b0d-996a-713970b3d71a-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-687bddbc7b-vr52v\" (UID: \"1938607c-447d-4b0d-996a-713970b3d71a\") " pod="openshift-monitoring/thanos-querier-687bddbc7b-vr52v"
Apr 22 15:09:03.783578 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:03.783585 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x44sc\" (UniqueName: \"kubernetes.io/projected/1938607c-447d-4b0d-996a-713970b3d71a-kube-api-access-x44sc\") pod \"thanos-querier-687bddbc7b-vr52v\" (UID: \"1938607c-447d-4b0d-996a-713970b3d71a\") " pod="openshift-monitoring/thanos-querier-687bddbc7b-vr52v"
Apr 22 15:09:03.783871 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:03.783605 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1938607c-447d-4b0d-996a-713970b3d71a-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-687bddbc7b-vr52v\" (UID: \"1938607c-447d-4b0d-996a-713970b3d71a\") " pod="openshift-monitoring/thanos-querier-687bddbc7b-vr52v"
Apr 22 15:09:03.783871 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:03.783636 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/1938607c-447d-4b0d-996a-713970b3d71a-secret-thanos-querier-tls\") pod \"thanos-querier-687bddbc7b-vr52v\" (UID: \"1938607c-447d-4b0d-996a-713970b3d71a\") " pod="openshift-monitoring/thanos-querier-687bddbc7b-vr52v"
Apr 22 15:09:03.783871 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:03.783669 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1938607c-447d-4b0d-996a-713970b3d71a-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-687bddbc7b-vr52v\" (UID: \"1938607c-447d-4b0d-996a-713970b3d71a\") " pod="openshift-monitoring/thanos-querier-687bddbc7b-vr52v"
Apr 22 15:09:03.783871 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:03.783751 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1938607c-447d-4b0d-996a-713970b3d71a-secret-grpc-tls\") pod \"thanos-querier-687bddbc7b-vr52v\" (UID: \"1938607c-447d-4b0d-996a-713970b3d71a\") " pod="openshift-monitoring/thanos-querier-687bddbc7b-vr52v"
Apr 22 15:09:03.783871 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:03.783774 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/1938607c-447d-4b0d-996a-713970b3d71a-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-687bddbc7b-vr52v\" (UID: \"1938607c-447d-4b0d-996a-713970b3d71a\") " pod="openshift-monitoring/thanos-querier-687bddbc7b-vr52v"
Apr 22 15:09:03.783871 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:03.783792 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1938607c-447d-4b0d-996a-713970b3d71a-metrics-client-ca\") pod \"thanos-querier-687bddbc7b-vr52v\" (UID: \"1938607c-447d-4b0d-996a-713970b3d71a\") " pod="openshift-monitoring/thanos-querier-687bddbc7b-vr52v"
Apr 22 15:09:03.786328 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:03.786274 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1938607c-447d-4b0d-996a-713970b3d71a-metrics-client-ca\") pod \"thanos-querier-687bddbc7b-vr52v\" (UID: \"1938607c-447d-4b0d-996a-713970b3d71a\") " pod="openshift-monitoring/thanos-querier-687bddbc7b-vr52v"
Apr 22 15:09:03.786808 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:03.786757 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/1938607c-447d-4b0d-996a-713970b3d71a-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-687bddbc7b-vr52v\" (UID: \"1938607c-447d-4b0d-996a-713970b3d71a\") " pod="openshift-monitoring/thanos-querier-687bddbc7b-vr52v"
Apr 22 15:09:03.786946 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:03.786886 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1938607c-447d-4b0d-996a-713970b3d71a-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-687bddbc7b-vr52v\" (UID: \"1938607c-447d-4b0d-996a-713970b3d71a\") " pod="openshift-monitoring/thanos-querier-687bddbc7b-vr52v"
Apr 22 15:09:03.786946 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:03.786934 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1938607c-447d-4b0d-996a-713970b3d71a-secret-grpc-tls\") pod \"thanos-querier-687bddbc7b-vr52v\" (UID: \"1938607c-447d-4b0d-996a-713970b3d71a\") " pod="openshift-monitoring/thanos-querier-687bddbc7b-vr52v"
Apr 22 15:09:03.787047 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:03.786956 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/1938607c-447d-4b0d-996a-713970b3d71a-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-687bddbc7b-vr52v\" (UID: \"1938607c-447d-4b0d-996a-713970b3d71a\") " pod="openshift-monitoring/thanos-querier-687bddbc7b-vr52v"
Apr 22 15:09:03.787134 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:03.787115 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1938607c-447d-4b0d-996a-713970b3d71a-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-687bddbc7b-vr52v\" (UID: \"1938607c-447d-4b0d-996a-713970b3d71a\") " pod="openshift-monitoring/thanos-querier-687bddbc7b-vr52v"
Apr 22 15:09:03.787170 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:03.787145 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/1938607c-447d-4b0d-996a-713970b3d71a-secret-thanos-querier-tls\") pod \"thanos-querier-687bddbc7b-vr52v\" (UID: \"1938607c-447d-4b0d-996a-713970b3d71a\") " pod="openshift-monitoring/thanos-querier-687bddbc7b-vr52v"
Apr 22 15:09:03.799774 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:03.799754 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x44sc\" (UniqueName: \"kubernetes.io/projected/1938607c-447d-4b0d-996a-713970b3d71a-kube-api-access-x44sc\") pod \"thanos-querier-687bddbc7b-vr52v\" (UID: \"1938607c-447d-4b0d-996a-713970b3d71a\") " pod="openshift-monitoring/thanos-querier-687bddbc7b-vr52v"
Apr 22 15:09:03.968440 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:03.968334 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-687bddbc7b-vr52v"
Apr 22 15:09:04.097462 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:04.095265 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-687bddbc7b-vr52v"]
Apr 22 15:09:04.100239 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:09:04.100215 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1938607c_447d_4b0d_996a_713970b3d71a.slice/crio-2270c78ada71f8d2fd17f4bdb901f50754141ba7c39d6254140049d41052b00c WatchSource:0}: Error finding container 2270c78ada71f8d2fd17f4bdb901f50754141ba7c39d6254140049d41052b00c: Status 404 returned error can't find the container with id 2270c78ada71f8d2fd17f4bdb901f50754141ba7c39d6254140049d41052b00c
Apr 22 15:09:04.271747 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:04.271642 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rzkn2" event={"ID":"74e35d6d-6b97-4484-8bf5-c855372fc51b","Type":"ContainerStarted","Data":"e97b1ad3af76ecfdaff3d4aa4567d1748b4c2dececa5bae63a1679014d7af9a2"}
Apr 22 15:09:04.271747 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:04.271711 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rzkn2" event={"ID":"74e35d6d-6b97-4484-8bf5-c855372fc51b","Type":"ContainerStarted","Data":"a62101e23f09f05083776b99fc3330739461ba9a1452ecc3642e5aba0f74342b"}
Apr 22 15:09:04.272711 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:04.272675 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-687bddbc7b-vr52v" event={"ID":"1938607c-447d-4b0d-996a-713970b3d71a","Type":"ContainerStarted","Data":"2270c78ada71f8d2fd17f4bdb901f50754141ba7c39d6254140049d41052b00c"}
Apr 22 15:09:04.293774 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:04.293726 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-rzkn2" podStartSLOduration=2.637643449 podStartE2EDuration="3.293711246s" podCreationTimestamp="2026-04-22 15:09:01 +0000 UTC" firstStartedPulling="2026-04-22 15:09:01.884867589 +0000 UTC m=+45.422039280" lastFinishedPulling="2026-04-22 15:09:02.540935385 +0000 UTC m=+46.078107077" observedRunningTime="2026-04-22 15:09:04.292269387 +0000 UTC m=+47.829441101" watchObservedRunningTime="2026-04-22 15:09:04.293711246 +0000 UTC m=+47.830882954"
Apr 22 15:09:06.115401 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.115364 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-59d69cf5d4-b9d5b"]
Apr 22 15:09:06.118910 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.118882 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-59d69cf5d4-b9d5b"
Apr 22 15:09:06.122572 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.122531 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 22 15:09:06.123843 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.123776 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 22 15:09:06.123970 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.123846 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 22 15:09:06.123970 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.123924 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-zvn49\""
Apr 22 15:09:06.124076 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.124000 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-eb3pp787rejhn\""
Apr 22 15:09:06.124076 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.124057 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 22 15:09:06.147246 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.147216 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-59d69cf5d4-b9d5b"]
Apr 22 15:09:06.204048 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.204000 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e23db7f1-d665-4550-8536-b62b4fc7f499-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-59d69cf5d4-b9d5b\" (UID: \"e23db7f1-d665-4550-8536-b62b4fc7f499\") " pod="openshift-monitoring/metrics-server-59d69cf5d4-b9d5b"
Apr 22 15:09:06.204245 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.204063 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/e23db7f1-d665-4550-8536-b62b4fc7f499-audit-log\") pod \"metrics-server-59d69cf5d4-b9d5b\" (UID: \"e23db7f1-d665-4550-8536-b62b4fc7f499\") " pod="openshift-monitoring/metrics-server-59d69cf5d4-b9d5b"
Apr 22 15:09:06.204245 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.204116 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/e23db7f1-d665-4550-8536-b62b4fc7f499-secret-metrics-server-tls\") pod \"metrics-server-59d69cf5d4-b9d5b\" (UID: \"e23db7f1-d665-4550-8536-b62b4fc7f499\") " pod="openshift-monitoring/metrics-server-59d69cf5d4-b9d5b"
Apr 22 15:09:06.204245 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.204141 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thf6d\" (UniqueName: \"kubernetes.io/projected/e23db7f1-d665-4550-8536-b62b4fc7f499-kube-api-access-thf6d\") pod \"metrics-server-59d69cf5d4-b9d5b\" (UID: \"e23db7f1-d665-4550-8536-b62b4fc7f499\") " pod="openshift-monitoring/metrics-server-59d69cf5d4-b9d5b"
Apr 22 15:09:06.204245 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.204230 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e23db7f1-d665-4550-8536-b62b4fc7f499-client-ca-bundle\") pod \"metrics-server-59d69cf5d4-b9d5b\" (UID: \"e23db7f1-d665-4550-8536-b62b4fc7f499\") " pod="openshift-monitoring/metrics-server-59d69cf5d4-b9d5b"
Apr 22 15:09:06.204452 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.204298 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/e23db7f1-d665-4550-8536-b62b4fc7f499-secret-metrics-server-client-certs\") pod \"metrics-server-59d69cf5d4-b9d5b\" (UID: \"e23db7f1-d665-4550-8536-b62b4fc7f499\") " pod="openshift-monitoring/metrics-server-59d69cf5d4-b9d5b"
Apr 22 15:09:06.204452 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.204344 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/e23db7f1-d665-4550-8536-b62b4fc7f499-metrics-server-audit-profiles\") pod \"metrics-server-59d69cf5d4-b9d5b\" (UID: \"e23db7f1-d665-4550-8536-b62b4fc7f499\") " pod="openshift-monitoring/metrics-server-59d69cf5d4-b9d5b"
Apr 22 15:09:06.305215 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.305179 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/e23db7f1-d665-4550-8536-b62b4fc7f499-secret-metrics-server-tls\") pod \"metrics-server-59d69cf5d4-b9d5b\" (UID: \"e23db7f1-d665-4550-8536-b62b4fc7f499\") " pod="openshift-monitoring/metrics-server-59d69cf5d4-b9d5b"
Apr 22 15:09:06.305215 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.305220 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thf6d\" (UniqueName: \"kubernetes.io/projected/e23db7f1-d665-4550-8536-b62b4fc7f499-kube-api-access-thf6d\") pod \"metrics-server-59d69cf5d4-b9d5b\" (UID: \"e23db7f1-d665-4550-8536-b62b4fc7f499\") " pod="openshift-monitoring/metrics-server-59d69cf5d4-b9d5b"
Apr 22 15:09:06.305475 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.305248 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e23db7f1-d665-4550-8536-b62b4fc7f499-client-ca-bundle\") pod \"metrics-server-59d69cf5d4-b9d5b\" (UID: \"e23db7f1-d665-4550-8536-b62b4fc7f499\") " pod="openshift-monitoring/metrics-server-59d69cf5d4-b9d5b"
Apr 22 15:09:06.305475 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.305290 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/e23db7f1-d665-4550-8536-b62b4fc7f499-secret-metrics-server-client-certs\") pod \"metrics-server-59d69cf5d4-b9d5b\" (UID: \"e23db7f1-d665-4550-8536-b62b4fc7f499\") " pod="openshift-monitoring/metrics-server-59d69cf5d4-b9d5b"
Apr 22 15:09:06.305475 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.305317 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/e23db7f1-d665-4550-8536-b62b4fc7f499-metrics-server-audit-profiles\") pod \"metrics-server-59d69cf5d4-b9d5b\" (UID: \"e23db7f1-d665-4550-8536-b62b4fc7f499\") " pod="openshift-monitoring/metrics-server-59d69cf5d4-b9d5b"
Apr 22 15:09:06.305475 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.305357 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e23db7f1-d665-4550-8536-b62b4fc7f499-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-59d69cf5d4-b9d5b\" (UID: \"e23db7f1-d665-4550-8536-b62b4fc7f499\") " pod="openshift-monitoring/metrics-server-59d69cf5d4-b9d5b"
Apr 22 15:09:06.305475 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.305396 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/e23db7f1-d665-4550-8536-b62b4fc7f499-audit-log\") pod \"metrics-server-59d69cf5d4-b9d5b\" (UID: \"e23db7f1-d665-4550-8536-b62b4fc7f499\") " pod="openshift-monitoring/metrics-server-59d69cf5d4-b9d5b"
Apr 22 15:09:06.305862 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.305815 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/e23db7f1-d665-4550-8536-b62b4fc7f499-audit-log\") pod \"metrics-server-59d69cf5d4-b9d5b\" (UID: \"e23db7f1-d665-4550-8536-b62b4fc7f499\") " pod="openshift-monitoring/metrics-server-59d69cf5d4-b9d5b"
Apr 22 15:09:06.306374 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.306327 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e23db7f1-d665-4550-8536-b62b4fc7f499-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-59d69cf5d4-b9d5b\" (UID: \"e23db7f1-d665-4550-8536-b62b4fc7f499\") " pod="openshift-monitoring/metrics-server-59d69cf5d4-b9d5b"
Apr 22 15:09:06.306588 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.306567 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/e23db7f1-d665-4550-8536-b62b4fc7f499-metrics-server-audit-profiles\") pod \"metrics-server-59d69cf5d4-b9d5b\" (UID: \"e23db7f1-d665-4550-8536-b62b4fc7f499\") " pod="openshift-monitoring/metrics-server-59d69cf5d4-b9d5b"
Apr 22 15:09:06.308256 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.308233 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/e23db7f1-d665-4550-8536-b62b4fc7f499-secret-metrics-server-tls\") pod \"metrics-server-59d69cf5d4-b9d5b\" (UID: \"e23db7f1-d665-4550-8536-b62b4fc7f499\") " pod="openshift-monitoring/metrics-server-59d69cf5d4-b9d5b"
Apr 22 15:09:06.308530 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.308504 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/e23db7f1-d665-4550-8536-b62b4fc7f499-secret-metrics-server-client-certs\") pod \"metrics-server-59d69cf5d4-b9d5b\" (UID: \"e23db7f1-d665-4550-8536-b62b4fc7f499\") " pod="openshift-monitoring/metrics-server-59d69cf5d4-b9d5b"
Apr 22 15:09:06.308855 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.308819 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e23db7f1-d665-4550-8536-b62b4fc7f499-client-ca-bundle\") pod \"metrics-server-59d69cf5d4-b9d5b\" (UID: \"e23db7f1-d665-4550-8536-b62b4fc7f499\") " pod="openshift-monitoring/metrics-server-59d69cf5d4-b9d5b"
Apr 22 15:09:06.335673 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.335632 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-thf6d\" (UniqueName: \"kubernetes.io/projected/e23db7f1-d665-4550-8536-b62b4fc7f499-kube-api-access-thf6d\") pod \"metrics-server-59d69cf5d4-b9d5b\" (UID: \"e23db7f1-d665-4550-8536-b62b4fc7f499\") " pod="openshift-monitoring/metrics-server-59d69cf5d4-b9d5b"
Apr 22 15:09:06.370160 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.370073 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-rwr98"] Apr 22 15:09:06.374836 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.374795 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rwr98" Apr 22 15:09:06.379009 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.378986 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 22 15:09:06.382562 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.382536 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-59qkh\"" Apr 22 15:09:06.395076 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.395053 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-rwr98"] Apr 22 15:09:06.406323 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.406286 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6c91dae2-8148-44b5-ab33-56ca9f634448-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-rwr98\" (UID: \"6c91dae2-8148-44b5-ab33-56ca9f634448\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rwr98" Apr 22 15:09:06.430309 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.430280 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-59d69cf5d4-b9d5b" Apr 22 15:09:06.508027 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.507629 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6c91dae2-8148-44b5-ab33-56ca9f634448-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-rwr98\" (UID: \"6c91dae2-8148-44b5-ab33-56ca9f634448\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rwr98" Apr 22 15:09:06.510055 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.510034 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6c91dae2-8148-44b5-ab33-56ca9f634448-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-rwr98\" (UID: \"6c91dae2-8148-44b5-ab33-56ca9f634448\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rwr98" Apr 22 15:09:06.584421 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.584394 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-59d69cf5d4-b9d5b"] Apr 22 15:09:06.588130 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:09:06.588104 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode23db7f1_d665_4550_8536_b62b4fc7f499.slice/crio-a88d263e5de8cb8b2e09b74b5262f5a45ec25c039528b78454330fd60b2dd4cf WatchSource:0}: Error finding container a88d263e5de8cb8b2e09b74b5262f5a45ec25c039528b78454330fd60b2dd4cf: Status 404 returned error can't find the container with id a88d263e5de8cb8b2e09b74b5262f5a45ec25c039528b78454330fd60b2dd4cf Apr 22 15:09:06.687164 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.687126 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rwr98" Apr 22 15:09:06.811032 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:06.810958 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-rwr98"] Apr 22 15:09:06.812977 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:09:06.812956 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c91dae2_8148_44b5_ab33_56ca9f634448.slice/crio-ed0cac356dbca2c4bcabd91a06a90457098eaea06ac05bc4bf20b41ce282f266 WatchSource:0}: Error finding container ed0cac356dbca2c4bcabd91a06a90457098eaea06ac05bc4bf20b41ce282f266: Status 404 returned error can't find the container with id ed0cac356dbca2c4bcabd91a06a90457098eaea06ac05bc4bf20b41ce282f266 Apr 22 15:09:07.285096 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:07.285056 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-687bddbc7b-vr52v" event={"ID":"1938607c-447d-4b0d-996a-713970b3d71a","Type":"ContainerStarted","Data":"f7ba2b37e04ed11daa569902d795fb68fd6b05b4a39547b85ec1d7edf0787c7b"} Apr 22 15:09:07.285096 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:07.285105 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-687bddbc7b-vr52v" event={"ID":"1938607c-447d-4b0d-996a-713970b3d71a","Type":"ContainerStarted","Data":"14c3b4236dacd9e6c3be42489da6bf1c21b3196f26ac8ac6b31430b3a21172af"} Apr 22 15:09:07.285617 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:07.285116 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-687bddbc7b-vr52v" event={"ID":"1938607c-447d-4b0d-996a-713970b3d71a","Type":"ContainerStarted","Data":"bbecdb036af17ec40cf349fe8efdb9701cabbdd31ca9dcf2cf3108a26729ec1f"} Apr 22 15:09:07.286546 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:07.286515 2569 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-59d69cf5d4-b9d5b" event={"ID":"e23db7f1-d665-4550-8536-b62b4fc7f499","Type":"ContainerStarted","Data":"a88d263e5de8cb8b2e09b74b5262f5a45ec25c039528b78454330fd60b2dd4cf"} Apr 22 15:09:07.287856 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:07.287828 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rwr98" event={"ID":"6c91dae2-8148-44b5-ab33-56ca9f634448","Type":"ContainerStarted","Data":"ed0cac356dbca2c4bcabd91a06a90457098eaea06ac05bc4bf20b41ce282f266"} Apr 22 15:09:08.171446 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.171412 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-78754fd6f8-ww5dm"] Apr 22 15:09:08.207621 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.207585 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78754fd6f8-ww5dm"] Apr 22 15:09:08.207799 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.207708 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-78754fd6f8-ww5dm" Apr 22 15:09:08.212305 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.212278 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 15:09:08.212452 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.212431 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 15:09:08.212514 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.212434 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 15:09:08.212953 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.212934 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 15:09:08.213067 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.212938 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 15:09:08.213528 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.213511 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 15:09:08.213582 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.213546 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 15:09:08.214132 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.214113 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-jtqk8\"" Apr 22 15:09:08.222510 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.222492 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 15:09:08.223481 ip-10-0-141-246 
kubenswrapper[2569]: I0422 15:09:08.223466 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c0072511-073e-4b6c-a7a6-066fa74b52f8-console-config\") pod \"console-78754fd6f8-ww5dm\" (UID: \"c0072511-073e-4b6c-a7a6-066fa74b52f8\") " pod="openshift-console/console-78754fd6f8-ww5dm" Apr 22 15:09:08.223542 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.223503 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c0072511-073e-4b6c-a7a6-066fa74b52f8-oauth-serving-cert\") pod \"console-78754fd6f8-ww5dm\" (UID: \"c0072511-073e-4b6c-a7a6-066fa74b52f8\") " pod="openshift-console/console-78754fd6f8-ww5dm" Apr 22 15:09:08.223542 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.223527 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbmg2\" (UniqueName: \"kubernetes.io/projected/c0072511-073e-4b6c-a7a6-066fa74b52f8-kube-api-access-nbmg2\") pod \"console-78754fd6f8-ww5dm\" (UID: \"c0072511-073e-4b6c-a7a6-066fa74b52f8\") " pod="openshift-console/console-78754fd6f8-ww5dm" Apr 22 15:09:08.223647 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.223580 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0072511-073e-4b6c-a7a6-066fa74b52f8-console-serving-cert\") pod \"console-78754fd6f8-ww5dm\" (UID: \"c0072511-073e-4b6c-a7a6-066fa74b52f8\") " pod="openshift-console/console-78754fd6f8-ww5dm" Apr 22 15:09:08.223647 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.223615 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c0072511-073e-4b6c-a7a6-066fa74b52f8-service-ca\") pod 
\"console-78754fd6f8-ww5dm\" (UID: \"c0072511-073e-4b6c-a7a6-066fa74b52f8\") " pod="openshift-console/console-78754fd6f8-ww5dm" Apr 22 15:09:08.223760 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.223744 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c0072511-073e-4b6c-a7a6-066fa74b52f8-console-oauth-config\") pod \"console-78754fd6f8-ww5dm\" (UID: \"c0072511-073e-4b6c-a7a6-066fa74b52f8\") " pod="openshift-console/console-78754fd6f8-ww5dm" Apr 22 15:09:08.223814 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.223775 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0072511-073e-4b6c-a7a6-066fa74b52f8-trusted-ca-bundle\") pod \"console-78754fd6f8-ww5dm\" (UID: \"c0072511-073e-4b6c-a7a6-066fa74b52f8\") " pod="openshift-console/console-78754fd6f8-ww5dm" Apr 22 15:09:08.257418 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.257389 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-76xbb" Apr 22 15:09:08.306060 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.306020 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-687bddbc7b-vr52v" event={"ID":"1938607c-447d-4b0d-996a-713970b3d71a","Type":"ContainerStarted","Data":"177ff8b12168354399fbfe26a743683c19228eec2114f65b8304cc24e0158cc5"} Apr 22 15:09:08.306060 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.306060 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-687bddbc7b-vr52v" event={"ID":"1938607c-447d-4b0d-996a-713970b3d71a","Type":"ContainerStarted","Data":"a9dcac3f0325a6ba6af6714722bcf169b760b56207b8214e04156c8061f9cc9d"} Apr 22 15:09:08.306471 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.306076 2569 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-687bddbc7b-vr52v" event={"ID":"1938607c-447d-4b0d-996a-713970b3d71a","Type":"ContainerStarted","Data":"03af0c43202163a6bfaaf40bc8123bc7f0db4e11bcb17d5ce1599c745b82fb2e"} Apr 22 15:09:08.306471 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.306284 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-687bddbc7b-vr52v" Apr 22 15:09:08.325055 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.325012 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nbmg2\" (UniqueName: \"kubernetes.io/projected/c0072511-073e-4b6c-a7a6-066fa74b52f8-kube-api-access-nbmg2\") pod \"console-78754fd6f8-ww5dm\" (UID: \"c0072511-073e-4b6c-a7a6-066fa74b52f8\") " pod="openshift-console/console-78754fd6f8-ww5dm" Apr 22 15:09:08.325055 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.325062 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0072511-073e-4b6c-a7a6-066fa74b52f8-console-serving-cert\") pod \"console-78754fd6f8-ww5dm\" (UID: \"c0072511-073e-4b6c-a7a6-066fa74b52f8\") " pod="openshift-console/console-78754fd6f8-ww5dm" Apr 22 15:09:08.325260 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.325087 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c0072511-073e-4b6c-a7a6-066fa74b52f8-service-ca\") pod \"console-78754fd6f8-ww5dm\" (UID: \"c0072511-073e-4b6c-a7a6-066fa74b52f8\") " pod="openshift-console/console-78754fd6f8-ww5dm" Apr 22 15:09:08.325260 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.325168 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c0072511-073e-4b6c-a7a6-066fa74b52f8-console-oauth-config\") pod 
\"console-78754fd6f8-ww5dm\" (UID: \"c0072511-073e-4b6c-a7a6-066fa74b52f8\") " pod="openshift-console/console-78754fd6f8-ww5dm" Apr 22 15:09:08.325260 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.325200 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0072511-073e-4b6c-a7a6-066fa74b52f8-trusted-ca-bundle\") pod \"console-78754fd6f8-ww5dm\" (UID: \"c0072511-073e-4b6c-a7a6-066fa74b52f8\") " pod="openshift-console/console-78754fd6f8-ww5dm" Apr 22 15:09:08.325375 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.325295 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c0072511-073e-4b6c-a7a6-066fa74b52f8-console-config\") pod \"console-78754fd6f8-ww5dm\" (UID: \"c0072511-073e-4b6c-a7a6-066fa74b52f8\") " pod="openshift-console/console-78754fd6f8-ww5dm" Apr 22 15:09:08.325462 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.325440 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c0072511-073e-4b6c-a7a6-066fa74b52f8-oauth-serving-cert\") pod \"console-78754fd6f8-ww5dm\" (UID: \"c0072511-073e-4b6c-a7a6-066fa74b52f8\") " pod="openshift-console/console-78754fd6f8-ww5dm" Apr 22 15:09:08.326219 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.326186 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c0072511-073e-4b6c-a7a6-066fa74b52f8-oauth-serving-cert\") pod \"console-78754fd6f8-ww5dm\" (UID: \"c0072511-073e-4b6c-a7a6-066fa74b52f8\") " pod="openshift-console/console-78754fd6f8-ww5dm" Apr 22 15:09:08.326903 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.326877 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/c0072511-073e-4b6c-a7a6-066fa74b52f8-console-config\") pod \"console-78754fd6f8-ww5dm\" (UID: \"c0072511-073e-4b6c-a7a6-066fa74b52f8\") " pod="openshift-console/console-78754fd6f8-ww5dm" Apr 22 15:09:08.327533 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.327504 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c0072511-073e-4b6c-a7a6-066fa74b52f8-service-ca\") pod \"console-78754fd6f8-ww5dm\" (UID: \"c0072511-073e-4b6c-a7a6-066fa74b52f8\") " pod="openshift-console/console-78754fd6f8-ww5dm" Apr 22 15:09:08.329125 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.329080 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0072511-073e-4b6c-a7a6-066fa74b52f8-console-serving-cert\") pod \"console-78754fd6f8-ww5dm\" (UID: \"c0072511-073e-4b6c-a7a6-066fa74b52f8\") " pod="openshift-console/console-78754fd6f8-ww5dm" Apr 22 15:09:08.329125 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.329099 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c0072511-073e-4b6c-a7a6-066fa74b52f8-console-oauth-config\") pod \"console-78754fd6f8-ww5dm\" (UID: \"c0072511-073e-4b6c-a7a6-066fa74b52f8\") " pod="openshift-console/console-78754fd6f8-ww5dm" Apr 22 15:09:08.338809 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.338782 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0072511-073e-4b6c-a7a6-066fa74b52f8-trusted-ca-bundle\") pod \"console-78754fd6f8-ww5dm\" (UID: \"c0072511-073e-4b6c-a7a6-066fa74b52f8\") " pod="openshift-console/console-78754fd6f8-ww5dm" Apr 22 15:09:08.340640 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.340556 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nbmg2\" (UniqueName: \"kubernetes.io/projected/c0072511-073e-4b6c-a7a6-066fa74b52f8-kube-api-access-nbmg2\") pod \"console-78754fd6f8-ww5dm\" (UID: \"c0072511-073e-4b6c-a7a6-066fa74b52f8\") " pod="openshift-console/console-78754fd6f8-ww5dm" Apr 22 15:09:08.517274 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:08.517180 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78754fd6f8-ww5dm" Apr 22 15:09:09.014791 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:09.014673 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-687bddbc7b-vr52v" podStartSLOduration=2.556880645 podStartE2EDuration="6.014650656s" podCreationTimestamp="2026-04-22 15:09:03 +0000 UTC" firstStartedPulling="2026-04-22 15:09:04.102151561 +0000 UTC m=+47.639323253" lastFinishedPulling="2026-04-22 15:09:07.559921544 +0000 UTC m=+51.097093264" observedRunningTime="2026-04-22 15:09:08.351111438 +0000 UTC m=+51.888283155" watchObservedRunningTime="2026-04-22 15:09:09.014650656 +0000 UTC m=+52.551822370" Apr 22 15:09:09.014964 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:09.014946 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78754fd6f8-ww5dm"] Apr 22 15:09:09.017654 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:09:09.017625 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0072511_073e_4b6c_a7a6_066fa74b52f8.slice/crio-af441e339f5f4a4ad200b359165289830b7ac0a5f32842151ca558faa640c8d3 WatchSource:0}: Error finding container af441e339f5f4a4ad200b359165289830b7ac0a5f32842151ca558faa640c8d3: Status 404 returned error can't find the container with id af441e339f5f4a4ad200b359165289830b7ac0a5f32842151ca558faa640c8d3 Apr 22 15:09:09.311590 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:09.311500 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/metrics-server-59d69cf5d4-b9d5b" event={"ID":"e23db7f1-d665-4550-8536-b62b4fc7f499","Type":"ContainerStarted","Data":"d66cb992e0b48103fb5bb94127cd2e5b9507b8790086b28a67b86a95060bdd84"} Apr 22 15:09:09.312855 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:09.312821 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rwr98" event={"ID":"6c91dae2-8148-44b5-ab33-56ca9f634448","Type":"ContainerStarted","Data":"94401221e735e2948da0bda835e055eb74c7758a9a095a2536dee3a13bac5fa0"} Apr 22 15:09:09.313056 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:09.313033 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rwr98" Apr 22 15:09:09.314025 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:09.314004 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78754fd6f8-ww5dm" event={"ID":"c0072511-073e-4b6c-a7a6-066fa74b52f8","Type":"ContainerStarted","Data":"af441e339f5f4a4ad200b359165289830b7ac0a5f32842151ca558faa640c8d3"} Apr 22 15:09:09.318401 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:09.318380 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rwr98" Apr 22 15:09:09.333366 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:09.332979 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-59d69cf5d4-b9d5b" podStartSLOduration=1.033408828 podStartE2EDuration="3.332960602s" podCreationTimestamp="2026-04-22 15:09:06 +0000 UTC" firstStartedPulling="2026-04-22 15:09:06.590001479 +0000 UTC m=+50.127173170" lastFinishedPulling="2026-04-22 15:09:08.889553242 +0000 UTC m=+52.426724944" observedRunningTime="2026-04-22 15:09:09.33127073 +0000 UTC m=+52.868442448" watchObservedRunningTime="2026-04-22 15:09:09.332960602 +0000 UTC m=+52.870132317" Apr 22 
15:09:09.349065 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:09.349012 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rwr98" podStartSLOduration=1.272015976 podStartE2EDuration="3.348995555s" podCreationTimestamp="2026-04-22 15:09:06 +0000 UTC" firstStartedPulling="2026-04-22 15:09:06.814782269 +0000 UTC m=+50.351953961" lastFinishedPulling="2026-04-22 15:09:08.891761835 +0000 UTC m=+52.428933540" observedRunningTime="2026-04-22 15:09:09.347786761 +0000 UTC m=+52.884958474" watchObservedRunningTime="2026-04-22 15:09:09.348995555 +0000 UTC m=+52.886167274" Apr 22 15:09:12.334784 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:12.334753 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-79794f7c48-n6p5w"] Apr 22 15:09:12.348043 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:12.348019 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-79794f7c48-n6p5w" Apr 22 15:09:12.350769 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:12.350744 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79794f7c48-n6p5w"] Apr 22 15:09:12.464760 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:12.464731 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7908ea56-50d3-4383-8313-2d36f77a6fc4-console-config\") pod \"console-79794f7c48-n6p5w\" (UID: \"7908ea56-50d3-4383-8313-2d36f77a6fc4\") " pod="openshift-console/console-79794f7c48-n6p5w" Apr 22 15:09:12.464888 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:12.464763 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7908ea56-50d3-4383-8313-2d36f77a6fc4-trusted-ca-bundle\") pod \"console-79794f7c48-n6p5w\" (UID: 
\"7908ea56-50d3-4383-8313-2d36f77a6fc4\") " pod="openshift-console/console-79794f7c48-n6p5w" Apr 22 15:09:12.464888 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:12.464797 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7908ea56-50d3-4383-8313-2d36f77a6fc4-console-oauth-config\") pod \"console-79794f7c48-n6p5w\" (UID: \"7908ea56-50d3-4383-8313-2d36f77a6fc4\") " pod="openshift-console/console-79794f7c48-n6p5w" Apr 22 15:09:12.464888 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:12.464867 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7908ea56-50d3-4383-8313-2d36f77a6fc4-console-serving-cert\") pod \"console-79794f7c48-n6p5w\" (UID: \"7908ea56-50d3-4383-8313-2d36f77a6fc4\") " pod="openshift-console/console-79794f7c48-n6p5w" Apr 22 15:09:12.464888 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:12.464886 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7908ea56-50d3-4383-8313-2d36f77a6fc4-service-ca\") pod \"console-79794f7c48-n6p5w\" (UID: \"7908ea56-50d3-4383-8313-2d36f77a6fc4\") " pod="openshift-console/console-79794f7c48-n6p5w" Apr 22 15:09:12.465027 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:12.464905 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7908ea56-50d3-4383-8313-2d36f77a6fc4-oauth-serving-cert\") pod \"console-79794f7c48-n6p5w\" (UID: \"7908ea56-50d3-4383-8313-2d36f77a6fc4\") " pod="openshift-console/console-79794f7c48-n6p5w" Apr 22 15:09:12.465027 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:12.464941 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-5qd9x\" (UniqueName: \"kubernetes.io/projected/7908ea56-50d3-4383-8313-2d36f77a6fc4-kube-api-access-5qd9x\") pod \"console-79794f7c48-n6p5w\" (UID: \"7908ea56-50d3-4383-8313-2d36f77a6fc4\") " pod="openshift-console/console-79794f7c48-n6p5w" Apr 22 15:09:12.566108 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:12.566053 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5qd9x\" (UniqueName: \"kubernetes.io/projected/7908ea56-50d3-4383-8313-2d36f77a6fc4-kube-api-access-5qd9x\") pod \"console-79794f7c48-n6p5w\" (UID: \"7908ea56-50d3-4383-8313-2d36f77a6fc4\") " pod="openshift-console/console-79794f7c48-n6p5w" Apr 22 15:09:12.566108 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:12.566115 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7908ea56-50d3-4383-8313-2d36f77a6fc4-console-config\") pod \"console-79794f7c48-n6p5w\" (UID: \"7908ea56-50d3-4383-8313-2d36f77a6fc4\") " pod="openshift-console/console-79794f7c48-n6p5w" Apr 22 15:09:12.566351 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:12.566147 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7908ea56-50d3-4383-8313-2d36f77a6fc4-trusted-ca-bundle\") pod \"console-79794f7c48-n6p5w\" (UID: \"7908ea56-50d3-4383-8313-2d36f77a6fc4\") " pod="openshift-console/console-79794f7c48-n6p5w" Apr 22 15:09:12.566351 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:12.566198 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7908ea56-50d3-4383-8313-2d36f77a6fc4-console-oauth-config\") pod \"console-79794f7c48-n6p5w\" (UID: \"7908ea56-50d3-4383-8313-2d36f77a6fc4\") " pod="openshift-console/console-79794f7c48-n6p5w" Apr 22 15:09:12.566351 ip-10-0-141-246 kubenswrapper[2569]: I0422 
15:09:12.566252 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7908ea56-50d3-4383-8313-2d36f77a6fc4-console-serving-cert\") pod \"console-79794f7c48-n6p5w\" (UID: \"7908ea56-50d3-4383-8313-2d36f77a6fc4\") " pod="openshift-console/console-79794f7c48-n6p5w"
Apr 22 15:09:12.566351 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:12.566283 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7908ea56-50d3-4383-8313-2d36f77a6fc4-service-ca\") pod \"console-79794f7c48-n6p5w\" (UID: \"7908ea56-50d3-4383-8313-2d36f77a6fc4\") " pod="openshift-console/console-79794f7c48-n6p5w"
Apr 22 15:09:12.566351 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:12.566317 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7908ea56-50d3-4383-8313-2d36f77a6fc4-oauth-serving-cert\") pod \"console-79794f7c48-n6p5w\" (UID: \"7908ea56-50d3-4383-8313-2d36f77a6fc4\") " pod="openshift-console/console-79794f7c48-n6p5w"
Apr 22 15:09:12.567324 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:12.567264 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7908ea56-50d3-4383-8313-2d36f77a6fc4-console-config\") pod \"console-79794f7c48-n6p5w\" (UID: \"7908ea56-50d3-4383-8313-2d36f77a6fc4\") " pod="openshift-console/console-79794f7c48-n6p5w"
Apr 22 15:09:12.567457 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:12.567414 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7908ea56-50d3-4383-8313-2d36f77a6fc4-service-ca\") pod \"console-79794f7c48-n6p5w\" (UID: \"7908ea56-50d3-4383-8313-2d36f77a6fc4\") " pod="openshift-console/console-79794f7c48-n6p5w"
Apr 22 15:09:12.567734 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:12.567668 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7908ea56-50d3-4383-8313-2d36f77a6fc4-trusted-ca-bundle\") pod \"console-79794f7c48-n6p5w\" (UID: \"7908ea56-50d3-4383-8313-2d36f77a6fc4\") " pod="openshift-console/console-79794f7c48-n6p5w"
Apr 22 15:09:12.567878 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:12.567859 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7908ea56-50d3-4383-8313-2d36f77a6fc4-oauth-serving-cert\") pod \"console-79794f7c48-n6p5w\" (UID: \"7908ea56-50d3-4383-8313-2d36f77a6fc4\") " pod="openshift-console/console-79794f7c48-n6p5w"
Apr 22 15:09:12.575389 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:12.575362 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qd9x\" (UniqueName: \"kubernetes.io/projected/7908ea56-50d3-4383-8313-2d36f77a6fc4-kube-api-access-5qd9x\") pod \"console-79794f7c48-n6p5w\" (UID: \"7908ea56-50d3-4383-8313-2d36f77a6fc4\") " pod="openshift-console/console-79794f7c48-n6p5w"
Apr 22 15:09:12.581824 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:12.581773 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7908ea56-50d3-4383-8313-2d36f77a6fc4-console-oauth-config\") pod \"console-79794f7c48-n6p5w\" (UID: \"7908ea56-50d3-4383-8313-2d36f77a6fc4\") " pod="openshift-console/console-79794f7c48-n6p5w"
Apr 22 15:09:12.581946 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:12.581892 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7908ea56-50d3-4383-8313-2d36f77a6fc4-console-serving-cert\") pod \"console-79794f7c48-n6p5w\" (UID: \"7908ea56-50d3-4383-8313-2d36f77a6fc4\") " pod="openshift-console/console-79794f7c48-n6p5w"
Apr 22 15:09:12.657909 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:12.657820 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-79794f7c48-n6p5w"
Apr 22 15:09:12.797066 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:12.796994 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79794f7c48-n6p5w"]
Apr 22 15:09:12.799794 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:09:12.799752 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7908ea56_50d3_4383_8313_2d36f77a6fc4.slice/crio-071347c411882fac44e118cec69993df22bc40e8751c5362dbd79ae29228c908 WatchSource:0}: Error finding container 071347c411882fac44e118cec69993df22bc40e8751c5362dbd79ae29228c908: Status 404 returned error can't find the container with id 071347c411882fac44e118cec69993df22bc40e8751c5362dbd79ae29228c908
Apr 22 15:09:13.327677 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:13.327582 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78754fd6f8-ww5dm" event={"ID":"c0072511-073e-4b6c-a7a6-066fa74b52f8","Type":"ContainerStarted","Data":"a6bcff59b4b232087db49fa94ace7cf733d3b0576ee6dea99f8652f7a924eb2b"}
Apr 22 15:09:13.329072 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:13.329040 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79794f7c48-n6p5w" event={"ID":"7908ea56-50d3-4383-8313-2d36f77a6fc4","Type":"ContainerStarted","Data":"220764d3982f9af4726a477328d26e55a27615366a2be924d2529dfd3b74f16f"}
Apr 22 15:09:13.329072 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:13.329069 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79794f7c48-n6p5w" event={"ID":"7908ea56-50d3-4383-8313-2d36f77a6fc4","Type":"ContainerStarted","Data":"071347c411882fac44e118cec69993df22bc40e8751c5362dbd79ae29228c908"}
Apr 22 15:09:13.345878 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:13.345827 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-78754fd6f8-ww5dm" podStartSLOduration=1.962075343 podStartE2EDuration="5.345808981s" podCreationTimestamp="2026-04-22 15:09:08 +0000 UTC" firstStartedPulling="2026-04-22 15:09:09.019588714 +0000 UTC m=+52.556760405" lastFinishedPulling="2026-04-22 15:09:12.403322338 +0000 UTC m=+55.940494043" observedRunningTime="2026-04-22 15:09:13.345355414 +0000 UTC m=+56.882527127" watchObservedRunningTime="2026-04-22 15:09:13.345808981 +0000 UTC m=+56.882980696"
Apr 22 15:09:13.363351 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:13.363304 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-79794f7c48-n6p5w" podStartSLOduration=1.363289918 podStartE2EDuration="1.363289918s" podCreationTimestamp="2026-04-22 15:09:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:09:13.362533034 +0000 UTC m=+56.899704749" watchObservedRunningTime="2026-04-22 15:09:13.363289918 +0000 UTC m=+56.900461631"
Apr 22 15:09:14.321616 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:14.321588 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-687bddbc7b-vr52v"
Apr 22 15:09:16.239933 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:16.239905 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5cdb9c7cd9-75b2w"
Apr 22 15:09:18.518178 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:18.518140 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-78754fd6f8-ww5dm"
Apr 22 15:09:18.518583 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:18.518194 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-78754fd6f8-ww5dm"
Apr 22 15:09:18.523173 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:18.523149 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-78754fd6f8-ww5dm"
Apr 22 15:09:19.042257 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:19.042232 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cq6md"
Apr 22 15:09:19.349528 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:19.349496 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-78754fd6f8-ww5dm"
Apr 22 15:09:22.658486 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:22.658449 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-79794f7c48-n6p5w"
Apr 22 15:09:22.658877 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:22.658532 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-79794f7c48-n6p5w"
Apr 22 15:09:22.663247 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:22.663227 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-79794f7c48-n6p5w"
Apr 22 15:09:22.755563 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:22.755529 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d868504-055f-463c-b932-801175d669c7-metrics-certs\") pod \"network-metrics-daemon-hzm72\" (UID: \"9d868504-055f-463c-b932-801175d669c7\") " pod="openshift-multus/network-metrics-daemon-hzm72"
Apr 22 15:09:22.758358 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:22.758336 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 15:09:22.768676 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:22.768657 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d868504-055f-463c-b932-801175d669c7-metrics-certs\") pod \"network-metrics-daemon-hzm72\" (UID: \"9d868504-055f-463c-b932-801175d669c7\") " pod="openshift-multus/network-metrics-daemon-hzm72"
Apr 22 15:09:22.856382 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:22.856352 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xmkpt\" (UniqueName: \"kubernetes.io/projected/987b673f-d105-40ae-8ec9-b8dab14f068f-kube-api-access-xmkpt\") pod \"network-check-target-wfpb4\" (UID: \"987b673f-d105-40ae-8ec9-b8dab14f068f\") " pod="openshift-network-diagnostics/network-check-target-wfpb4"
Apr 22 15:09:22.859117 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:22.859099 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 15:09:22.870001 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:22.869981 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 15:09:22.880561 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:22.880538 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmkpt\" (UniqueName: \"kubernetes.io/projected/987b673f-d105-40ae-8ec9-b8dab14f068f-kube-api-access-xmkpt\") pod \"network-check-target-wfpb4\" (UID: \"987b673f-d105-40ae-8ec9-b8dab14f068f\") " pod="openshift-network-diagnostics/network-check-target-wfpb4"
Apr 22 15:09:22.969983 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:22.969917 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-pdgcv\""
Apr 22 15:09:22.977314 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:22.977292 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hzm72"
Apr 22 15:09:23.068861 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:23.068834 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-lqzj2\""
Apr 22 15:09:23.076339 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:23.076316 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wfpb4"
Apr 22 15:09:23.113606 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:23.113575 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hzm72"]
Apr 22 15:09:23.116899 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:09:23.116875 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d868504_055f_463c_b932_801175d669c7.slice/crio-602e6cef7619286f92301a8097545ef719b224fab20a5d216b66b16414f261f9 WatchSource:0}: Error finding container 602e6cef7619286f92301a8097545ef719b224fab20a5d216b66b16414f261f9: Status 404 returned error can't find the container with id 602e6cef7619286f92301a8097545ef719b224fab20a5d216b66b16414f261f9
Apr 22 15:09:23.191710 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:23.191664 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-wfpb4"]
Apr 22 15:09:23.194393 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:09:23.194365 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod987b673f_d105_40ae_8ec9_b8dab14f068f.slice/crio-1e1faad9c41a7959c8083cc5f3b6af2d0c986fc1b7781d34b1bb2ee1643c0d80 WatchSource:0}: Error finding container 1e1faad9c41a7959c8083cc5f3b6af2d0c986fc1b7781d34b1bb2ee1643c0d80: Status 404 returned error can't find the container with id 1e1faad9c41a7959c8083cc5f3b6af2d0c986fc1b7781d34b1bb2ee1643c0d80
Apr 22 15:09:23.356516 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:23.356479 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-wfpb4" event={"ID":"987b673f-d105-40ae-8ec9-b8dab14f068f","Type":"ContainerStarted","Data":"1e1faad9c41a7959c8083cc5f3b6af2d0c986fc1b7781d34b1bb2ee1643c0d80"}
Apr 22 15:09:23.357526 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:23.357497 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hzm72" event={"ID":"9d868504-055f-463c-b932-801175d669c7","Type":"ContainerStarted","Data":"602e6cef7619286f92301a8097545ef719b224fab20a5d216b66b16414f261f9"}
Apr 22 15:09:23.361383 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:23.361361 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-79794f7c48-n6p5w"
Apr 22 15:09:23.410626 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:23.410596 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-78754fd6f8-ww5dm"]
Apr 22 15:09:25.368283 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:25.368245 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hzm72" event={"ID":"9d868504-055f-463c-b932-801175d669c7","Type":"ContainerStarted","Data":"ae375b565ede38433ed870af0cb7e1d8d1c21c47128317bd2a5c18db020155d5"}
Apr 22 15:09:25.368764 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:25.368289 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hzm72" event={"ID":"9d868504-055f-463c-b932-801175d669c7","Type":"ContainerStarted","Data":"7363bb2faf3c2cb0bcc45284dc56526824e393e2ea8413b9d58ef34877c4f0fa"}
Apr 22 15:09:25.388104 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:25.388052 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-hzm72" podStartSLOduration=67.103052821 podStartE2EDuration="1m8.388033759s" podCreationTimestamp="2026-04-22 15:08:17 +0000 UTC" firstStartedPulling="2026-04-22 15:09:23.119213698 +0000 UTC m=+66.656385390" lastFinishedPulling="2026-04-22 15:09:24.404194637 +0000 UTC m=+67.941366328" observedRunningTime="2026-04-22 15:09:25.386957207 +0000 UTC m=+68.924128922" watchObservedRunningTime="2026-04-22 15:09:25.388033759 +0000 UTC m=+68.925205478"
Apr 22 15:09:26.372405 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:26.372366 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-wfpb4" event={"ID":"987b673f-d105-40ae-8ec9-b8dab14f068f","Type":"ContainerStarted","Data":"1a044f385c19e4d962926d86c3221da56731b9dda84af569aafd150bfdef3863"}
Apr 22 15:09:26.372796 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:26.372764 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-wfpb4"
Apr 22 15:09:26.393004 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:26.392964 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-wfpb4" podStartSLOduration=66.543807272 podStartE2EDuration="1m9.392953012s" podCreationTimestamp="2026-04-22 15:08:17 +0000 UTC" firstStartedPulling="2026-04-22 15:09:23.19647146 +0000 UTC m=+66.733643152" lastFinishedPulling="2026-04-22 15:09:26.045617199 +0000 UTC m=+69.582788892" observedRunningTime="2026-04-22 15:09:26.391318248 +0000 UTC m=+69.928489962" watchObservedRunningTime="2026-04-22 15:09:26.392953012 +0000 UTC m=+69.930124726"
Apr 22 15:09:26.430956 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:26.430923 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-59d69cf5d4-b9d5b"
Apr 22 15:09:26.431078 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:26.430999 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-59d69cf5d4-b9d5b"
Apr 22 15:09:46.436766 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:46.436644 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-59d69cf5d4-b9d5b"
Apr 22 15:09:46.440564 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:46.440541 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-59d69cf5d4-b9d5b"
Apr 22 15:09:47.218388 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:47.218350 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-74b7646c6b-c6g4t"]
Apr 22 15:09:47.227952 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:47.227923 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74b7646c6b-c6g4t"
Apr 22 15:09:47.231926 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:47.231894 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74b7646c6b-c6g4t"]
Apr 22 15:09:47.346237 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:47.346194 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6cc98e9f-7567-4da1-99f2-2ace8953c61f-console-config\") pod \"console-74b7646c6b-c6g4t\" (UID: \"6cc98e9f-7567-4da1-99f2-2ace8953c61f\") " pod="openshift-console/console-74b7646c6b-c6g4t"
Apr 22 15:09:47.346237 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:47.346238 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6cc98e9f-7567-4da1-99f2-2ace8953c61f-service-ca\") pod \"console-74b7646c6b-c6g4t\" (UID: \"6cc98e9f-7567-4da1-99f2-2ace8953c61f\") " pod="openshift-console/console-74b7646c6b-c6g4t"
Apr 22 15:09:47.346512 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:47.346305 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6cc98e9f-7567-4da1-99f2-2ace8953c61f-console-oauth-config\") pod \"console-74b7646c6b-c6g4t\" (UID: \"6cc98e9f-7567-4da1-99f2-2ace8953c61f\") " pod="openshift-console/console-74b7646c6b-c6g4t"
Apr 22 15:09:47.346512 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:47.346368 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6cc98e9f-7567-4da1-99f2-2ace8953c61f-console-serving-cert\") pod \"console-74b7646c6b-c6g4t\" (UID: \"6cc98e9f-7567-4da1-99f2-2ace8953c61f\") " pod="openshift-console/console-74b7646c6b-c6g4t"
Apr 22 15:09:47.346512 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:47.346385 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6cc98e9f-7567-4da1-99f2-2ace8953c61f-oauth-serving-cert\") pod \"console-74b7646c6b-c6g4t\" (UID: \"6cc98e9f-7567-4da1-99f2-2ace8953c61f\") " pod="openshift-console/console-74b7646c6b-c6g4t"
Apr 22 15:09:47.346512 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:47.346407 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq5cg\" (UniqueName: \"kubernetes.io/projected/6cc98e9f-7567-4da1-99f2-2ace8953c61f-kube-api-access-nq5cg\") pod \"console-74b7646c6b-c6g4t\" (UID: \"6cc98e9f-7567-4da1-99f2-2ace8953c61f\") " pod="openshift-console/console-74b7646c6b-c6g4t"
Apr 22 15:09:47.346512 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:47.346428 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6cc98e9f-7567-4da1-99f2-2ace8953c61f-trusted-ca-bundle\") pod \"console-74b7646c6b-c6g4t\" (UID: \"6cc98e9f-7567-4da1-99f2-2ace8953c61f\") " pod="openshift-console/console-74b7646c6b-c6g4t"
Apr 22 15:09:47.447059 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:47.447021 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6cc98e9f-7567-4da1-99f2-2ace8953c61f-console-config\") pod \"console-74b7646c6b-c6g4t\" (UID: \"6cc98e9f-7567-4da1-99f2-2ace8953c61f\") " pod="openshift-console/console-74b7646c6b-c6g4t"
Apr 22 15:09:47.447059 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:47.447056 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6cc98e9f-7567-4da1-99f2-2ace8953c61f-service-ca\") pod \"console-74b7646c6b-c6g4t\" (UID: \"6cc98e9f-7567-4da1-99f2-2ace8953c61f\") " pod="openshift-console/console-74b7646c6b-c6g4t"
Apr 22 15:09:47.447558 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:47.447090 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6cc98e9f-7567-4da1-99f2-2ace8953c61f-console-oauth-config\") pod \"console-74b7646c6b-c6g4t\" (UID: \"6cc98e9f-7567-4da1-99f2-2ace8953c61f\") " pod="openshift-console/console-74b7646c6b-c6g4t"
Apr 22 15:09:47.447558 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:47.447128 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6cc98e9f-7567-4da1-99f2-2ace8953c61f-console-serving-cert\") pod \"console-74b7646c6b-c6g4t\" (UID: \"6cc98e9f-7567-4da1-99f2-2ace8953c61f\") " pod="openshift-console/console-74b7646c6b-c6g4t"
Apr 22 15:09:47.447558 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:47.447150 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6cc98e9f-7567-4da1-99f2-2ace8953c61f-oauth-serving-cert\") pod \"console-74b7646c6b-c6g4t\" (UID: \"6cc98e9f-7567-4da1-99f2-2ace8953c61f\") " pod="openshift-console/console-74b7646c6b-c6g4t"
Apr 22 15:09:47.447558 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:47.447179 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nq5cg\" (UniqueName: \"kubernetes.io/projected/6cc98e9f-7567-4da1-99f2-2ace8953c61f-kube-api-access-nq5cg\") pod \"console-74b7646c6b-c6g4t\" (UID: \"6cc98e9f-7567-4da1-99f2-2ace8953c61f\") " pod="openshift-console/console-74b7646c6b-c6g4t"
Apr 22 15:09:47.447558 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:47.447207 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6cc98e9f-7567-4da1-99f2-2ace8953c61f-trusted-ca-bundle\") pod \"console-74b7646c6b-c6g4t\" (UID: \"6cc98e9f-7567-4da1-99f2-2ace8953c61f\") " pod="openshift-console/console-74b7646c6b-c6g4t"
Apr 22 15:09:47.447878 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:47.447850 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6cc98e9f-7567-4da1-99f2-2ace8953c61f-service-ca\") pod \"console-74b7646c6b-c6g4t\" (UID: \"6cc98e9f-7567-4da1-99f2-2ace8953c61f\") " pod="openshift-console/console-74b7646c6b-c6g4t"
Apr 22 15:09:47.447944 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:47.447919 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6cc98e9f-7567-4da1-99f2-2ace8953c61f-console-config\") pod \"console-74b7646c6b-c6g4t\" (UID: \"6cc98e9f-7567-4da1-99f2-2ace8953c61f\") " pod="openshift-console/console-74b7646c6b-c6g4t"
Apr 22 15:09:47.448020 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:47.447998 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6cc98e9f-7567-4da1-99f2-2ace8953c61f-oauth-serving-cert\") pod \"console-74b7646c6b-c6g4t\" (UID: \"6cc98e9f-7567-4da1-99f2-2ace8953c61f\") " pod="openshift-console/console-74b7646c6b-c6g4t"
Apr 22 15:09:47.448092 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:47.448076 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6cc98e9f-7567-4da1-99f2-2ace8953c61f-trusted-ca-bundle\") pod \"console-74b7646c6b-c6g4t\" (UID: \"6cc98e9f-7567-4da1-99f2-2ace8953c61f\") " pod="openshift-console/console-74b7646c6b-c6g4t"
Apr 22 15:09:47.450087 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:47.450067 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6cc98e9f-7567-4da1-99f2-2ace8953c61f-console-oauth-config\") pod \"console-74b7646c6b-c6g4t\" (UID: \"6cc98e9f-7567-4da1-99f2-2ace8953c61f\") " pod="openshift-console/console-74b7646c6b-c6g4t"
Apr 22 15:09:47.450187 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:47.450168 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6cc98e9f-7567-4da1-99f2-2ace8953c61f-console-serving-cert\") pod \"console-74b7646c6b-c6g4t\" (UID: \"6cc98e9f-7567-4da1-99f2-2ace8953c61f\") " pod="openshift-console/console-74b7646c6b-c6g4t"
Apr 22 15:09:47.461036 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:47.461012 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq5cg\" (UniqueName: \"kubernetes.io/projected/6cc98e9f-7567-4da1-99f2-2ace8953c61f-kube-api-access-nq5cg\") pod \"console-74b7646c6b-c6g4t\" (UID: \"6cc98e9f-7567-4da1-99f2-2ace8953c61f\") " pod="openshift-console/console-74b7646c6b-c6g4t"
Apr 22 15:09:47.538268 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:47.538182 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74b7646c6b-c6g4t"
Apr 22 15:09:47.656612 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:47.656586 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74b7646c6b-c6g4t"]
Apr 22 15:09:47.659347 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:09:47.659317 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cc98e9f_7567_4da1_99f2_2ace8953c61f.slice/crio-148259bf113da653e908ca063b8040a0589dd49db06719aa99b73aede751f40c WatchSource:0}: Error finding container 148259bf113da653e908ca063b8040a0589dd49db06719aa99b73aede751f40c: Status 404 returned error can't find the container with id 148259bf113da653e908ca063b8040a0589dd49db06719aa99b73aede751f40c
Apr 22 15:09:48.432914 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:48.432879 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74b7646c6b-c6g4t" event={"ID":"6cc98e9f-7567-4da1-99f2-2ace8953c61f","Type":"ContainerStarted","Data":"e2b545ab370739948eda2996a362100505fb731e5671c816ba894ae58ba34821"}
Apr 22 15:09:48.432914 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:48.432919 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74b7646c6b-c6g4t" event={"ID":"6cc98e9f-7567-4da1-99f2-2ace8953c61f","Type":"ContainerStarted","Data":"148259bf113da653e908ca063b8040a0589dd49db06719aa99b73aede751f40c"}
Apr 22 15:09:48.434408 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:48.434381 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-78754fd6f8-ww5dm" podUID="c0072511-073e-4b6c-a7a6-066fa74b52f8" containerName="console" containerID="cri-o://a6bcff59b4b232087db49fa94ace7cf733d3b0576ee6dea99f8652f7a924eb2b" gracePeriod=15
Apr 22 15:09:48.451062 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:48.451010 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-74b7646c6b-c6g4t" podStartSLOduration=1.450990797 podStartE2EDuration="1.450990797s" podCreationTimestamp="2026-04-22 15:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:09:48.449546646 +0000 UTC m=+91.986718360" watchObservedRunningTime="2026-04-22 15:09:48.450990797 +0000 UTC m=+91.988162490"
Apr 22 15:09:48.669260 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:48.669237 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-78754fd6f8-ww5dm_c0072511-073e-4b6c-a7a6-066fa74b52f8/console/0.log"
Apr 22 15:09:48.669390 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:48.669312 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78754fd6f8-ww5dm"
Apr 22 15:09:48.858389 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:48.858355 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c0072511-073e-4b6c-a7a6-066fa74b52f8-service-ca\") pod \"c0072511-073e-4b6c-a7a6-066fa74b52f8\" (UID: \"c0072511-073e-4b6c-a7a6-066fa74b52f8\") "
Apr 22 15:09:48.858570 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:48.858403 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c0072511-073e-4b6c-a7a6-066fa74b52f8-oauth-serving-cert\") pod \"c0072511-073e-4b6c-a7a6-066fa74b52f8\" (UID: \"c0072511-073e-4b6c-a7a6-066fa74b52f8\") "
Apr 22 15:09:48.858570 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:48.858426 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0072511-073e-4b6c-a7a6-066fa74b52f8-trusted-ca-bundle\") pod \"c0072511-073e-4b6c-a7a6-066fa74b52f8\" (UID: \"c0072511-073e-4b6c-a7a6-066fa74b52f8\") "
Apr 22 15:09:48.858570 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:48.858473 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c0072511-073e-4b6c-a7a6-066fa74b52f8-console-config\") pod \"c0072511-073e-4b6c-a7a6-066fa74b52f8\" (UID: \"c0072511-073e-4b6c-a7a6-066fa74b52f8\") "
Apr 22 15:09:48.858570 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:48.858510 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0072511-073e-4b6c-a7a6-066fa74b52f8-console-serving-cert\") pod \"c0072511-073e-4b6c-a7a6-066fa74b52f8\" (UID: \"c0072511-073e-4b6c-a7a6-066fa74b52f8\") "
Apr 22 15:09:48.858570 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:48.858543 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbmg2\" (UniqueName: \"kubernetes.io/projected/c0072511-073e-4b6c-a7a6-066fa74b52f8-kube-api-access-nbmg2\") pod \"c0072511-073e-4b6c-a7a6-066fa74b52f8\" (UID: \"c0072511-073e-4b6c-a7a6-066fa74b52f8\") "
Apr 22 15:09:48.858570 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:48.858567 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c0072511-073e-4b6c-a7a6-066fa74b52f8-console-oauth-config\") pod \"c0072511-073e-4b6c-a7a6-066fa74b52f8\" (UID: \"c0072511-073e-4b6c-a7a6-066fa74b52f8\") "
Apr 22 15:09:48.858940 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:48.858914 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0072511-073e-4b6c-a7a6-066fa74b52f8-service-ca" (OuterVolumeSpecName: "service-ca") pod "c0072511-073e-4b6c-a7a6-066fa74b52f8" (UID: "c0072511-073e-4b6c-a7a6-066fa74b52f8"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 15:09:48.859006 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:48.858940 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0072511-073e-4b6c-a7a6-066fa74b52f8-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c0072511-073e-4b6c-a7a6-066fa74b52f8" (UID: "c0072511-073e-4b6c-a7a6-066fa74b52f8"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 15:09:48.859087 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:48.859062 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0072511-073e-4b6c-a7a6-066fa74b52f8-console-config" (OuterVolumeSpecName: "console-config") pod "c0072511-073e-4b6c-a7a6-066fa74b52f8" (UID: "c0072511-073e-4b6c-a7a6-066fa74b52f8"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 15:09:48.859140 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:48.859072 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0072511-073e-4b6c-a7a6-066fa74b52f8-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c0072511-073e-4b6c-a7a6-066fa74b52f8" (UID: "c0072511-073e-4b6c-a7a6-066fa74b52f8"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 15:09:48.860908 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:48.860880 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0072511-073e-4b6c-a7a6-066fa74b52f8-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c0072511-073e-4b6c-a7a6-066fa74b52f8" (UID: "c0072511-073e-4b6c-a7a6-066fa74b52f8"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 15:09:48.861002 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:48.860928 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0072511-073e-4b6c-a7a6-066fa74b52f8-kube-api-access-nbmg2" (OuterVolumeSpecName: "kube-api-access-nbmg2") pod "c0072511-073e-4b6c-a7a6-066fa74b52f8" (UID: "c0072511-073e-4b6c-a7a6-066fa74b52f8"). InnerVolumeSpecName "kube-api-access-nbmg2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 15:09:48.861002 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:48.860945 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0072511-073e-4b6c-a7a6-066fa74b52f8-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c0072511-073e-4b6c-a7a6-066fa74b52f8" (UID: "c0072511-073e-4b6c-a7a6-066fa74b52f8"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 15:09:48.959596 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:48.959548 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nbmg2\" (UniqueName: \"kubernetes.io/projected/c0072511-073e-4b6c-a7a6-066fa74b52f8-kube-api-access-nbmg2\") on node \"ip-10-0-141-246.ec2.internal\" DevicePath \"\""
Apr 22 15:09:48.959596 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:48.959586 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c0072511-073e-4b6c-a7a6-066fa74b52f8-console-oauth-config\") on node \"ip-10-0-141-246.ec2.internal\" DevicePath \"\""
Apr 22 15:09:48.959596 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:48.959596 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c0072511-073e-4b6c-a7a6-066fa74b52f8-service-ca\") on node \"ip-10-0-141-246.ec2.internal\" DevicePath \"\""
Apr 22 15:09:48.959596 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:48.959607 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c0072511-073e-4b6c-a7a6-066fa74b52f8-oauth-serving-cert\") on node \"ip-10-0-141-246.ec2.internal\" DevicePath \"\""
Apr 22 15:09:48.959889 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:48.959616 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0072511-073e-4b6c-a7a6-066fa74b52f8-trusted-ca-bundle\") on node \"ip-10-0-141-246.ec2.internal\" DevicePath \"\""
Apr 22 15:09:48.959889 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:48.959624 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c0072511-073e-4b6c-a7a6-066fa74b52f8-console-config\") on node \"ip-10-0-141-246.ec2.internal\" DevicePath \"\""
Apr 22 15:09:48.959889 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:48.959632 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0072511-073e-4b6c-a7a6-066fa74b52f8-console-serving-cert\") on node \"ip-10-0-141-246.ec2.internal\" DevicePath \"\""
Apr 22 15:09:49.436704 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:49.436653 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-78754fd6f8-ww5dm_c0072511-073e-4b6c-a7a6-066fa74b52f8/console/0.log"
Apr 22 15:09:49.436881 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:49.436715 2569 generic.go:358] "Generic (PLEG): container finished" podID="c0072511-073e-4b6c-a7a6-066fa74b52f8" containerID="a6bcff59b4b232087db49fa94ace7cf733d3b0576ee6dea99f8652f7a924eb2b" exitCode=2
Apr 22 15:09:49.436881 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:49.436807 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78754fd6f8-ww5dm" event={"ID":"c0072511-073e-4b6c-a7a6-066fa74b52f8","Type":"ContainerDied","Data":"a6bcff59b4b232087db49fa94ace7cf733d3b0576ee6dea99f8652f7a924eb2b"}
Apr 22 15:09:49.436881 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:49.436845 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78754fd6f8-ww5dm" event={"ID":"c0072511-073e-4b6c-a7a6-066fa74b52f8","Type":"ContainerDied","Data":"af441e339f5f4a4ad200b359165289830b7ac0a5f32842151ca558faa640c8d3"}
Apr 22 15:09:49.436881 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:49.436861 2569 scope.go:117] "RemoveContainer" containerID="a6bcff59b4b232087db49fa94ace7cf733d3b0576ee6dea99f8652f7a924eb2b"
Apr 22 15:09:49.437024 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:49.436816 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78754fd6f8-ww5dm"
Apr 22 15:09:49.444710 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:49.444675 2569 scope.go:117] "RemoveContainer" containerID="a6bcff59b4b232087db49fa94ace7cf733d3b0576ee6dea99f8652f7a924eb2b"
Apr 22 15:09:49.444982 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:09:49.444965 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6bcff59b4b232087db49fa94ace7cf733d3b0576ee6dea99f8652f7a924eb2b\": container with ID starting with a6bcff59b4b232087db49fa94ace7cf733d3b0576ee6dea99f8652f7a924eb2b not found: ID does not exist" containerID="a6bcff59b4b232087db49fa94ace7cf733d3b0576ee6dea99f8652f7a924eb2b"
Apr 22 15:09:49.445026 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:49.444989 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6bcff59b4b232087db49fa94ace7cf733d3b0576ee6dea99f8652f7a924eb2b"} err="failed to get container status \"a6bcff59b4b232087db49fa94ace7cf733d3b0576ee6dea99f8652f7a924eb2b\": rpc error: code = NotFound desc = could not find container 
\"a6bcff59b4b232087db49fa94ace7cf733d3b0576ee6dea99f8652f7a924eb2b\": container with ID starting with a6bcff59b4b232087db49fa94ace7cf733d3b0576ee6dea99f8652f7a924eb2b not found: ID does not exist" Apr 22 15:09:49.453650 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:49.453623 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-78754fd6f8-ww5dm"] Apr 22 15:09:49.457370 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:49.457348 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-78754fd6f8-ww5dm"] Apr 22 15:09:51.058190 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:51.058153 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0072511-073e-4b6c-a7a6-066fa74b52f8" path="/var/lib/kubelet/pods/c0072511-073e-4b6c-a7a6-066fa74b52f8/volumes" Apr 22 15:09:57.539098 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:57.539056 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-74b7646c6b-c6g4t" Apr 22 15:09:57.539564 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:57.539205 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-74b7646c6b-c6g4t" Apr 22 15:09:57.543795 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:57.543774 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-74b7646c6b-c6g4t" Apr 22 15:09:58.381635 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:58.381603 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-wfpb4" Apr 22 15:09:58.470454 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:58.470425 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-74b7646c6b-c6g4t" Apr 22 15:09:58.518134 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:09:58.518100 2569 kubelet.go:2553] "SyncLoop 
DELETE" source="api" pods=["openshift-console/console-79794f7c48-n6p5w"] Apr 22 15:10:09.432370 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:09.432327 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-klhdf"] Apr 22 15:10:09.432872 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:09.432764 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0072511-073e-4b6c-a7a6-066fa74b52f8" containerName="console" Apr 22 15:10:09.432872 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:09.432784 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0072511-073e-4b6c-a7a6-066fa74b52f8" containerName="console" Apr 22 15:10:09.432872 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:09.432870 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="c0072511-073e-4b6c-a7a6-066fa74b52f8" containerName="console" Apr 22 15:10:09.437940 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:09.437909 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-klhdf" Apr 22 15:10:09.440954 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:09.440928 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 15:10:09.444033 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:09.444008 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-klhdf"] Apr 22 15:10:09.516854 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:09.516633 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f1ab4beb-2b3d-4198-9103-0dcc4963bbc6-dbus\") pod \"global-pull-secret-syncer-klhdf\" (UID: \"f1ab4beb-2b3d-4198-9103-0dcc4963bbc6\") " pod="kube-system/global-pull-secret-syncer-klhdf" Apr 22 15:10:09.516854 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:09.516738 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f1ab4beb-2b3d-4198-9103-0dcc4963bbc6-kubelet-config\") pod \"global-pull-secret-syncer-klhdf\" (UID: \"f1ab4beb-2b3d-4198-9103-0dcc4963bbc6\") " pod="kube-system/global-pull-secret-syncer-klhdf" Apr 22 15:10:09.516854 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:09.516776 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f1ab4beb-2b3d-4198-9103-0dcc4963bbc6-original-pull-secret\") pod \"global-pull-secret-syncer-klhdf\" (UID: \"f1ab4beb-2b3d-4198-9103-0dcc4963bbc6\") " pod="kube-system/global-pull-secret-syncer-klhdf" Apr 22 15:10:09.619162 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:09.618214 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/f1ab4beb-2b3d-4198-9103-0dcc4963bbc6-kubelet-config\") pod \"global-pull-secret-syncer-klhdf\" (UID: \"f1ab4beb-2b3d-4198-9103-0dcc4963bbc6\") " pod="kube-system/global-pull-secret-syncer-klhdf" Apr 22 15:10:09.619162 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:09.618280 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f1ab4beb-2b3d-4198-9103-0dcc4963bbc6-original-pull-secret\") pod \"global-pull-secret-syncer-klhdf\" (UID: \"f1ab4beb-2b3d-4198-9103-0dcc4963bbc6\") " pod="kube-system/global-pull-secret-syncer-klhdf" Apr 22 15:10:09.619162 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:09.618345 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f1ab4beb-2b3d-4198-9103-0dcc4963bbc6-dbus\") pod \"global-pull-secret-syncer-klhdf\" (UID: \"f1ab4beb-2b3d-4198-9103-0dcc4963bbc6\") " pod="kube-system/global-pull-secret-syncer-klhdf" Apr 22 15:10:09.619162 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:09.618580 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f1ab4beb-2b3d-4198-9103-0dcc4963bbc6-dbus\") pod \"global-pull-secret-syncer-klhdf\" (UID: \"f1ab4beb-2b3d-4198-9103-0dcc4963bbc6\") " pod="kube-system/global-pull-secret-syncer-klhdf" Apr 22 15:10:09.619162 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:09.618655 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f1ab4beb-2b3d-4198-9103-0dcc4963bbc6-kubelet-config\") pod \"global-pull-secret-syncer-klhdf\" (UID: \"f1ab4beb-2b3d-4198-9103-0dcc4963bbc6\") " pod="kube-system/global-pull-secret-syncer-klhdf" Apr 22 15:10:09.622332 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:09.622273 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f1ab4beb-2b3d-4198-9103-0dcc4963bbc6-original-pull-secret\") pod \"global-pull-secret-syncer-klhdf\" (UID: \"f1ab4beb-2b3d-4198-9103-0dcc4963bbc6\") " pod="kube-system/global-pull-secret-syncer-klhdf" Apr 22 15:10:09.748662 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:09.748568 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-klhdf" Apr 22 15:10:09.888933 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:09.888907 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-klhdf"] Apr 22 15:10:09.891740 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:10:09.891704 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1ab4beb_2b3d_4198_9103_0dcc4963bbc6.slice/crio-50d04811951a78057708d7b70917def782f8c1d80a0cb829510980ac09feabb9 WatchSource:0}: Error finding container 50d04811951a78057708d7b70917def782f8c1d80a0cb829510980ac09feabb9: Status 404 returned error can't find the container with id 50d04811951a78057708d7b70917def782f8c1d80a0cb829510980ac09feabb9 Apr 22 15:10:10.512788 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:10.512748 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-klhdf" event={"ID":"f1ab4beb-2b3d-4198-9103-0dcc4963bbc6","Type":"ContainerStarted","Data":"50d04811951a78057708d7b70917def782f8c1d80a0cb829510980ac09feabb9"} Apr 22 15:10:14.527507 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:14.527470 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-klhdf" event={"ID":"f1ab4beb-2b3d-4198-9103-0dcc4963bbc6","Type":"ContainerStarted","Data":"3eac8cf69016e753f1e431cb5db7bc8d321f906de3a1291092a459a2cf8a4739"} Apr 22 15:10:14.543318 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:14.543271 2569 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-klhdf" podStartSLOduration=1.610445149 podStartE2EDuration="5.543255525s" podCreationTimestamp="2026-04-22 15:10:09 +0000 UTC" firstStartedPulling="2026-04-22 15:10:09.894355914 +0000 UTC m=+113.431527614" lastFinishedPulling="2026-04-22 15:10:13.827166295 +0000 UTC m=+117.364337990" observedRunningTime="2026-04-22 15:10:14.541629759 +0000 UTC m=+118.078801475" watchObservedRunningTime="2026-04-22 15:10:14.543255525 +0000 UTC m=+118.080427239" Apr 22 15:10:23.539356 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:23.539295 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-79794f7c48-n6p5w" podUID="7908ea56-50d3-4383-8313-2d36f77a6fc4" containerName="console" containerID="cri-o://220764d3982f9af4726a477328d26e55a27615366a2be924d2529dfd3b74f16f" gracePeriod=15 Apr 22 15:10:23.774010 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:23.773978 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-79794f7c48-n6p5w_7908ea56-50d3-4383-8313-2d36f77a6fc4/console/0.log" Apr 22 15:10:23.774131 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:23.774041 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-79794f7c48-n6p5w" Apr 22 15:10:23.846634 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:23.846609 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7908ea56-50d3-4383-8313-2d36f77a6fc4-console-oauth-config\") pod \"7908ea56-50d3-4383-8313-2d36f77a6fc4\" (UID: \"7908ea56-50d3-4383-8313-2d36f77a6fc4\") " Apr 22 15:10:23.846798 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:23.846648 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7908ea56-50d3-4383-8313-2d36f77a6fc4-trusted-ca-bundle\") pod \"7908ea56-50d3-4383-8313-2d36f77a6fc4\" (UID: \"7908ea56-50d3-4383-8313-2d36f77a6fc4\") " Apr 22 15:10:23.846798 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:23.846668 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qd9x\" (UniqueName: \"kubernetes.io/projected/7908ea56-50d3-4383-8313-2d36f77a6fc4-kube-api-access-5qd9x\") pod \"7908ea56-50d3-4383-8313-2d36f77a6fc4\" (UID: \"7908ea56-50d3-4383-8313-2d36f77a6fc4\") " Apr 22 15:10:23.846798 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:23.846706 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7908ea56-50d3-4383-8313-2d36f77a6fc4-oauth-serving-cert\") pod \"7908ea56-50d3-4383-8313-2d36f77a6fc4\" (UID: \"7908ea56-50d3-4383-8313-2d36f77a6fc4\") " Apr 22 15:10:23.846798 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:23.846750 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7908ea56-50d3-4383-8313-2d36f77a6fc4-console-config\") pod \"7908ea56-50d3-4383-8313-2d36f77a6fc4\" (UID: \"7908ea56-50d3-4383-8313-2d36f77a6fc4\") " Apr 22 15:10:23.846798 
ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:23.846784 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7908ea56-50d3-4383-8313-2d36f77a6fc4-console-serving-cert\") pod \"7908ea56-50d3-4383-8313-2d36f77a6fc4\" (UID: \"7908ea56-50d3-4383-8313-2d36f77a6fc4\") " Apr 22 15:10:23.847035 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:23.846831 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7908ea56-50d3-4383-8313-2d36f77a6fc4-service-ca\") pod \"7908ea56-50d3-4383-8313-2d36f77a6fc4\" (UID: \"7908ea56-50d3-4383-8313-2d36f77a6fc4\") " Apr 22 15:10:23.847174 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:23.847137 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7908ea56-50d3-4383-8313-2d36f77a6fc4-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "7908ea56-50d3-4383-8313-2d36f77a6fc4" (UID: "7908ea56-50d3-4383-8313-2d36f77a6fc4"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:10:23.847295 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:23.847262 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7908ea56-50d3-4383-8313-2d36f77a6fc4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7908ea56-50d3-4383-8313-2d36f77a6fc4" (UID: "7908ea56-50d3-4383-8313-2d36f77a6fc4"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:10:23.847295 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:23.847276 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7908ea56-50d3-4383-8313-2d36f77a6fc4-console-config" (OuterVolumeSpecName: "console-config") pod "7908ea56-50d3-4383-8313-2d36f77a6fc4" (UID: "7908ea56-50d3-4383-8313-2d36f77a6fc4"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:10:23.847410 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:23.847328 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7908ea56-50d3-4383-8313-2d36f77a6fc4-service-ca" (OuterVolumeSpecName: "service-ca") pod "7908ea56-50d3-4383-8313-2d36f77a6fc4" (UID: "7908ea56-50d3-4383-8313-2d36f77a6fc4"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:10:23.849039 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:23.849001 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7908ea56-50d3-4383-8313-2d36f77a6fc4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7908ea56-50d3-4383-8313-2d36f77a6fc4" (UID: "7908ea56-50d3-4383-8313-2d36f77a6fc4"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:10:23.849130 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:23.849035 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7908ea56-50d3-4383-8313-2d36f77a6fc4-kube-api-access-5qd9x" (OuterVolumeSpecName: "kube-api-access-5qd9x") pod "7908ea56-50d3-4383-8313-2d36f77a6fc4" (UID: "7908ea56-50d3-4383-8313-2d36f77a6fc4"). InnerVolumeSpecName "kube-api-access-5qd9x". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:10:23.849130 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:23.849068 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7908ea56-50d3-4383-8313-2d36f77a6fc4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7908ea56-50d3-4383-8313-2d36f77a6fc4" (UID: "7908ea56-50d3-4383-8313-2d36f77a6fc4"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:10:23.947351 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:23.947316 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7908ea56-50d3-4383-8313-2d36f77a6fc4-console-oauth-config\") on node \"ip-10-0-141-246.ec2.internal\" DevicePath \"\"" Apr 22 15:10:23.947351 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:23.947344 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7908ea56-50d3-4383-8313-2d36f77a6fc4-trusted-ca-bundle\") on node \"ip-10-0-141-246.ec2.internal\" DevicePath \"\"" Apr 22 15:10:23.947351 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:23.947354 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5qd9x\" (UniqueName: \"kubernetes.io/projected/7908ea56-50d3-4383-8313-2d36f77a6fc4-kube-api-access-5qd9x\") on node \"ip-10-0-141-246.ec2.internal\" DevicePath \"\"" Apr 22 15:10:23.947351 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:23.947363 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7908ea56-50d3-4383-8313-2d36f77a6fc4-oauth-serving-cert\") on node \"ip-10-0-141-246.ec2.internal\" DevicePath \"\"" Apr 22 15:10:23.947601 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:23.947372 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/7908ea56-50d3-4383-8313-2d36f77a6fc4-console-config\") on node \"ip-10-0-141-246.ec2.internal\" DevicePath \"\"" Apr 22 15:10:23.947601 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:23.947381 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7908ea56-50d3-4383-8313-2d36f77a6fc4-console-serving-cert\") on node \"ip-10-0-141-246.ec2.internal\" DevicePath \"\"" Apr 22 15:10:23.947601 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:23.947391 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7908ea56-50d3-4383-8313-2d36f77a6fc4-service-ca\") on node \"ip-10-0-141-246.ec2.internal\" DevicePath \"\"" Apr 22 15:10:24.559990 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:24.559960 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-79794f7c48-n6p5w_7908ea56-50d3-4383-8313-2d36f77a6fc4/console/0.log" Apr 22 15:10:24.560403 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:24.560004 2569 generic.go:358] "Generic (PLEG): container finished" podID="7908ea56-50d3-4383-8313-2d36f77a6fc4" containerID="220764d3982f9af4726a477328d26e55a27615366a2be924d2529dfd3b74f16f" exitCode=2 Apr 22 15:10:24.560403 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:24.560039 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79794f7c48-n6p5w" event={"ID":"7908ea56-50d3-4383-8313-2d36f77a6fc4","Type":"ContainerDied","Data":"220764d3982f9af4726a477328d26e55a27615366a2be924d2529dfd3b74f16f"} Apr 22 15:10:24.560403 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:24.560087 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79794f7c48-n6p5w" event={"ID":"7908ea56-50d3-4383-8313-2d36f77a6fc4","Type":"ContainerDied","Data":"071347c411882fac44e118cec69993df22bc40e8751c5362dbd79ae29228c908"} Apr 22 15:10:24.560403 ip-10-0-141-246 
kubenswrapper[2569]: I0422 15:10:24.560107 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-79794f7c48-n6p5w" Apr 22 15:10:24.560403 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:24.560109 2569 scope.go:117] "RemoveContainer" containerID="220764d3982f9af4726a477328d26e55a27615366a2be924d2529dfd3b74f16f" Apr 22 15:10:24.568133 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:24.568116 2569 scope.go:117] "RemoveContainer" containerID="220764d3982f9af4726a477328d26e55a27615366a2be924d2529dfd3b74f16f" Apr 22 15:10:24.568384 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:10:24.568366 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"220764d3982f9af4726a477328d26e55a27615366a2be924d2529dfd3b74f16f\": container with ID starting with 220764d3982f9af4726a477328d26e55a27615366a2be924d2529dfd3b74f16f not found: ID does not exist" containerID="220764d3982f9af4726a477328d26e55a27615366a2be924d2529dfd3b74f16f" Apr 22 15:10:24.568431 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:24.568393 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"220764d3982f9af4726a477328d26e55a27615366a2be924d2529dfd3b74f16f"} err="failed to get container status \"220764d3982f9af4726a477328d26e55a27615366a2be924d2529dfd3b74f16f\": rpc error: code = NotFound desc = could not find container \"220764d3982f9af4726a477328d26e55a27615366a2be924d2529dfd3b74f16f\": container with ID starting with 220764d3982f9af4726a477328d26e55a27615366a2be924d2529dfd3b74f16f not found: ID does not exist" Apr 22 15:10:24.579591 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:24.579570 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-79794f7c48-n6p5w"] Apr 22 15:10:24.583588 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:24.583569 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-79794f7c48-n6p5w"] Apr 22 15:10:25.056919 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:25.056878 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7908ea56-50d3-4383-8313-2d36f77a6fc4" path="/var/lib/kubelet/pods/7908ea56-50d3-4383-8313-2d36f77a6fc4/volumes" Apr 22 15:10:48.336901 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:48.336864 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c5565795-m7n4b"] Apr 22 15:10:48.337481 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:48.337249 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7908ea56-50d3-4383-8313-2d36f77a6fc4" containerName="console" Apr 22 15:10:48.337481 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:48.337268 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="7908ea56-50d3-4383-8313-2d36f77a6fc4" containerName="console" Apr 22 15:10:48.337481 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:48.337345 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="7908ea56-50d3-4383-8313-2d36f77a6fc4" containerName="console" Apr 22 15:10:48.339427 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:48.339406 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c5565795-m7n4b" Apr 22 15:10:48.342125 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:48.342102 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 22 15:10:48.343705 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:48.343663 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 22 15:10:48.343705 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:48.343667 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 22 15:10:48.343872 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:48.343742 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 22 15:10:48.343872 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:48.343743 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 22 15:10:48.343872 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:48.343862 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 22 15:10:48.344012 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:48.343990 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 22 15:10:48.351905 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:48.351883 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c5565795-m7n4b"] Apr 22 15:10:48.441038 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:48.440998 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssbqq\" (UniqueName: \"kubernetes.io/projected/2829f329-ef3b-4d51-9eb5-4b4769d37de4-kube-api-access-ssbqq\") pod \"cluster-proxy-proxy-agent-86c5565795-m7n4b\" (UID: \"2829f329-ef3b-4d51-9eb5-4b4769d37de4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c5565795-m7n4b" Apr 22 15:10:48.441218 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:48.441072 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/2829f329-ef3b-4d51-9eb5-4b4769d37de4-ca\") pod \"cluster-proxy-proxy-agent-86c5565795-m7n4b\" (UID: \"2829f329-ef3b-4d51-9eb5-4b4769d37de4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c5565795-m7n4b" Apr 22 15:10:48.441218 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:48.441108 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/2829f329-ef3b-4d51-9eb5-4b4769d37de4-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-86c5565795-m7n4b\" (UID: \"2829f329-ef3b-4d51-9eb5-4b4769d37de4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c5565795-m7n4b" Apr 22 15:10:48.441218 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:48.441176 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/2829f329-ef3b-4d51-9eb5-4b4769d37de4-hub\") pod \"cluster-proxy-proxy-agent-86c5565795-m7n4b\" (UID: \"2829f329-ef3b-4d51-9eb5-4b4769d37de4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c5565795-m7n4b" Apr 22 
15:10:48.441218 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:48.441199 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/2829f329-ef3b-4d51-9eb5-4b4769d37de4-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-86c5565795-m7n4b\" (UID: \"2829f329-ef3b-4d51-9eb5-4b4769d37de4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c5565795-m7n4b"
Apr 22 15:10:48.441371 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:48.441223 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2829f329-ef3b-4d51-9eb5-4b4769d37de4-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-86c5565795-m7n4b\" (UID: \"2829f329-ef3b-4d51-9eb5-4b4769d37de4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c5565795-m7n4b"
Apr 22 15:10:48.542127 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:48.542089 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/2829f329-ef3b-4d51-9eb5-4b4769d37de4-hub\") pod \"cluster-proxy-proxy-agent-86c5565795-m7n4b\" (UID: \"2829f329-ef3b-4d51-9eb5-4b4769d37de4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c5565795-m7n4b"
Apr 22 15:10:48.542127 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:48.542131 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/2829f329-ef3b-4d51-9eb5-4b4769d37de4-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-86c5565795-m7n4b\" (UID: \"2829f329-ef3b-4d51-9eb5-4b4769d37de4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c5565795-m7n4b"
Apr 22 15:10:48.542374 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:48.542163 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2829f329-ef3b-4d51-9eb5-4b4769d37de4-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-86c5565795-m7n4b\" (UID: \"2829f329-ef3b-4d51-9eb5-4b4769d37de4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c5565795-m7n4b"
Apr 22 15:10:48.542374 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:48.542191 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ssbqq\" (UniqueName: \"kubernetes.io/projected/2829f329-ef3b-4d51-9eb5-4b4769d37de4-kube-api-access-ssbqq\") pod \"cluster-proxy-proxy-agent-86c5565795-m7n4b\" (UID: \"2829f329-ef3b-4d51-9eb5-4b4769d37de4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c5565795-m7n4b"
Apr 22 15:10:48.542374 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:48.542264 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/2829f329-ef3b-4d51-9eb5-4b4769d37de4-ca\") pod \"cluster-proxy-proxy-agent-86c5565795-m7n4b\" (UID: \"2829f329-ef3b-4d51-9eb5-4b4769d37de4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c5565795-m7n4b"
Apr 22 15:10:48.542374 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:48.542292 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/2829f329-ef3b-4d51-9eb5-4b4769d37de4-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-86c5565795-m7n4b\" (UID: \"2829f329-ef3b-4d51-9eb5-4b4769d37de4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c5565795-m7n4b"
Apr 22 15:10:48.543043 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:48.543015 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/2829f329-ef3b-4d51-9eb5-4b4769d37de4-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-86c5565795-m7n4b\" (UID: \"2829f329-ef3b-4d51-9eb5-4b4769d37de4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c5565795-m7n4b"
Apr 22 15:10:48.544730 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:48.544698 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/2829f329-ef3b-4d51-9eb5-4b4769d37de4-ca\") pod \"cluster-proxy-proxy-agent-86c5565795-m7n4b\" (UID: \"2829f329-ef3b-4d51-9eb5-4b4769d37de4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c5565795-m7n4b"
Apr 22 15:10:48.544895 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:48.544871 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/2829f329-ef3b-4d51-9eb5-4b4769d37de4-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-86c5565795-m7n4b\" (UID: \"2829f329-ef3b-4d51-9eb5-4b4769d37de4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c5565795-m7n4b"
Apr 22 15:10:48.544957 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:48.544938 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/2829f329-ef3b-4d51-9eb5-4b4769d37de4-hub\") pod \"cluster-proxy-proxy-agent-86c5565795-m7n4b\" (UID: \"2829f329-ef3b-4d51-9eb5-4b4769d37de4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c5565795-m7n4b"
Apr 22 15:10:48.545013 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:48.544955 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2829f329-ef3b-4d51-9eb5-4b4769d37de4-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-86c5565795-m7n4b\" (UID: \"2829f329-ef3b-4d51-9eb5-4b4769d37de4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c5565795-m7n4b"
Apr 22 15:10:48.550798 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:48.550775 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssbqq\" (UniqueName: \"kubernetes.io/projected/2829f329-ef3b-4d51-9eb5-4b4769d37de4-kube-api-access-ssbqq\") pod \"cluster-proxy-proxy-agent-86c5565795-m7n4b\" (UID: \"2829f329-ef3b-4d51-9eb5-4b4769d37de4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c5565795-m7n4b"
Apr 22 15:10:48.660768 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:48.660664 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c5565795-m7n4b"
Apr 22 15:10:48.780121 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:48.780093 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c5565795-m7n4b"]
Apr 22 15:10:48.782444 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:10:48.782416 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2829f329_ef3b_4d51_9eb5_4b4769d37de4.slice/crio-d940a7e6c4d93f2fe1261172530b0146804c3b41c909fe98b40f9953e3f0b0f6 WatchSource:0}: Error finding container d940a7e6c4d93f2fe1261172530b0146804c3b41c909fe98b40f9953e3f0b0f6: Status 404 returned error can't find the container with id d940a7e6c4d93f2fe1261172530b0146804c3b41c909fe98b40f9953e3f0b0f6
Apr 22 15:10:49.628442 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:49.628405 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c5565795-m7n4b" event={"ID":"2829f329-ef3b-4d51-9eb5-4b4769d37de4","Type":"ContainerStarted","Data":"d940a7e6c4d93f2fe1261172530b0146804c3b41c909fe98b40f9953e3f0b0f6"}
Apr 22 15:10:52.641340 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:52.641299 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c5565795-m7n4b" event={"ID":"2829f329-ef3b-4d51-9eb5-4b4769d37de4","Type":"ContainerStarted","Data":"392ba4d19d751d7a16ddd54394dfe590fd1eccbf006c38178aec6a6b9de9e35f"}
Apr 22 15:10:55.651876 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:55.651841 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c5565795-m7n4b" event={"ID":"2829f329-ef3b-4d51-9eb5-4b4769d37de4","Type":"ContainerStarted","Data":"dc779caccc19de320b75d867e04f154e6b22d1e0f16673f6bcfe3ba18c7db6a5"}
Apr 22 15:10:55.651876 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:10:55.651880 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c5565795-m7n4b" event={"ID":"2829f329-ef3b-4d51-9eb5-4b4769d37de4","Type":"ContainerStarted","Data":"a224dcc89f62459b804c9dd2f8f9be457d3312525d5917b20933a085ee8f4359"}
Apr 22 15:11:33.414670 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:11:33.414582 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86c5565795-m7n4b" podStartSLOduration=39.282541707 podStartE2EDuration="45.414564219s" podCreationTimestamp="2026-04-22 15:10:48 +0000 UTC" firstStartedPulling="2026-04-22 15:10:48.784118399 +0000 UTC m=+152.321290091" lastFinishedPulling="2026-04-22 15:10:54.916140908 +0000 UTC m=+158.453312603" observedRunningTime="2026-04-22 15:10:55.68477588 +0000 UTC m=+159.221947594" watchObservedRunningTime="2026-04-22 15:11:33.414564219 +0000 UTC m=+196.951735932"
Apr 22 15:11:33.415381 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:11:33.415360 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-dsj9t"]
Apr 22 15:11:33.417440 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:11:33.417423 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-dsj9t"
Apr 22 15:11:33.429137 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:11:33.429110 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 22 15:11:33.429306 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:11:33.429289 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 22 15:11:33.430522 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:11:33.430502 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-p28p7\""
Apr 22 15:11:33.439153 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:11:33.439129 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-dsj9t"]
Apr 22 15:11:33.474743 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:11:33.474715 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1f4c81e9-c223-4bcb-982c-17aa2f355c4e-tmp\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-dsj9t\" (UID: \"1f4c81e9-c223-4bcb-982c-17aa2f355c4e\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-dsj9t"
Apr 22 15:11:33.474870 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:11:33.474769 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5cs9\" (UniqueName: \"kubernetes.io/projected/1f4c81e9-c223-4bcb-982c-17aa2f355c4e-kube-api-access-r5cs9\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-dsj9t\" (UID: \"1f4c81e9-c223-4bcb-982c-17aa2f355c4e\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-dsj9t"
Apr 22 15:11:33.575761 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:11:33.575722 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1f4c81e9-c223-4bcb-982c-17aa2f355c4e-tmp\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-dsj9t\" (UID: \"1f4c81e9-c223-4bcb-982c-17aa2f355c4e\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-dsj9t"
Apr 22 15:11:33.575942 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:11:33.575774 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5cs9\" (UniqueName: \"kubernetes.io/projected/1f4c81e9-c223-4bcb-982c-17aa2f355c4e-kube-api-access-r5cs9\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-dsj9t\" (UID: \"1f4c81e9-c223-4bcb-982c-17aa2f355c4e\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-dsj9t"
Apr 22 15:11:33.576145 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:11:33.576123 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1f4c81e9-c223-4bcb-982c-17aa2f355c4e-tmp\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-dsj9t\" (UID: \"1f4c81e9-c223-4bcb-982c-17aa2f355c4e\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-dsj9t"
Apr 22 15:11:33.585162 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:11:33.585137 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5cs9\" (UniqueName: \"kubernetes.io/projected/1f4c81e9-c223-4bcb-982c-17aa2f355c4e-kube-api-access-r5cs9\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-dsj9t\" (UID: \"1f4c81e9-c223-4bcb-982c-17aa2f355c4e\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-dsj9t"
Apr 22 15:11:33.726721 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:11:33.726606 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-dsj9t"
Apr 22 15:11:33.852257 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:11:33.852227 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-dsj9t"]
Apr 22 15:11:33.855556 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:11:33.855522 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f4c81e9_c223_4bcb_982c_17aa2f355c4e.slice/crio-43b5a7d343718a6704722dbc508001cf869e2222caae3384c76c3aa801f867e8 WatchSource:0}: Error finding container 43b5a7d343718a6704722dbc508001cf869e2222caae3384c76c3aa801f867e8: Status 404 returned error can't find the container with id 43b5a7d343718a6704722dbc508001cf869e2222caae3384c76c3aa801f867e8
Apr 22 15:11:34.755086 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:11:34.755045 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-dsj9t" event={"ID":"1f4c81e9-c223-4bcb-982c-17aa2f355c4e","Type":"ContainerStarted","Data":"43b5a7d343718a6704722dbc508001cf869e2222caae3384c76c3aa801f867e8"}
Apr 22 15:11:36.764589 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:11:36.764559 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-dsj9t" event={"ID":"1f4c81e9-c223-4bcb-982c-17aa2f355c4e","Type":"ContainerStarted","Data":"5b99685e8a5b3e9c170bfb9866adf36fc16095e04e7fad1fe059be98ef4f1d75"}
Apr 22 15:11:36.785845 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:11:36.785676 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-dsj9t" podStartSLOduration=1.101348427 podStartE2EDuration="3.785661705s" podCreationTimestamp="2026-04-22 15:11:33 +0000 UTC" firstStartedPulling="2026-04-22 15:11:33.85798183 +0000 UTC m=+197.395153523" lastFinishedPulling="2026-04-22 15:11:36.542295097 +0000 UTC m=+200.079466801" observedRunningTime="2026-04-22 15:11:36.784347843 +0000 UTC m=+200.321519569" watchObservedRunningTime="2026-04-22 15:11:36.785661705 +0000 UTC m=+200.322833419"
Apr 22 15:12:19.776535 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:12:19.776502 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-8468c855f5-bdqq9"]
Apr 22 15:12:19.780606 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:12:19.780589 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-8468c855f5-bdqq9"
Apr 22 15:12:19.783618 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:12:19.783595 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"webhook-server-cert\""
Apr 22 15:12:19.784742 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:12:19.784723 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"jobset-controller-manager-dockercfg-h4frl\""
Apr 22 15:12:19.784802 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:12:19.784735 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"metrics-server-cert\""
Apr 22 15:12:19.784802 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:12:19.784751 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"openshift-service-ca.crt\""
Apr 22 15:12:19.784886 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:12:19.784731 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"kube-root-ca.crt\""
Apr 22 15:12:19.785165 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:12:19.785149 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"jobset-manager-config\""
Apr 22 15:12:19.790664 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:12:19.790643 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-8468c855f5-bdqq9"]
Apr 22 15:12:19.833896 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:12:19.833863 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/689a481c-b5a3-4754-b88f-b6217bf84093-metrics-certs\") pod \"jobset-controller-manager-8468c855f5-bdqq9\" (UID: \"689a481c-b5a3-4754-b88f-b6217bf84093\") " pod="openshift-jobset-operator/jobset-controller-manager-8468c855f5-bdqq9"
Apr 22 15:12:19.833896 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:12:19.833902 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/689a481c-b5a3-4754-b88f-b6217bf84093-cert\") pod \"jobset-controller-manager-8468c855f5-bdqq9\" (UID: \"689a481c-b5a3-4754-b88f-b6217bf84093\") " pod="openshift-jobset-operator/jobset-controller-manager-8468c855f5-bdqq9"
Apr 22 15:12:19.834118 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:12:19.833960 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/689a481c-b5a3-4754-b88f-b6217bf84093-manager-config\") pod \"jobset-controller-manager-8468c855f5-bdqq9\" (UID: \"689a481c-b5a3-4754-b88f-b6217bf84093\") " pod="openshift-jobset-operator/jobset-controller-manager-8468c855f5-bdqq9"
Apr 22 15:12:19.834118 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:12:19.833981 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz8q9\" (UniqueName: \"kubernetes.io/projected/689a481c-b5a3-4754-b88f-b6217bf84093-kube-api-access-xz8q9\") pod \"jobset-controller-manager-8468c855f5-bdqq9\" (UID: \"689a481c-b5a3-4754-b88f-b6217bf84093\") " pod="openshift-jobset-operator/jobset-controller-manager-8468c855f5-bdqq9"
Apr 22 15:12:19.935149 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:12:19.935112 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/689a481c-b5a3-4754-b88f-b6217bf84093-cert\") pod \"jobset-controller-manager-8468c855f5-bdqq9\" (UID: \"689a481c-b5a3-4754-b88f-b6217bf84093\") " pod="openshift-jobset-operator/jobset-controller-manager-8468c855f5-bdqq9"
Apr 22 15:12:19.935323 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:12:19.935186 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/689a481c-b5a3-4754-b88f-b6217bf84093-manager-config\") pod \"jobset-controller-manager-8468c855f5-bdqq9\" (UID: \"689a481c-b5a3-4754-b88f-b6217bf84093\") " pod="openshift-jobset-operator/jobset-controller-manager-8468c855f5-bdqq9"
Apr 22 15:12:19.935323 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:12:19.935219 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xz8q9\" (UniqueName: \"kubernetes.io/projected/689a481c-b5a3-4754-b88f-b6217bf84093-kube-api-access-xz8q9\") pod \"jobset-controller-manager-8468c855f5-bdqq9\" (UID: \"689a481c-b5a3-4754-b88f-b6217bf84093\") " pod="openshift-jobset-operator/jobset-controller-manager-8468c855f5-bdqq9"
Apr 22 15:12:19.935323 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:12:19.935271 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/689a481c-b5a3-4754-b88f-b6217bf84093-metrics-certs\") pod \"jobset-controller-manager-8468c855f5-bdqq9\" (UID: \"689a481c-b5a3-4754-b88f-b6217bf84093\") " pod="openshift-jobset-operator/jobset-controller-manager-8468c855f5-bdqq9"
Apr 22 15:12:19.935956 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:12:19.935930 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/689a481c-b5a3-4754-b88f-b6217bf84093-manager-config\") pod \"jobset-controller-manager-8468c855f5-bdqq9\" (UID: \"689a481c-b5a3-4754-b88f-b6217bf84093\") " pod="openshift-jobset-operator/jobset-controller-manager-8468c855f5-bdqq9"
Apr 22 15:12:19.937618 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:12:19.937599 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/689a481c-b5a3-4754-b88f-b6217bf84093-metrics-certs\") pod \"jobset-controller-manager-8468c855f5-bdqq9\" (UID: \"689a481c-b5a3-4754-b88f-b6217bf84093\") " pod="openshift-jobset-operator/jobset-controller-manager-8468c855f5-bdqq9"
Apr 22 15:12:19.937692 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:12:19.937659 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/689a481c-b5a3-4754-b88f-b6217bf84093-cert\") pod \"jobset-controller-manager-8468c855f5-bdqq9\" (UID: \"689a481c-b5a3-4754-b88f-b6217bf84093\") " pod="openshift-jobset-operator/jobset-controller-manager-8468c855f5-bdqq9"
Apr 22 15:12:19.943361 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:12:19.943341 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz8q9\" (UniqueName: \"kubernetes.io/projected/689a481c-b5a3-4754-b88f-b6217bf84093-kube-api-access-xz8q9\") pod \"jobset-controller-manager-8468c855f5-bdqq9\" (UID: \"689a481c-b5a3-4754-b88f-b6217bf84093\") " pod="openshift-jobset-operator/jobset-controller-manager-8468c855f5-bdqq9"
Apr 22 15:12:20.089924 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:12:20.089898 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-8468c855f5-bdqq9"
Apr 22 15:12:20.204092 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:12:20.204061 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-8468c855f5-bdqq9"]
Apr 22 15:12:20.206673 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:12:20.206644 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod689a481c_b5a3_4754_b88f_b6217bf84093.slice/crio-9a43681edcb1a1aae93d3ed0c673fb3f00aaab8afe69ddfa134fd92cf85c54d5 WatchSource:0}: Error finding container 9a43681edcb1a1aae93d3ed0c673fb3f00aaab8afe69ddfa134fd92cf85c54d5: Status 404 returned error can't find the container with id 9a43681edcb1a1aae93d3ed0c673fb3f00aaab8afe69ddfa134fd92cf85c54d5
Apr 22 15:12:20.895371 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:12:20.895334 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-8468c855f5-bdqq9" event={"ID":"689a481c-b5a3-4754-b88f-b6217bf84093","Type":"ContainerStarted","Data":"9a43681edcb1a1aae93d3ed0c673fb3f00aaab8afe69ddfa134fd92cf85c54d5"}
Apr 22 15:12:22.902310 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:12:22.902275 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-8468c855f5-bdqq9" event={"ID":"689a481c-b5a3-4754-b88f-b6217bf84093","Type":"ContainerStarted","Data":"cabadd1b9259b6f2a1e551ec7e3f8d43094bc05b701ad115ae1138bf95ca16d1"}
Apr 22 15:12:22.902722 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:12:22.902420 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-jobset-operator/jobset-controller-manager-8468c855f5-bdqq9"
Apr 22 15:12:22.923303 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:12:22.923253 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-jobset-operator/jobset-controller-manager-8468c855f5-bdqq9" podStartSLOduration=2.205639397 podStartE2EDuration="3.923238894s" podCreationTimestamp="2026-04-22 15:12:19 +0000 UTC" firstStartedPulling="2026-04-22 15:12:20.20850338 +0000 UTC m=+243.745675072" lastFinishedPulling="2026-04-22 15:12:21.926102878 +0000 UTC m=+245.463274569" observedRunningTime="2026-04-22 15:12:22.922391448 +0000 UTC m=+246.459563162" watchObservedRunningTime="2026-04-22 15:12:22.923238894 +0000 UTC m=+246.460410607"
Apr 22 15:12:33.910929 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:12:33.910845 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-jobset-operator/jobset-controller-manager-8468c855f5-bdqq9"
Apr 22 15:13:16.940827 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:13:16.940793 2569 kubelet.go:1628] "Image garbage collection succeeded"
Apr 22 15:14:44.045923 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:14:44.045890 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5bd67fbbcf-vrwxk"]
Apr 22 15:14:44.049032 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:14:44.049010 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5bd67fbbcf-vrwxk"
Apr 22 15:14:44.059884 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:14:44.059859 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5bd67fbbcf-vrwxk"]
Apr 22 15:14:44.091771 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:14:44.091743 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ba78db0f-b49f-458f-8c6f-c4e7857a7716-oauth-serving-cert\") pod \"console-5bd67fbbcf-vrwxk\" (UID: \"ba78db0f-b49f-458f-8c6f-c4e7857a7716\") " pod="openshift-console/console-5bd67fbbcf-vrwxk"
Apr 22 15:14:44.091885 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:14:44.091777 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzlzt\" (UniqueName: \"kubernetes.io/projected/ba78db0f-b49f-458f-8c6f-c4e7857a7716-kube-api-access-dzlzt\") pod \"console-5bd67fbbcf-vrwxk\" (UID: \"ba78db0f-b49f-458f-8c6f-c4e7857a7716\") " pod="openshift-console/console-5bd67fbbcf-vrwxk"
Apr 22 15:14:44.091885 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:14:44.091813 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ba78db0f-b49f-458f-8c6f-c4e7857a7716-console-config\") pod \"console-5bd67fbbcf-vrwxk\" (UID: \"ba78db0f-b49f-458f-8c6f-c4e7857a7716\") " pod="openshift-console/console-5bd67fbbcf-vrwxk"
Apr 22 15:14:44.091885 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:14:44.091829 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba78db0f-b49f-458f-8c6f-c4e7857a7716-service-ca\") pod \"console-5bd67fbbcf-vrwxk\" (UID: \"ba78db0f-b49f-458f-8c6f-c4e7857a7716\") " pod="openshift-console/console-5bd67fbbcf-vrwxk"
Apr 22 15:14:44.091885 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:14:44.091848 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ba78db0f-b49f-458f-8c6f-c4e7857a7716-console-oauth-config\") pod \"console-5bd67fbbcf-vrwxk\" (UID: \"ba78db0f-b49f-458f-8c6f-c4e7857a7716\") " pod="openshift-console/console-5bd67fbbcf-vrwxk"
Apr 22 15:14:44.091885 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:14:44.091873 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba78db0f-b49f-458f-8c6f-c4e7857a7716-console-serving-cert\") pod \"console-5bd67fbbcf-vrwxk\" (UID: \"ba78db0f-b49f-458f-8c6f-c4e7857a7716\") " pod="openshift-console/console-5bd67fbbcf-vrwxk"
Apr 22 15:14:44.092059 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:14:44.091947 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba78db0f-b49f-458f-8c6f-c4e7857a7716-trusted-ca-bundle\") pod \"console-5bd67fbbcf-vrwxk\" (UID: \"ba78db0f-b49f-458f-8c6f-c4e7857a7716\") " pod="openshift-console/console-5bd67fbbcf-vrwxk"
Apr 22 15:14:44.192431 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:14:44.192404 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ba78db0f-b49f-458f-8c6f-c4e7857a7716-console-config\") pod \"console-5bd67fbbcf-vrwxk\" (UID: \"ba78db0f-b49f-458f-8c6f-c4e7857a7716\") " pod="openshift-console/console-5bd67fbbcf-vrwxk"
Apr 22 15:14:44.192537 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:14:44.192435 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba78db0f-b49f-458f-8c6f-c4e7857a7716-service-ca\") pod \"console-5bd67fbbcf-vrwxk\" (UID: \"ba78db0f-b49f-458f-8c6f-c4e7857a7716\") " pod="openshift-console/console-5bd67fbbcf-vrwxk"
Apr 22 15:14:44.192537 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:14:44.192458 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ba78db0f-b49f-458f-8c6f-c4e7857a7716-console-oauth-config\") pod \"console-5bd67fbbcf-vrwxk\" (UID: \"ba78db0f-b49f-458f-8c6f-c4e7857a7716\") " pod="openshift-console/console-5bd67fbbcf-vrwxk"
Apr 22 15:14:44.192537 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:14:44.192498 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba78db0f-b49f-458f-8c6f-c4e7857a7716-console-serving-cert\") pod \"console-5bd67fbbcf-vrwxk\" (UID: \"ba78db0f-b49f-458f-8c6f-c4e7857a7716\") " pod="openshift-console/console-5bd67fbbcf-vrwxk"
Apr 22 15:14:44.192650 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:14:44.192544 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba78db0f-b49f-458f-8c6f-c4e7857a7716-trusted-ca-bundle\") pod \"console-5bd67fbbcf-vrwxk\" (UID: \"ba78db0f-b49f-458f-8c6f-c4e7857a7716\") " pod="openshift-console/console-5bd67fbbcf-vrwxk"
Apr 22 15:14:44.192650 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:14:44.192571 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ba78db0f-b49f-458f-8c6f-c4e7857a7716-oauth-serving-cert\") pod \"console-5bd67fbbcf-vrwxk\" (UID: \"ba78db0f-b49f-458f-8c6f-c4e7857a7716\") " pod="openshift-console/console-5bd67fbbcf-vrwxk"
Apr 22 15:14:44.192650 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:14:44.192600 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dzlzt\" (UniqueName: \"kubernetes.io/projected/ba78db0f-b49f-458f-8c6f-c4e7857a7716-kube-api-access-dzlzt\") pod \"console-5bd67fbbcf-vrwxk\" (UID: \"ba78db0f-b49f-458f-8c6f-c4e7857a7716\") " pod="openshift-console/console-5bd67fbbcf-vrwxk"
Apr 22 15:14:44.193202 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:14:44.193176 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba78db0f-b49f-458f-8c6f-c4e7857a7716-service-ca\") pod \"console-5bd67fbbcf-vrwxk\" (UID: \"ba78db0f-b49f-458f-8c6f-c4e7857a7716\") " pod="openshift-console/console-5bd67fbbcf-vrwxk"
Apr 22 15:14:44.193297 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:14:44.193179 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ba78db0f-b49f-458f-8c6f-c4e7857a7716-console-config\") pod \"console-5bd67fbbcf-vrwxk\" (UID: \"ba78db0f-b49f-458f-8c6f-c4e7857a7716\") " pod="openshift-console/console-5bd67fbbcf-vrwxk"
Apr 22 15:14:44.193360 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:14:44.193341 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba78db0f-b49f-458f-8c6f-c4e7857a7716-trusted-ca-bundle\") pod \"console-5bd67fbbcf-vrwxk\" (UID: \"ba78db0f-b49f-458f-8c6f-c4e7857a7716\") " pod="openshift-console/console-5bd67fbbcf-vrwxk"
Apr 22 15:14:44.193459 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:14:44.193437 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ba78db0f-b49f-458f-8c6f-c4e7857a7716-oauth-serving-cert\") pod \"console-5bd67fbbcf-vrwxk\" (UID: \"ba78db0f-b49f-458f-8c6f-c4e7857a7716\") " pod="openshift-console/console-5bd67fbbcf-vrwxk"
Apr 22 15:14:44.194994 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:14:44.194971 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ba78db0f-b49f-458f-8c6f-c4e7857a7716-console-oauth-config\") pod \"console-5bd67fbbcf-vrwxk\" (UID: \"ba78db0f-b49f-458f-8c6f-c4e7857a7716\") " pod="openshift-console/console-5bd67fbbcf-vrwxk"
Apr 22 15:14:44.195236 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:14:44.195214 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba78db0f-b49f-458f-8c6f-c4e7857a7716-console-serving-cert\") pod \"console-5bd67fbbcf-vrwxk\" (UID: \"ba78db0f-b49f-458f-8c6f-c4e7857a7716\") " pod="openshift-console/console-5bd67fbbcf-vrwxk"
Apr 22 15:14:44.201453 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:14:44.201432 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzlzt\" (UniqueName: \"kubernetes.io/projected/ba78db0f-b49f-458f-8c6f-c4e7857a7716-kube-api-access-dzlzt\") pod \"console-5bd67fbbcf-vrwxk\" (UID: \"ba78db0f-b49f-458f-8c6f-c4e7857a7716\") " pod="openshift-console/console-5bd67fbbcf-vrwxk"
Apr 22 15:14:44.357830 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:14:44.357805 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5bd67fbbcf-vrwxk"
Apr 22 15:14:44.472649 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:14:44.472626 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5bd67fbbcf-vrwxk"]
Apr 22 15:14:44.474817 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:14:44.474790 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba78db0f_b49f_458f_8c6f_c4e7857a7716.slice/crio-0e9fff34e9d96f7e97194f904662fd609803a30a7e9cf1a3b00b8dcd21b9ea02 WatchSource:0}: Error finding container 0e9fff34e9d96f7e97194f904662fd609803a30a7e9cf1a3b00b8dcd21b9ea02: Status 404 returned error can't find the container with id 0e9fff34e9d96f7e97194f904662fd609803a30a7e9cf1a3b00b8dcd21b9ea02
Apr 22 15:14:44.476608 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:14:44.476589 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 15:14:45.324966 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:14:45.324933 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bd67fbbcf-vrwxk" event={"ID":"ba78db0f-b49f-458f-8c6f-c4e7857a7716","Type":"ContainerStarted","Data":"3c372cd0ac9770417320cdf0776151d490cd0b093c77d460171d51d26b71f7d4"}
Apr 22 15:14:45.324966 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:14:45.324968 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bd67fbbcf-vrwxk" event={"ID":"ba78db0f-b49f-458f-8c6f-c4e7857a7716","Type":"ContainerStarted","Data":"0e9fff34e9d96f7e97194f904662fd609803a30a7e9cf1a3b00b8dcd21b9ea02"}
Apr 22 15:14:45.343057 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:14:45.343008 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5bd67fbbcf-vrwxk" podStartSLOduration=1.34299633 podStartE2EDuration="1.34299633s" podCreationTimestamp="2026-04-22 15:14:44 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:14:45.341070382 +0000 UTC m=+388.878242096" watchObservedRunningTime="2026-04-22 15:14:45.34299633 +0000 UTC m=+388.880168044" Apr 22 15:14:54.358555 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:14:54.358522 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5bd67fbbcf-vrwxk" Apr 22 15:14:54.359047 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:14:54.358601 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5bd67fbbcf-vrwxk" Apr 22 15:14:54.363091 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:14:54.363065 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5bd67fbbcf-vrwxk" Apr 22 15:14:55.358230 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:14:55.358196 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5bd67fbbcf-vrwxk" Apr 22 15:14:55.409128 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:14:55.409093 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74b7646c6b-c6g4t"] Apr 22 15:15:20.432346 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:15:20.432291 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-74b7646c6b-c6g4t" podUID="6cc98e9f-7567-4da1-99f2-2ace8953c61f" containerName="console" containerID="cri-o://e2b545ab370739948eda2996a362100505fb731e5671c816ba894ae58ba34821" gracePeriod=15 Apr 22 15:15:20.661482 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:15:20.661460 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74b7646c6b-c6g4t_6cc98e9f-7567-4da1-99f2-2ace8953c61f/console/0.log" Apr 22 15:15:20.661605 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:15:20.661522 2569 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74b7646c6b-c6g4t" Apr 22 15:15:20.780280 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:15:20.780189 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6cc98e9f-7567-4da1-99f2-2ace8953c61f-oauth-serving-cert\") pod \"6cc98e9f-7567-4da1-99f2-2ace8953c61f\" (UID: \"6cc98e9f-7567-4da1-99f2-2ace8953c61f\") " Apr 22 15:15:20.780280 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:15:20.780276 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6cc98e9f-7567-4da1-99f2-2ace8953c61f-service-ca\") pod \"6cc98e9f-7567-4da1-99f2-2ace8953c61f\" (UID: \"6cc98e9f-7567-4da1-99f2-2ace8953c61f\") " Apr 22 15:15:20.780469 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:15:20.780436 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6cc98e9f-7567-4da1-99f2-2ace8953c61f-console-serving-cert\") pod \"6cc98e9f-7567-4da1-99f2-2ace8953c61f\" (UID: \"6cc98e9f-7567-4da1-99f2-2ace8953c61f\") " Apr 22 15:15:20.780525 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:15:20.780482 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6cc98e9f-7567-4da1-99f2-2ace8953c61f-trusted-ca-bundle\") pod \"6cc98e9f-7567-4da1-99f2-2ace8953c61f\" (UID: \"6cc98e9f-7567-4da1-99f2-2ace8953c61f\") " Apr 22 15:15:20.780525 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:15:20.780505 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq5cg\" (UniqueName: \"kubernetes.io/projected/6cc98e9f-7567-4da1-99f2-2ace8953c61f-kube-api-access-nq5cg\") pod \"6cc98e9f-7567-4da1-99f2-2ace8953c61f\" (UID: 
\"6cc98e9f-7567-4da1-99f2-2ace8953c61f\") " Apr 22 15:15:20.780630 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:15:20.780542 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6cc98e9f-7567-4da1-99f2-2ace8953c61f-console-oauth-config\") pod \"6cc98e9f-7567-4da1-99f2-2ace8953c61f\" (UID: \"6cc98e9f-7567-4da1-99f2-2ace8953c61f\") " Apr 22 15:15:20.780630 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:15:20.780559 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6cc98e9f-7567-4da1-99f2-2ace8953c61f-console-config\") pod \"6cc98e9f-7567-4da1-99f2-2ace8953c61f\" (UID: \"6cc98e9f-7567-4da1-99f2-2ace8953c61f\") " Apr 22 15:15:20.780630 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:15:20.780574 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cc98e9f-7567-4da1-99f2-2ace8953c61f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6cc98e9f-7567-4da1-99f2-2ace8953c61f" (UID: "6cc98e9f-7567-4da1-99f2-2ace8953c61f"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:15:20.780630 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:15:20.780586 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cc98e9f-7567-4da1-99f2-2ace8953c61f-service-ca" (OuterVolumeSpecName: "service-ca") pod "6cc98e9f-7567-4da1-99f2-2ace8953c61f" (UID: "6cc98e9f-7567-4da1-99f2-2ace8953c61f"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:15:20.780880 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:15:20.780806 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6cc98e9f-7567-4da1-99f2-2ace8953c61f-service-ca\") on node \"ip-10-0-141-246.ec2.internal\" DevicePath \"\"" Apr 22 15:15:20.780880 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:15:20.780826 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6cc98e9f-7567-4da1-99f2-2ace8953c61f-oauth-serving-cert\") on node \"ip-10-0-141-246.ec2.internal\" DevicePath \"\"" Apr 22 15:15:20.780880 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:15:20.780863 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cc98e9f-7567-4da1-99f2-2ace8953c61f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6cc98e9f-7567-4da1-99f2-2ace8953c61f" (UID: "6cc98e9f-7567-4da1-99f2-2ace8953c61f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:15:20.781039 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:15:20.781022 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cc98e9f-7567-4da1-99f2-2ace8953c61f-console-config" (OuterVolumeSpecName: "console-config") pod "6cc98e9f-7567-4da1-99f2-2ace8953c61f" (UID: "6cc98e9f-7567-4da1-99f2-2ace8953c61f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:15:20.782720 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:15:20.782695 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cc98e9f-7567-4da1-99f2-2ace8953c61f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6cc98e9f-7567-4da1-99f2-2ace8953c61f" (UID: "6cc98e9f-7567-4da1-99f2-2ace8953c61f"). 
InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:15:20.782815 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:15:20.782770 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cc98e9f-7567-4da1-99f2-2ace8953c61f-kube-api-access-nq5cg" (OuterVolumeSpecName: "kube-api-access-nq5cg") pod "6cc98e9f-7567-4da1-99f2-2ace8953c61f" (UID: "6cc98e9f-7567-4da1-99f2-2ace8953c61f"). InnerVolumeSpecName "kube-api-access-nq5cg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:15:20.782815 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:15:20.782780 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cc98e9f-7567-4da1-99f2-2ace8953c61f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6cc98e9f-7567-4da1-99f2-2ace8953c61f" (UID: "6cc98e9f-7567-4da1-99f2-2ace8953c61f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:15:20.882124 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:15:20.882067 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6cc98e9f-7567-4da1-99f2-2ace8953c61f-console-serving-cert\") on node \"ip-10-0-141-246.ec2.internal\" DevicePath \"\"" Apr 22 15:15:20.882124 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:15:20.882115 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6cc98e9f-7567-4da1-99f2-2ace8953c61f-trusted-ca-bundle\") on node \"ip-10-0-141-246.ec2.internal\" DevicePath \"\"" Apr 22 15:15:20.882124 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:15:20.882128 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nq5cg\" (UniqueName: \"kubernetes.io/projected/6cc98e9f-7567-4da1-99f2-2ace8953c61f-kube-api-access-nq5cg\") on node \"ip-10-0-141-246.ec2.internal\" 
DevicePath \"\"" Apr 22 15:15:20.882359 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:15:20.882142 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6cc98e9f-7567-4da1-99f2-2ace8953c61f-console-oauth-config\") on node \"ip-10-0-141-246.ec2.internal\" DevicePath \"\"" Apr 22 15:15:20.882359 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:15:20.882155 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6cc98e9f-7567-4da1-99f2-2ace8953c61f-console-config\") on node \"ip-10-0-141-246.ec2.internal\" DevicePath \"\"" Apr 22 15:15:21.429749 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:15:21.429717 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74b7646c6b-c6g4t_6cc98e9f-7567-4da1-99f2-2ace8953c61f/console/0.log" Apr 22 15:15:21.429910 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:15:21.429761 2569 generic.go:358] "Generic (PLEG): container finished" podID="6cc98e9f-7567-4da1-99f2-2ace8953c61f" containerID="e2b545ab370739948eda2996a362100505fb731e5671c816ba894ae58ba34821" exitCode=2 Apr 22 15:15:21.429910 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:15:21.429848 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-74b7646c6b-c6g4t" Apr 22 15:15:21.429910 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:15:21.429847 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74b7646c6b-c6g4t" event={"ID":"6cc98e9f-7567-4da1-99f2-2ace8953c61f","Type":"ContainerDied","Data":"e2b545ab370739948eda2996a362100505fb731e5671c816ba894ae58ba34821"} Apr 22 15:15:21.429910 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:15:21.429884 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74b7646c6b-c6g4t" event={"ID":"6cc98e9f-7567-4da1-99f2-2ace8953c61f","Type":"ContainerDied","Data":"148259bf113da653e908ca063b8040a0589dd49db06719aa99b73aede751f40c"} Apr 22 15:15:21.429910 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:15:21.429899 2569 scope.go:117] "RemoveContainer" containerID="e2b545ab370739948eda2996a362100505fb731e5671c816ba894ae58ba34821" Apr 22 15:15:21.437366 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:15:21.437173 2569 scope.go:117] "RemoveContainer" containerID="e2b545ab370739948eda2996a362100505fb731e5671c816ba894ae58ba34821" Apr 22 15:15:21.437598 ip-10-0-141-246 kubenswrapper[2569]: E0422 15:15:21.437454 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2b545ab370739948eda2996a362100505fb731e5671c816ba894ae58ba34821\": container with ID starting with e2b545ab370739948eda2996a362100505fb731e5671c816ba894ae58ba34821 not found: ID does not exist" containerID="e2b545ab370739948eda2996a362100505fb731e5671c816ba894ae58ba34821" Apr 22 15:15:21.437598 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:15:21.437478 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2b545ab370739948eda2996a362100505fb731e5671c816ba894ae58ba34821"} err="failed to get container status \"e2b545ab370739948eda2996a362100505fb731e5671c816ba894ae58ba34821\": rpc error: code = 
NotFound desc = could not find container \"e2b545ab370739948eda2996a362100505fb731e5671c816ba894ae58ba34821\": container with ID starting with e2b545ab370739948eda2996a362100505fb731e5671c816ba894ae58ba34821 not found: ID does not exist" Apr 22 15:15:21.446252 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:15:21.446226 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74b7646c6b-c6g4t"] Apr 22 15:15:21.451756 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:15:21.451726 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-74b7646c6b-c6g4t"] Apr 22 15:15:23.057826 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:15:23.057790 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cc98e9f-7567-4da1-99f2-2ace8953c61f" path="/var/lib/kubelet/pods/6cc98e9f-7567-4da1-99f2-2ace8953c61f/volumes" Apr 22 15:56:06.755767 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:06.755676 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-klhdf_f1ab4beb-2b3d-4198-9103-0dcc4963bbc6/global-pull-secret-syncer/0.log" Apr 22 15:56:06.887086 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:06.887059 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-z8m2f_0af3cc76-ae15-4ad4-b096-3b3dacd3e370/konnectivity-agent/0.log" Apr 22 15:56:06.948295 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:06.948265 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-246.ec2.internal_d3eeb1bf222524c1163d0f72b3f74e11/haproxy/0.log" Apr 22 15:56:10.582933 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:10.582902 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-59d69cf5d4-b9d5b_e23db7f1-d665-4550-8536-b62b4fc7f499/metrics-server/0.log" Apr 22 15:56:10.609837 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:10.609811 2569 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-rwr98_6c91dae2-8148-44b5-ab33-56ca9f634448/monitoring-plugin/0.log" Apr 22 15:56:10.707943 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:10.707912 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rzkn2_74e35d6d-6b97-4484-8bf5-c855372fc51b/node-exporter/0.log" Apr 22 15:56:10.746627 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:10.746604 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rzkn2_74e35d6d-6b97-4484-8bf5-c855372fc51b/kube-rbac-proxy/0.log" Apr 22 15:56:10.766240 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:10.766215 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rzkn2_74e35d6d-6b97-4484-8bf5-c855372fc51b/init-textfile/0.log" Apr 22 15:56:11.089074 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:11.089044 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-v2crv_07e8ba1b-86c7-46e8-a1d0-8821e0eb4c29/prometheus-operator/0.log" Apr 22 15:56:11.107634 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:11.107609 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-v2crv_07e8ba1b-86c7-46e8-a1d0-8821e0eb4c29/kube-rbac-proxy/0.log" Apr 22 15:56:11.236571 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:11.236545 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-687bddbc7b-vr52v_1938607c-447d-4b0d-996a-713970b3d71a/thanos-query/0.log" Apr 22 15:56:11.258414 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:11.258389 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-687bddbc7b-vr52v_1938607c-447d-4b0d-996a-713970b3d71a/kube-rbac-proxy-web/0.log" Apr 22 15:56:11.281668 ip-10-0-141-246 
kubenswrapper[2569]: I0422 15:56:11.281650 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-687bddbc7b-vr52v_1938607c-447d-4b0d-996a-713970b3d71a/kube-rbac-proxy/0.log" Apr 22 15:56:11.299658 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:11.299640 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-687bddbc7b-vr52v_1938607c-447d-4b0d-996a-713970b3d71a/prom-label-proxy/0.log" Apr 22 15:56:11.318221 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:11.318201 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-687bddbc7b-vr52v_1938607c-447d-4b0d-996a-713970b3d71a/kube-rbac-proxy-rules/0.log" Apr 22 15:56:11.338162 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:11.338141 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-687bddbc7b-vr52v_1938607c-447d-4b0d-996a-713970b3d71a/kube-rbac-proxy-metrics/0.log" Apr 22 15:56:13.300070 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:13.300043 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5bd67fbbcf-vrwxk_ba78db0f-b49f-458f-8c6f-c4e7857a7716/console/0.log" Apr 22 15:56:14.335624 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:14.335601 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-76xbb_31b91dc6-451c-4694-8db8-6ef3dcecbf4c/dns/0.log" Apr 22 15:56:14.354561 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:14.354538 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-76xbb_31b91dc6-451c-4694-8db8-6ef3dcecbf4c/kube-rbac-proxy/0.log" Apr 22 15:56:14.420858 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:14.420832 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-2bfnk_543bfa0b-0351-4992-892b-fe4be8b7eb4c/dns-node-resolver/0.log" Apr 22 15:56:14.840789 ip-10-0-141-246 
kubenswrapper[2569]: I0422 15:56:14.840760 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-5cdb9c7cd9-75b2w_9de5e468-cf8f-4904-9728-9f97bf669789/registry/0.log" Apr 22 15:56:14.883328 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:14.883301 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9tmhk_c25715bc-fd04-4b79-b6da-892713f85b6c/node-ca/0.log" Apr 22 15:56:15.106366 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:15.106291 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xjqzk/perf-node-gather-daemonset-mshb8"] Apr 22 15:56:15.106582 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:15.106555 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6cc98e9f-7567-4da1-99f2-2ace8953c61f" containerName="console" Apr 22 15:56:15.106582 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:15.106568 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cc98e9f-7567-4da1-99f2-2ace8953c61f" containerName="console" Apr 22 15:56:15.106747 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:15.106621 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="6cc98e9f-7567-4da1-99f2-2ace8953c61f" containerName="console" Apr 22 15:56:15.109632 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:15.109612 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-mshb8" Apr 22 15:56:15.112027 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:15.112007 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xjqzk\"/\"kube-root-ca.crt\"" Apr 22 15:56:15.112138 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:15.112023 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xjqzk\"/\"default-dockercfg-ct867\"" Apr 22 15:56:15.113140 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:15.113120 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xjqzk\"/\"openshift-service-ca.crt\"" Apr 22 15:56:15.115538 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:15.115519 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xjqzk/perf-node-gather-daemonset-mshb8"] Apr 22 15:56:15.233521 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:15.233491 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/75d9e103-34ee-4465-b987-c19ca9592015-sys\") pod \"perf-node-gather-daemonset-mshb8\" (UID: \"75d9e103-34ee-4465-b987-c19ca9592015\") " pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-mshb8" Apr 22 15:56:15.233713 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:15.233543 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/75d9e103-34ee-4465-b987-c19ca9592015-lib-modules\") pod \"perf-node-gather-daemonset-mshb8\" (UID: \"75d9e103-34ee-4465-b987-c19ca9592015\") " pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-mshb8" Apr 22 15:56:15.233713 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:15.233601 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/75d9e103-34ee-4465-b987-c19ca9592015-podres\") pod \"perf-node-gather-daemonset-mshb8\" (UID: \"75d9e103-34ee-4465-b987-c19ca9592015\") " pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-mshb8" Apr 22 15:56:15.233713 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:15.233656 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcrlq\" (UniqueName: \"kubernetes.io/projected/75d9e103-34ee-4465-b987-c19ca9592015-kube-api-access-fcrlq\") pod \"perf-node-gather-daemonset-mshb8\" (UID: \"75d9e103-34ee-4465-b987-c19ca9592015\") " pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-mshb8" Apr 22 15:56:15.233713 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:15.233701 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/75d9e103-34ee-4465-b987-c19ca9592015-proc\") pod \"perf-node-gather-daemonset-mshb8\" (UID: \"75d9e103-34ee-4465-b987-c19ca9592015\") " pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-mshb8" Apr 22 15:56:15.334410 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:15.334374 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fcrlq\" (UniqueName: \"kubernetes.io/projected/75d9e103-34ee-4465-b987-c19ca9592015-kube-api-access-fcrlq\") pod \"perf-node-gather-daemonset-mshb8\" (UID: \"75d9e103-34ee-4465-b987-c19ca9592015\") " pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-mshb8" Apr 22 15:56:15.334410 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:15.334415 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/75d9e103-34ee-4465-b987-c19ca9592015-proc\") pod \"perf-node-gather-daemonset-mshb8\" (UID: 
\"75d9e103-34ee-4465-b987-c19ca9592015\") " pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-mshb8" Apr 22 15:56:15.334643 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:15.334453 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/75d9e103-34ee-4465-b987-c19ca9592015-sys\") pod \"perf-node-gather-daemonset-mshb8\" (UID: \"75d9e103-34ee-4465-b987-c19ca9592015\") " pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-mshb8" Apr 22 15:56:15.334643 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:15.334493 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/75d9e103-34ee-4465-b987-c19ca9592015-lib-modules\") pod \"perf-node-gather-daemonset-mshb8\" (UID: \"75d9e103-34ee-4465-b987-c19ca9592015\") " pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-mshb8" Apr 22 15:56:15.334643 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:15.334531 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/75d9e103-34ee-4465-b987-c19ca9592015-podres\") pod \"perf-node-gather-daemonset-mshb8\" (UID: \"75d9e103-34ee-4465-b987-c19ca9592015\") " pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-mshb8" Apr 22 15:56:15.334643 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:15.334580 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/75d9e103-34ee-4465-b987-c19ca9592015-sys\") pod \"perf-node-gather-daemonset-mshb8\" (UID: \"75d9e103-34ee-4465-b987-c19ca9592015\") " pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-mshb8" Apr 22 15:56:15.334643 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:15.334608 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/75d9e103-34ee-4465-b987-c19ca9592015-lib-modules\") pod \"perf-node-gather-daemonset-mshb8\" (UID: \"75d9e103-34ee-4465-b987-c19ca9592015\") " pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-mshb8" Apr 22 15:56:15.334643 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:15.334580 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/75d9e103-34ee-4465-b987-c19ca9592015-proc\") pod \"perf-node-gather-daemonset-mshb8\" (UID: \"75d9e103-34ee-4465-b987-c19ca9592015\") " pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-mshb8" Apr 22 15:56:15.334643 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:15.334632 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/75d9e103-34ee-4465-b987-c19ca9592015-podres\") pod \"perf-node-gather-daemonset-mshb8\" (UID: \"75d9e103-34ee-4465-b987-c19ca9592015\") " pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-mshb8" Apr 22 15:56:15.342246 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:15.342224 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcrlq\" (UniqueName: \"kubernetes.io/projected/75d9e103-34ee-4465-b987-c19ca9592015-kube-api-access-fcrlq\") pod \"perf-node-gather-daemonset-mshb8\" (UID: \"75d9e103-34ee-4465-b987-c19ca9592015\") " pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-mshb8" Apr 22 15:56:15.420472 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:15.420391 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-mshb8" Apr 22 15:56:15.531911 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:15.531888 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xjqzk/perf-node-gather-daemonset-mshb8"] Apr 22 15:56:15.534334 ip-10-0-141-246 kubenswrapper[2569]: W0422 15:56:15.534293 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod75d9e103_34ee_4465_b987_c19ca9592015.slice/crio-1d7e97e9c8fb1fdc5ec5831421cec29fcc2295a89cd3dca40289ff015fdfc13b WatchSource:0}: Error finding container 1d7e97e9c8fb1fdc5ec5831421cec29fcc2295a89cd3dca40289ff015fdfc13b: Status 404 returned error can't find the container with id 1d7e97e9c8fb1fdc5ec5831421cec29fcc2295a89cd3dca40289ff015fdfc13b Apr 22 15:56:15.536128 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:15.536109 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 15:56:15.693168 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:15.693097 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-mshb8" event={"ID":"75d9e103-34ee-4465-b987-c19ca9592015","Type":"ContainerStarted","Data":"19d915ab6ebd26fc26d38060cb0949c3993b0909be295ae0b18354df2055600d"} Apr 22 15:56:15.693168 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:15.693133 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-mshb8" event={"ID":"75d9e103-34ee-4465-b987-c19ca9592015","Type":"ContainerStarted","Data":"1d7e97e9c8fb1fdc5ec5831421cec29fcc2295a89cd3dca40289ff015fdfc13b"} Apr 22 15:56:15.693344 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:15.693206 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-mshb8" Apr 22 15:56:15.972600 ip-10-0-141-246 
kubenswrapper[2569]: I0422 15:56:15.972520 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-nrz6v_f8e31d3f-d753-47a0-b9ff-ad56f2deb44a/serve-healthcheck-canary/0.log" Apr 22 15:56:16.327917 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:16.327847 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hll6j_4f36f114-7050-456f-a137-b827c10f5102/kube-rbac-proxy/0.log" Apr 22 15:56:16.348018 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:16.347999 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hll6j_4f36f114-7050-456f-a137-b827c10f5102/exporter/0.log" Apr 22 15:56:16.367737 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:16.367716 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hll6j_4f36f114-7050-456f-a137-b827c10f5102/extractor/0.log" Apr 22 15:56:18.108419 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:18.108391 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-controller-manager-8468c855f5-bdqq9_689a481c-b5a3-4754-b88f-b6217bf84093/manager/0.log" Apr 22 15:56:21.706123 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:21.706093 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-mshb8" Apr 22 15:56:21.719769 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:21.719704 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-mshb8" podStartSLOduration=6.719670219 podStartE2EDuration="6.719670219s" podCreationTimestamp="2026-04-22 15:56:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:56:15.70673506 +0000 UTC m=+2879.243906773" 
watchObservedRunningTime="2026-04-22 15:56:21.719670219 +0000 UTC m=+2885.256841932" Apr 22 15:56:22.454079 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:22.454049 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qpxv8_49c61d54-9288-4695-8a61-293644b9038e/kube-multus-additional-cni-plugins/0.log" Apr 22 15:56:22.474218 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:22.474193 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qpxv8_49c61d54-9288-4695-8a61-293644b9038e/egress-router-binary-copy/0.log" Apr 22 15:56:22.492775 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:22.492750 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qpxv8_49c61d54-9288-4695-8a61-293644b9038e/cni-plugins/0.log" Apr 22 15:56:22.510930 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:22.510911 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qpxv8_49c61d54-9288-4695-8a61-293644b9038e/bond-cni-plugin/0.log" Apr 22 15:56:22.531118 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:22.531097 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qpxv8_49c61d54-9288-4695-8a61-293644b9038e/routeoverride-cni/0.log" Apr 22 15:56:22.549785 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:22.549762 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qpxv8_49c61d54-9288-4695-8a61-293644b9038e/whereabouts-cni-bincopy/0.log" Apr 22 15:56:22.569575 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:22.569556 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qpxv8_49c61d54-9288-4695-8a61-293644b9038e/whereabouts-cni/0.log" Apr 22 15:56:22.737493 ip-10-0-141-246 kubenswrapper[2569]: I0422 
15:56:22.737427 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dgt4k_b42b90f9-6d5b-4342-979c-184c4620abb7/kube-multus/0.log" Apr 22 15:56:22.843113 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:22.843083 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-hzm72_9d868504-055f-463c-b932-801175d669c7/network-metrics-daemon/0.log" Apr 22 15:56:22.860959 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:22.860930 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-hzm72_9d868504-055f-463c-b932-801175d669c7/kube-rbac-proxy/0.log" Apr 22 15:56:24.011333 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:24.011295 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cq6md_a686392c-d33c-438a-ba47-40397c16e97c/ovn-controller/0.log" Apr 22 15:56:24.043977 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:24.043908 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cq6md_a686392c-d33c-438a-ba47-40397c16e97c/ovn-acl-logging/0.log" Apr 22 15:56:24.059782 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:24.059759 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cq6md_a686392c-d33c-438a-ba47-40397c16e97c/kube-rbac-proxy-node/0.log" Apr 22 15:56:24.096441 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:24.096413 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cq6md_a686392c-d33c-438a-ba47-40397c16e97c/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 15:56:24.121426 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:24.121409 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cq6md_a686392c-d33c-438a-ba47-40397c16e97c/northd/0.log" Apr 22 15:56:24.147176 ip-10-0-141-246 kubenswrapper[2569]: I0422 
15:56:24.147159 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cq6md_a686392c-d33c-438a-ba47-40397c16e97c/nbdb/0.log" Apr 22 15:56:24.171515 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:24.171496 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cq6md_a686392c-d33c-438a-ba47-40397c16e97c/sbdb/0.log" Apr 22 15:56:24.270981 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:24.270949 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cq6md_a686392c-d33c-438a-ba47-40397c16e97c/ovnkube-controller/0.log" Apr 22 15:56:25.722013 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:25.721986 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-wfpb4_987b673f-d105-40ae-8ec9-b8dab14f068f/network-check-target-container/0.log" Apr 22 15:56:26.568573 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:26.568488 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-5qlng_a59df43f-31be-4ae9-be3b-b10d9d017c59/iptables-alerter/0.log" Apr 22 15:56:27.189770 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:27.189745 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-ckl4r_e6729570-df4a-4419-b11f-1b1f782967bd/tuned/0.log" Apr 22 15:56:30.505894 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:30.505870 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-pvchr_d9dcdab4-58d9-45fa-a9c1-d04589ab8abe/csi-driver/0.log" Apr 22 15:56:30.524235 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:30.524206 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-pvchr_d9dcdab4-58d9-45fa-a9c1-d04589ab8abe/csi-node-driver-registrar/0.log" Apr 22 
15:56:30.546902 ip-10-0-141-246 kubenswrapper[2569]: I0422 15:56:30.546879 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-pvchr_d9dcdab4-58d9-45fa-a9c1-d04589ab8abe/csi-liveness-probe/0.log"