Apr 24 21:24:50.771205 ip-10-0-131-55 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 21:24:50.771216 ip-10-0-131-55 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 21:24:50.771225 ip-10-0-131-55 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 21:24:50.771529 ip-10-0-131-55 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 21:25:00.904334 ip-10-0-131-55 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 21:25:00.904353 ip-10-0-131-55 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 59ac583034c04182b66899fd4c27be87 --
Apr 24 21:27:24.703230 ip-10-0-131-55 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 21:27:25.127202 ip-10-0-131-55 kubenswrapper[2566]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:27:25.127202 ip-10-0-131-55 kubenswrapper[2566]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 21:27:25.127202 ip-10-0-131-55 kubenswrapper[2566]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:27:25.127202 ip-10-0-131-55 kubenswrapper[2566]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 21:27:25.127202 ip-10-0-131-55 kubenswrapper[2566]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:27:25.130027 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.129924 2566 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 21:27:25.132155 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132141 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:27:25.132155 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132155 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:27:25.132218 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132159 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:27:25.132218 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132163 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:27:25.132218 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132166 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:27:25.132218 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132168 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:27:25.132218 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132173 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
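The deprecation warnings above name kubelet flags that should move into the config file referenced by `--config` (here `/etc/kubernetes/kubelet.conf`). As an illustrative sketch only (not part of kubelet or any OpenShift tooling), the affected flag names can be pulled out of a captured journal to build a migration checklist; the sample lines are copied from the log above:

```python
import re

# Matches kubelet deprecation warnings of the form:
#   "Flag --system-reserved has been deprecated, ..."
DEPRECATED = re.compile(r"Flag (--[\w-]+) has been deprecated")

def deprecated_flags(journal_text: str) -> set[str]:
    """Collect the set of deprecated kubelet flags mentioned in a journal capture."""
    return set(DEPRECATED.findall(journal_text))

sample = """\
Apr 24 21:27:25.127202 ip-10-0-131-55 kubenswrapper[2566]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag.
Apr 24 21:27:25.127202 ip-10-0-131-55 kubenswrapper[2566]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag.
"""

print(sorted(deprecated_flags(sample)))
```

Each flag found this way has a corresponding `KubeletConfiguration` field documented at the URL the kubelet prints in the warning.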
Apr 24 21:27:25.132218 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132176 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:27:25.132218 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132179 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:27:25.132218 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132182 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:27:25.132218 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132184 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:27:25.132218 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132187 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:27:25.132218 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132190 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:25.132218 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132192 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:27:25.132218 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132195 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:27:25.132218 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132199 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:27:25.132218 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132201 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:27:25.132218 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132204 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:25.132218 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132207 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:27:25.132218 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132209 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:27:25.132218 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132212 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:25.132218 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132214 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:27:25.132696 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132217 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:27:25.132696 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132220 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:27:25.132696 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132222 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:27:25.132696 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132226 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:27:25.132696 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132229 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:27:25.132696 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132231 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:27:25.132696 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132235 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:27:25.132696 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132237 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:27:25.132696 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132240 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:27:25.132696 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132242 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:27:25.132696 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132246 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:27:25.132696 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132251 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:27:25.132696 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132254 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:27:25.132696 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132257 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:27:25.132696 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132259 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:27:25.132696 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132262 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:27:25.132696 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132265 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:27:25.132696 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132269 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:27:25.132696 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132272 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:27:25.133185 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132275 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:27:25.133185 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132278 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:27:25.133185 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132280 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:27:25.133185 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132283 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:27:25.133185 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132285 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:27:25.133185 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132288 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:27:25.133185 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132290 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:27:25.133185 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132293 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:27:25.133185 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132295 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:27:25.133185 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132298 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:27:25.133185 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132300 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:27:25.133185 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132303 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:27:25.133185 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132305 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:27:25.133185 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132308 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:27:25.133185 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132311 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:27:25.133185 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132314 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:27:25.133185 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132316 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:25.133185 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132319 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:27:25.133185 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132322 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:27:25.133657 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132324 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:27:25.133657 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132327 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:27:25.133657 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132329 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:27:25.133657 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132332 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:27:25.133657 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132335 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:27:25.133657 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132337 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:25.133657 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132340 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:25.133657 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132343 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:27:25.133657 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132345 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:25.133657 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132348 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:27:25.133657 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132350 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:27:25.133657 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132352 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:27:25.133657 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132356 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:27:25.133657 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132359 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:27:25.133657 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132361 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:27:25.133657 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132364 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:25.133657 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132366 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:27:25.133657 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132368 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:27:25.133657 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132371 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:27:25.133657 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132374 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:27:25.134153 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132376 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:27:25.134153 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132379 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:27:25.134153 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132381 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:27:25.134153 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132384 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:27:25.134153 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132386 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:27:25.134153 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132389 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:27:25.134153 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132762 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:27:25.134153 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132768 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:27:25.134153 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132771 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:27:25.134153 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132775 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:27:25.134153 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132777 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:27:25.134153 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132780 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:27:25.134153 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132783 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:27:25.134153 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132785 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:27:25.134153 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132788 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:27:25.134153 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132790 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:27:25.134153 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132793 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:25.134153 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132796 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:27:25.134153 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132804 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:27:25.134606 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132807 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:27:25.134606 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132810 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:27:25.134606 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132812 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:27:25.134606 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132815 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:27:25.134606 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132817 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:27:25.134606 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132820 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:27:25.134606 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132822 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:27:25.134606 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132825 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:27:25.134606 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132828 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:27:25.134606 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132831 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:27:25.134606 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132833 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:27:25.134606 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132836 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:27:25.134606 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132838 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:27:25.134606 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132841 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:27:25.134606 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132843 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:27:25.134606 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132846 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:27:25.134606 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132848 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:27:25.134606 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132851 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:27:25.134606 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132856 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:27:25.135124 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132859 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:25.135124 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132863 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:27:25.135124 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132866 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:27:25.135124 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132870 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:25.135124 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132873 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:27:25.135124 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132875 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:27:25.135124 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132878 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:27:25.135124 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132880 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:27:25.135124 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132883 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:27:25.135124 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132886 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:27:25.135124 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132889 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:27:25.135124 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132892 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
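The `unrecognized feature gate` lines are warnings rather than fatal errors: the cluster-level feature gate list carries OpenShift-specific gate names that this kubelet binary does not recognize, and the whole list is logged once per parsing pass, which is why the same names appear twice within a few hundred microseconds. A purely illustrative deduplication pass (not a kubelet or OpenShift tool) makes the repetition easy to confirm from a journal capture:

```python
import re
from collections import Counter

# Matches the gate name at the end of each warning line.
GATE = re.compile(r"unrecognized feature gate: (\w+)")

def gate_counts(journal_text: str) -> Counter:
    """Count how often each unrecognized feature gate is reported."""
    return Counter(GATE.findall(journal_text))

sample = """\
W0424 21:27:25.132303 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
W0424 21:27:25.132340 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
W0424 21:27:25.132831 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
"""

# In the full log above, every gate shows up with count 2 (one per pass).
print(gate_counts(sample))
```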
Apr 24 21:27:25.135124 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132896 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:27:25.135124 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132899 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:27:25.135124 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132902 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:27:25.135124 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132904 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:27:25.135124 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132907 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:25.135124 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132909 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:27:25.135124 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132912 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:27:25.135124 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132914 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:27:25.135609 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132916 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:27:25.135609 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132919 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:27:25.135609 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132921 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:27:25.135609 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132924 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:27:25.135609 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132927 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:27:25.135609 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132930 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:27:25.135609 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132932 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:27:25.135609 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132935 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:27:25.135609 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132937 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:27:25.135609 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132940 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:27:25.135609 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132943 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:27:25.135609 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132946 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:27:25.135609 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132949 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:27:25.135609 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132951 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:27:25.135609 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132968 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:27:25.135609 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132971 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:27:25.135609 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132973 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:27:25.135609 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132976 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:27:25.135609 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132979 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:27:25.136089 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132982 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:27:25.136089 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132985 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:27:25.136089 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132987 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:27:25.136089 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132990 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:27:25.136089 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132993 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:25.136089 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132996 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:27:25.136089 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.132998 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:27:25.136089 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.133001 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:25.136089 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.133004 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:27:25.136089 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.133006 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:25.136089 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.133009 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:27:25.136089 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.133012 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:27:25.136089 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.133015 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:27:25.136089 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.133017 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:27:25.136089 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.133020 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:25.136089 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133658 2566 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 21:27:25.136089 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133668 2566 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 21:27:25.136089 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133678 2566 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 21:27:25.136089 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133682 2566 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 21:27:25.136089 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133687 2566 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 21:27:25.136089 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133690 2566 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 21:27:25.136591 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133694 2566 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 21:27:25.136591 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133699 2566 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 21:27:25.136591 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133702 2566 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 21:27:25.136591 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133705 2566 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 21:27:25.136591 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133709 2566 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 21:27:25.136591 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133713 2566 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 21:27:25.136591 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133716 2566 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 21:27:25.136591 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133719 2566 flags.go:64] FLAG: --cgroup-root=""
Apr 24 21:27:25.136591 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133721 2566 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 21:27:25.136591 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133724 2566 flags.go:64] FLAG: --client-ca-file=""
Apr 24 21:27:25.136591 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133727 2566 flags.go:64] FLAG: --cloud-config=""
Apr 24 21:27:25.136591 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133730 2566 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 21:27:25.136591 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133733 2566 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 21:27:25.136591 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133738 2566 flags.go:64] FLAG: --cluster-domain=""
Apr 24 21:27:25.136591 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133741 2566 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 21:27:25.136591 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133744 2566 flags.go:64] FLAG: --config-dir=""
Apr 24 21:27:25.136591 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133747 2566 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 21:27:25.136591 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133750 2566 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 21:27:25.136591 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133754 2566 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 21:27:25.136591 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133758 2566 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 21:27:25.136591 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133761 2566 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 21:27:25.136591 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133765 2566 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 21:27:25.136591 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133768 2566 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 21:27:25.136591 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133775 2566 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 21:27:25.137196 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133778 2566 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 21:27:25.137196 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133781 2566 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 21:27:25.137196 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133785 2566 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 21:27:25.137196 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133792 2566 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 21:27:25.137196 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133795 2566 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 21:27:25.137196 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133798 2566 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 21:27:25.137196 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133801 2566 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 21:27:25.137196 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133803 2566 flags.go:64] FLAG: --enable-server="true"
Apr 24 21:27:25.137196 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133807 2566 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 21:27:25.137196 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133810 2566 flags.go:64] FLAG: --event-burst="100"
Apr 24 21:27:25.137196 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133813 2566 flags.go:64] FLAG: --event-qps="50"
Apr 24 21:27:25.137196 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133817 2566 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 21:27:25.137196 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133820 2566 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 21:27:25.137196 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133823 2566 flags.go:64] FLAG: --eviction-hard=""
Apr 24 21:27:25.137196 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133827 2566 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 21:27:25.137196 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133830 2566 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 21:27:25.137196 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133833 2566 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 21:27:25.137196 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133836 2566 flags.go:64] FLAG: --eviction-soft=""
Apr 24 21:27:25.137196 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133839 2566 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 21:27:25.137196 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133842 2566 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 21:27:25.137196 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133845 2566 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 21:27:25.137196 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133848 2566 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 21:27:25.137196 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133851 2566 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 21:27:25.137196 ip-10-0-131-55
kubenswrapper[2566]: I0424 21:27:25.133853 2566 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 21:27:25.137196 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133856 2566 flags.go:64] FLAG: --feature-gates="" Apr 24 21:27:25.137807 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133860 2566 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 21:27:25.137807 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133863 2566 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 21:27:25.137807 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133866 2566 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 21:27:25.137807 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133876 2566 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 21:27:25.137807 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133880 2566 flags.go:64] FLAG: --healthz-port="10248" Apr 24 21:27:25.137807 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133883 2566 flags.go:64] FLAG: --help="false" Apr 24 21:27:25.137807 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133887 2566 flags.go:64] FLAG: --hostname-override="ip-10-0-131-55.ec2.internal" Apr 24 21:27:25.137807 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133890 2566 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 21:27:25.137807 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133893 2566 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 21:27:25.137807 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133896 2566 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 21:27:25.137807 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133900 2566 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 21:27:25.137807 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133903 2566 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 
21:27:25.137807 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133906 2566 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 21:27:25.137807 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133909 2566 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 21:27:25.137807 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133912 2566 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 21:27:25.137807 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133915 2566 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 21:27:25.137807 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133918 2566 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 21:27:25.137807 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133921 2566 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 21:27:25.137807 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133924 2566 flags.go:64] FLAG: --kube-reserved="" Apr 24 21:27:25.137807 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133927 2566 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 21:27:25.137807 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133930 2566 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 21:27:25.137807 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133933 2566 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 21:27:25.137807 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133936 2566 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 21:27:25.137807 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133938 2566 flags.go:64] FLAG: --lock-file="" Apr 24 21:27:25.138391 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133941 2566 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 21:27:25.138391 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133944 2566 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 21:27:25.138391 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133947 2566 flags.go:64] FLAG: 
--log-json-info-buffer-size="0" Apr 24 21:27:25.138391 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133966 2566 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 21:27:25.138391 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133969 2566 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 21:27:25.138391 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133972 2566 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 21:27:25.138391 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133975 2566 flags.go:64] FLAG: --logging-format="text" Apr 24 21:27:25.138391 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133978 2566 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 21:27:25.138391 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133981 2566 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 21:27:25.138391 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133984 2566 flags.go:64] FLAG: --manifest-url="" Apr 24 21:27:25.138391 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133987 2566 flags.go:64] FLAG: --manifest-url-header="" Apr 24 21:27:25.138391 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.133991 2566 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 21:27:25.138391 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134000 2566 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 21:27:25.138391 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134004 2566 flags.go:64] FLAG: --max-pods="110" Apr 24 21:27:25.138391 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134009 2566 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 21:27:25.138391 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134012 2566 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 21:27:25.138391 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134015 2566 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 21:27:25.138391 ip-10-0-131-55 
kubenswrapper[2566]: I0424 21:27:25.134018 2566 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 21:27:25.138391 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134021 2566 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 21:27:25.138391 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134024 2566 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 21:27:25.138391 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134027 2566 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 21:27:25.138391 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134034 2566 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 21:27:25.138391 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134037 2566 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 21:27:25.138391 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134040 2566 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 21:27:25.138391 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134043 2566 flags.go:64] FLAG: --pod-cidr="" Apr 24 21:27:25.139013 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134046 2566 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 21:27:25.139013 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134053 2566 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 21:27:25.139013 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134056 2566 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 21:27:25.139013 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134059 2566 flags.go:64] FLAG: --pods-per-core="0" Apr 24 21:27:25.139013 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134062 2566 flags.go:64] FLAG: --port="10250" Apr 24 21:27:25.139013 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134066 2566 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 21:27:25.139013 
ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134072 2566 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-01365d5fd1373549b" Apr 24 21:27:25.139013 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134076 2566 flags.go:64] FLAG: --qos-reserved="" Apr 24 21:27:25.139013 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134079 2566 flags.go:64] FLAG: --read-only-port="10255" Apr 24 21:27:25.139013 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134083 2566 flags.go:64] FLAG: --register-node="true" Apr 24 21:27:25.139013 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134085 2566 flags.go:64] FLAG: --register-schedulable="true" Apr 24 21:27:25.139013 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134088 2566 flags.go:64] FLAG: --register-with-taints="" Apr 24 21:27:25.139013 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134092 2566 flags.go:64] FLAG: --registry-burst="10" Apr 24 21:27:25.139013 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134095 2566 flags.go:64] FLAG: --registry-qps="5" Apr 24 21:27:25.139013 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134098 2566 flags.go:64] FLAG: --reserved-cpus="" Apr 24 21:27:25.139013 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134100 2566 flags.go:64] FLAG: --reserved-memory="" Apr 24 21:27:25.139013 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134104 2566 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 21:27:25.139013 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134107 2566 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 21:27:25.139013 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134110 2566 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 21:27:25.139013 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134113 2566 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 21:27:25.139013 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134121 2566 flags.go:64] FLAG: --runonce="false" Apr 24 21:27:25.139013 ip-10-0-131-55 
kubenswrapper[2566]: I0424 21:27:25.134124 2566 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 21:27:25.139013 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134129 2566 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 21:27:25.139013 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134132 2566 flags.go:64] FLAG: --seccomp-default="false" Apr 24 21:27:25.139013 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134135 2566 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 21:27:25.139642 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134137 2566 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 21:27:25.139642 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134140 2566 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 21:27:25.139642 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134143 2566 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 21:27:25.139642 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134146 2566 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 21:27:25.139642 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134149 2566 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 21:27:25.139642 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134152 2566 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 21:27:25.139642 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134155 2566 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 21:27:25.139642 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134158 2566 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 21:27:25.139642 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134162 2566 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 21:27:25.139642 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134165 2566 flags.go:64] FLAG: --system-cgroups="" Apr 24 21:27:25.139642 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134168 2566 flags.go:64] FLAG: 
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 21:27:25.139642 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134174 2566 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 21:27:25.139642 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134177 2566 flags.go:64] FLAG: --tls-cert-file="" Apr 24 21:27:25.139642 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134179 2566 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 21:27:25.139642 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134850 2566 flags.go:64] FLAG: --tls-min-version="" Apr 24 21:27:25.139642 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134853 2566 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 21:27:25.139642 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134857 2566 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 21:27:25.139642 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134860 2566 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 21:27:25.139642 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134863 2566 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 21:27:25.139642 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134867 2566 flags.go:64] FLAG: --v="2" Apr 24 21:27:25.139642 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134871 2566 flags.go:64] FLAG: --version="false" Apr 24 21:27:25.139642 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134875 2566 flags.go:64] FLAG: --vmodule="" Apr 24 21:27:25.139642 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134880 2566 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 21:27:25.139642 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.134883 2566 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 21:27:25.139642 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135000 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:27:25.140248 ip-10-0-131-55 
kubenswrapper[2566]: W0424 21:27:25.135004 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:27:25.140248 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135007 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:27:25.140248 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135010 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:27:25.140248 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135013 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:27:25.140248 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135017 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:27:25.140248 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135020 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:27:25.140248 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135023 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:27:25.140248 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135026 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:27:25.140248 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135030 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 21:27:25.140248 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135034 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:27:25.140248 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135037 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:27:25.140248 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135040 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:27:25.140248 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135042 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:27:25.140248 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135045 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:27:25.140248 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135048 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:27:25.140248 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135051 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:27:25.140248 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135054 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:27:25.140248 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135056 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:27:25.140248 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135059 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:27:25.140722 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135061 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:27:25.140722 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135064 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:27:25.140722 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135067 2566 
feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:27:25.140722 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135069 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:27:25.140722 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135072 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:27:25.140722 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135075 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:27:25.140722 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135077 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:27:25.140722 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135080 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:27:25.140722 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135082 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:27:25.140722 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135085 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:27:25.140722 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135087 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:27:25.140722 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135090 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:27:25.140722 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135092 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:27:25.140722 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135095 2566 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:27:25.140722 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135097 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:27:25.140722 
ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135100 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:27:25.140722 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135102 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:27:25.140722 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135108 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:27:25.140722 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135112 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 21:27:25.141589 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135116 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:27:25.141589 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135119 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:27:25.141589 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135121 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:27:25.141589 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135124 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:27:25.141589 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135127 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:27:25.141589 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135130 2566 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:27:25.141589 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135132 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:27:25.141589 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135136 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:27:25.141589 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135138 2566 
feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:27:25.141589 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135141 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:27:25.141589 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135144 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:27:25.141589 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135146 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:27:25.141589 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135149 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:27:25.141589 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135152 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:27:25.141589 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135154 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:27:25.141589 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135157 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:27:25.141589 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135160 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:27:25.141589 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135162 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:27:25.141589 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135165 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:27:25.141589 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135167 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:27:25.142459 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135170 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:27:25.142459 ip-10-0-131-55 
kubenswrapper[2566]: W0424 21:27:25.135172 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:27:25.142459 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135175 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:27:25.142459 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135177 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:27:25.142459 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135180 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:27:25.142459 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135182 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:27:25.142459 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135185 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:27:25.142459 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135187 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:27:25.142459 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135190 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:27:25.142459 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135192 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:27:25.142459 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135195 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:27:25.142459 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135198 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:27:25.142459 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135201 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:27:25.142459 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135204 2566 
feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:25.142459 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135206 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:27:25.142459 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135209 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:27:25.142459 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135211 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:27:25.142459 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135214 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:27:25.142459 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135216 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:25.142459 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135219 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:27:25.143344 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135222 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:27:25.143344 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135225 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:27:25.143344 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135227 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:27:25.143344 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135230 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:25.143344 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135232 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:27:25.143344 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135235 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:25.143344 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.135237 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:25.143344 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.135245 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:27:25.143344 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.142855 2566 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 21:27:25.143344 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.143311 2566 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 21:27:25.143789 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143393 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:27:25.143789 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143401 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:27:25.143789 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143407 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:27:25.143789 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143413 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:27:25.143789 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143418 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:27:25.143789 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143423 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:27:25.143789 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143428 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:25.143789 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143433 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:25.143789 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143437 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:27:25.143789 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143441 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:27:25.143789 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143447 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:27:25.143789 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143452 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:27:25.143789 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143456 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:27:25.143789 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143461 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:27:25.143789 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143465 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:27:25.143789 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143469 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:25.143789 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143473 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:27:25.143789 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143478 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:27:25.143789 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143482 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:27:25.144608 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143486 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:27:25.144608 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143490 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:27:25.144608 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143495 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:27:25.144608 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143501 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:27:25.144608 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143508 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:27:25.144608 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143512 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:27:25.144608 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143517 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:27:25.144608 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143521 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:27:25.144608 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143526 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:27:25.144608 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143531 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:25.144608 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143536 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:27:25.144608 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143540 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:27:25.144608 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143545 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:27:25.144608 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143550 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:27:25.144608 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143555 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:27:25.144608 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143559 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:27:25.144608 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143566 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:27:25.144608 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143572 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:27:25.144608 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143577 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:27:25.145280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143581 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:27:25.145280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143585 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:25.145280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143590 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:27:25.145280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143594 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:27:25.145280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143598 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:27:25.145280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143602 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:27:25.145280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143606 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:27:25.145280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143611 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:27:25.145280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143615 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:27:25.145280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143619 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:27:25.145280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143624 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:27:25.145280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143628 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:27:25.145280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143632 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:27:25.145280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143636 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:27:25.145280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143640 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:27:25.145280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143644 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:27:25.145280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143648 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:25.145280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143652 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:27:25.145280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143656 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:27:25.145280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143661 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:27:25.145824 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143665 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:27:25.145824 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143670 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:25.145824 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143674 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:27:25.145824 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143678 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:27:25.145824 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143683 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:27:25.145824 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143687 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:27:25.145824 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143692 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:27:25.145824 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143697 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:27:25.145824 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143702 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:27:25.145824 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143706 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:27:25.145824 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143710 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:27:25.145824 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143715 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:27:25.145824 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143719 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:25.145824 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143723 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:27:25.145824 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143728 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:27:25.145824 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143732 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:27:25.145824 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143736 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:27:25.145824 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143740 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:27:25.145824 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143744 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:27:25.145824 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143748 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:27:25.146652 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143753 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:27:25.146652 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143757 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:27:25.146652 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143762 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:27:25.146652 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143766 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:27:25.146652 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143770 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:27:25.146652 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143774 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:27:25.146652 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143778 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:27:25.146652 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143782 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:27:25.146652 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.143790 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:27:25.146652 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143946 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:25.146652 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143970 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:25.146652 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143976 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:27:25.146652 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143981 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:27:25.146652 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143986 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:25.146652 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143990 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:27:25.147280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143994 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:27:25.147280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.143998 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:27:25.147280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144004 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:27:25.147280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144008 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:27:25.147280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144014 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:25.147280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144018 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:27:25.147280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144022 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:27:25.147280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144026 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:27:25.147280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144030 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:27:25.147280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144034 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:27:25.147280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144038 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:27:25.147280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144042 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:27:25.147280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144046 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:27:25.147280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144050 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:27:25.147280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144054 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:27:25.147280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144058 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:27:25.147280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144062 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:27:25.147280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144067 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:27:25.147280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144071 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:27:25.147280 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144075 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:27:25.147845 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144079 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:27:25.147845 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144083 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:27:25.147845 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144087 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:27:25.147845 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144091 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:25.147845 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144095 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:27:25.147845 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144100 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:27:25.147845 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144104 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:27:25.147845 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144108 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:27:25.147845 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144112 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:25.147845 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144116 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:27:25.147845 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144121 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:27:25.147845 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144125 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:27:25.147845 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144129 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:27:25.147845 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144133 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:27:25.147845 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144137 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:27:25.147845 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144142 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:27:25.147845 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144146 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:27:25.147845 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144151 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:27:25.147845 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144155 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:27:25.147845 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144158 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:27:25.148408 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144162 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:27:25.148408 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144166 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:27:25.148408 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144169 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:27:25.148408 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144173 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:27:25.148408 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144177 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:27:25.148408 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144181 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:27:25.148408 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144185 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:27:25.148408 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144188 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:27:25.148408 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144192 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:27:25.148408 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144195 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:27:25.148408 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144199 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:27:25.148408 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144203 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:27:25.148408 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144207 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:27:25.148408 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144212 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:27:25.148408 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144216 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:25.148408 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144220 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:27:25.148408 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144224 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:27:25.148408 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144228 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:27:25.148408 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144232 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:27:25.148408 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144236 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:27:25.148940 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144240 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:27:25.148940 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144244 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:27:25.148940 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144248 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:27:25.148940 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144253 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:27:25.148940 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144257 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:27:25.148940 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144261 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:27:25.148940 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144268 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:27:25.148940 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144274 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:27:25.148940 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144278 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:27:25.148940 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144283 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:27:25.148940 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144289 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:27:25.148940 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144294 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:27:25.148940 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144298 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:27:25.148940 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144302 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:27:25.148940 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144306 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:27:25.148940 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144310 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:27:25.148940 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144314 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:27:25.148940 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144318 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:27:25.148940 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144322 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:27:25.148940 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:25.144327 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:25.149486 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.144335 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:27:25.149486 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.145054 2566 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 21:27:25.149486 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.147734 2566 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 21:27:25.149486 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.148700 2566 server.go:1019] "Starting client certificate rotation"
Apr 24 21:27:25.149486 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.148795 2566 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:27:25.149486 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.148835 2566 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:27:25.172051 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.172028 2566 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:27:25.175167 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.175137 2566 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:27:25.188628 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.188609 2566 log.go:25] "Validated CRI v1 runtime API"
Apr 24 21:27:25.194106 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.194090 2566 log.go:25] "Validated CRI v1 image API"
Apr 24 21:27:25.195369 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.195348 2566 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 21:27:25.202051 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.202025 2566 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:27:25.203228 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.203205 2566 fs.go:135] Filesystem UUIDs: map[4c4e684d-c449-4aec-9382-aa9f00539cc3:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 d1084e66-aa8a-4fe2-9977-0ef4925f4a49:/dev/nvme0n1p4]
Apr 24 21:27:25.203272 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.203228 2566 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 21:27:25.209028 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.208900 2566 manager.go:217] Machine: {Timestamp:2026-04-24 21:27:25.206859452 +0000 UTC m=+0.392147782 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3095894 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2b9b4be8cfcf630a4eb49a087412c9 SystemUUID:ec2b9b4b-e8cf-cf63-0a4e-b49a087412c9 BootID:59ac5830-34c0-4182-b668-99fd4c27be87 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:7c:f4:ae:71:37 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:7c:f4:ae:71:37 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:de:8b:36:20:16:47 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 21:27:25.209028 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.209022 2566 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 21:27:25.209147 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.209095 2566 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 21:27:25.210756 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.210732 2566 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 21:27:25.210884 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.210758 2566 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-131-55.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 21:27:25.210926 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.210893 2566 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 21:27:25.210926 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.210902 2566 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 21:27:25.210926 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.210915 2566 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:27:25.211622 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.211612 2566 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:27:25.212444 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.212435 2566 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:27:25.212548 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.212540 2566 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 21:27:25.215477 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.215467 2566 kubelet.go:491] "Attempting to sync node with API server" Apr 24 21:27:25.215512 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.215481 2566 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 21:27:25.215512 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.215497 2566 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 21:27:25.215512 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.215507 2566 kubelet.go:397] "Adding apiserver pod source" Apr 24 21:27:25.215621 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.215517 2566 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 24 21:27:25.216565 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.216550 2566 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:27:25.216565 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.216568 2566 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:27:25.219714 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.219691 2566 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 21:27:25.221100 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.221087 2566 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 21:27:25.224637 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.224613 2566 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dpfq6" Apr 24 21:27:25.225341 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.225323 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 21:27:25.225442 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.225432 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 21:27:25.225523 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.225515 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 21:27:25.225586 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.225579 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 21:27:25.225647 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.225640 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 21:27:25.225720 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.225713 2566 plugins.go:616] 
"Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 21:27:25.225781 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.225774 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 21:27:25.225838 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.225832 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 21:27:25.225903 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.225895 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 21:27:25.225981 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.225972 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 21:27:25.226066 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.226058 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 21:27:25.226154 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.226142 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 21:27:25.227404 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.227352 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 21:27:25.227468 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.227412 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 21:27:25.227747 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.227722 2566 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dpfq6" Apr 24 21:27:25.229351 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:25.229329 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 21:27:25.229351 ip-10-0-131-55 kubenswrapper[2566]: E0424 
21:27:25.229335 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-55.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 21:27:25.231667 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.231652 2566 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 21:27:25.231722 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.231696 2566 server.go:1295] "Started kubelet" Apr 24 21:27:25.231818 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.231777 2566 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 21:27:25.231879 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.231836 2566 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 21:27:25.231912 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.231903 2566 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 21:27:25.232485 ip-10-0-131-55 systemd[1]: Started Kubernetes Kubelet. 
Apr 24 21:27:25.233454 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.233430 2566 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 21:27:25.234201 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.234184 2566 server.go:317] "Adding debug handlers to kubelet server" Apr 24 21:27:25.237712 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.237695 2566 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 21:27:25.237712 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.237707 2566 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 21:27:25.238451 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.238326 2566 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 21:27:25.238451 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.238354 2566 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 21:27:25.238451 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:25.238446 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-55.ec2.internal\" not found" Apr 24 21:27:25.238603 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.238451 2566 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 21:27:25.238603 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.238495 2566 reconstruct.go:97] "Volume reconstruction finished" Apr 24 21:27:25.238603 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.238503 2566 reconciler.go:26] "Reconciler: start to sync state" Apr 24 21:27:25.239849 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.239745 2566 factory.go:55] Registering systemd factory Apr 24 21:27:25.239849 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.239770 2566 factory.go:223] Registration of the systemd container factory successfully Apr 24 21:27:25.240007 ip-10-0-131-55 kubenswrapper[2566]: I0424 
21:27:25.240000 2566 factory.go:153] Registering CRI-O factory Apr 24 21:27:25.240052 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.240012 2566 factory.go:223] Registration of the crio container factory successfully Apr 24 21:27:25.240094 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.240088 2566 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 21:27:25.240193 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.240156 2566 factory.go:103] Registering Raw factory Apr 24 21:27:25.240193 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.240173 2566 manager.go:1196] Started watching for new ooms in manager Apr 24 21:27:25.240574 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.240556 2566 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:25.241332 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.241166 2566 manager.go:319] Starting recovery of all containers Apr 24 21:27:25.249358 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.249339 2566 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-55.ec2.internal" not found Apr 24 21:27:25.249455 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:25.249392 2566 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-131-55.ec2.internal\" not found" node="ip-10-0-131-55.ec2.internal" Apr 24 21:27:25.249777 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.249762 2566 manager.go:324] Recovery completed Apr 24 21:27:25.256847 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.256833 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:25.259995 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.259982 2566 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-55.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:25.260056 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.260008 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:25.260056 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.260021 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-55.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:25.260505 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.260492 2566 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 21:27:25.260562 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.260504 2566 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 21:27:25.260562 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.260525 2566 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:27:25.262720 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.262708 2566 policy_none.go:49] "None policy: Start" Apr 24 21:27:25.262774 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.262724 2566 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 21:27:25.262774 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.262734 2566 state_mem.go:35] "Initializing new in-memory state store" Apr 24 21:27:25.267690 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.267671 2566 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-55.ec2.internal" not found Apr 24 21:27:25.297564 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.297550 2566 manager.go:341] "Starting Device Plugin manager" Apr 24 21:27:25.311559 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:25.297578 2566 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 21:27:25.311559 ip-10-0-131-55 kubenswrapper[2566]: I0424 
21:27:25.297588 2566 server.go:85] "Starting device plugin registration server" Apr 24 21:27:25.311559 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.297805 2566 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 21:27:25.311559 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.297819 2566 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 21:27:25.311559 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.297928 2566 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 21:27:25.311559 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.298020 2566 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 21:27:25.311559 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.298029 2566 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 21:27:25.311559 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:25.298434 2566 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 21:27:25.311559 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:25.298471 2566 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-55.ec2.internal\" not found" Apr 24 21:27:25.330175 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.330155 2566 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-55.ec2.internal" not found Apr 24 21:27:25.374191 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.374164 2566 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 21:27:25.375249 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.375231 2566 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 24 21:27:25.375315 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.375253 2566 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 21:27:25.375315 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.375269 2566 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 24 21:27:25.375315 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.375276 2566 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 21:27:25.375315 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:25.375304 2566 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 21:27:25.378568 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.378521 2566 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:25.398544 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.398520 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:25.399379 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.399362 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-55.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:25.399463 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.399392 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:25.399463 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.399402 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-55.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:25.399463 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.399431 2566 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-55.ec2.internal" Apr 24 21:27:25.413622 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.413606 2566 
kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-55.ec2.internal" Apr 24 21:27:25.413707 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:25.413629 2566 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-55.ec2.internal\": node \"ip-10-0-131-55.ec2.internal\" not found" Apr 24 21:27:25.436302 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:25.436282 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-55.ec2.internal\" not found" Apr 24 21:27:25.475380 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.475356 2566 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-131-55.ec2.internal"] Apr 24 21:27:25.475437 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.475420 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:25.476195 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.476178 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-55.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:25.476269 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.476209 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:25.476269 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.476221 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-55.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:25.478276 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.478264 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:25.478916 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.478902 2566 kubelet_node_status.go:736] "Recording event message 
for node" node="ip-10-0-131-55.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:25.478978 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.478927 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:25.478978 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.478936 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-55.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:25.480862 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.480848 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal" Apr 24 21:27:25.480906 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.480879 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:25.480942 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.480849 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-55.ec2.internal" Apr 24 21:27:25.481005 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.480952 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:25.481560 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.481540 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-55.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:25.481633 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.481570 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:25.481633 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.481581 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-55.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:25.481633 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.481540 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-55.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:25.481732 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.481642 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:25.481732 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.481674 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-55.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:25.508455 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:25.508438 2566 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-55.ec2.internal\" not found" node="ip-10-0-131-55.ec2.internal" Apr 24 21:27:25.512673 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:25.512658 2566 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info 
from the cluster" err="node \"ip-10-0-131-55.ec2.internal\" not found" node="ip-10-0-131-55.ec2.internal" Apr 24 21:27:25.536547 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:25.536529 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-55.ec2.internal\" not found" Apr 24 21:27:25.539831 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.539817 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3dfc4ac8bf9ace5a036e90036a3f5792-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal\" (UID: \"3dfc4ac8bf9ace5a036e90036a3f5792\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal" Apr 24 21:27:25.539884 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.539840 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3dfc4ac8bf9ace5a036e90036a3f5792-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal\" (UID: \"3dfc4ac8bf9ace5a036e90036a3f5792\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal" Apr 24 21:27:25.539884 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.539857 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ec74c72a1ae3da2b3b1eef59bb72e15d-config\") pod \"kube-apiserver-proxy-ip-10-0-131-55.ec2.internal\" (UID: \"ec74c72a1ae3da2b3b1eef59bb72e15d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-55.ec2.internal" Apr 24 21:27:25.637115 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:25.637033 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-55.ec2.internal\" not found" Apr 24 21:27:25.640321 ip-10-0-131-55 kubenswrapper[2566]: I0424 
21:27:25.640301 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3dfc4ac8bf9ace5a036e90036a3f5792-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal\" (UID: \"3dfc4ac8bf9ace5a036e90036a3f5792\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal" Apr 24 21:27:25.640378 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.640329 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3dfc4ac8bf9ace5a036e90036a3f5792-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal\" (UID: \"3dfc4ac8bf9ace5a036e90036a3f5792\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal" Apr 24 21:27:25.640378 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.640348 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ec74c72a1ae3da2b3b1eef59bb72e15d-config\") pod \"kube-apiserver-proxy-ip-10-0-131-55.ec2.internal\" (UID: \"ec74c72a1ae3da2b3b1eef59bb72e15d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-55.ec2.internal" Apr 24 21:27:25.640452 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.640408 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3dfc4ac8bf9ace5a036e90036a3f5792-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal\" (UID: \"3dfc4ac8bf9ace5a036e90036a3f5792\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal" Apr 24 21:27:25.640452 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.640417 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3dfc4ac8bf9ace5a036e90036a3f5792-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal\" (UID: \"3dfc4ac8bf9ace5a036e90036a3f5792\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal" Apr 24 21:27:25.640522 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.640409 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ec74c72a1ae3da2b3b1eef59bb72e15d-config\") pod \"kube-apiserver-proxy-ip-10-0-131-55.ec2.internal\" (UID: \"ec74c72a1ae3da2b3b1eef59bb72e15d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-55.ec2.internal" Apr 24 21:27:25.737819 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:25.737786 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-55.ec2.internal\" not found" Apr 24 21:27:25.810204 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.810168 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-55.ec2.internal" Apr 24 21:27:25.814625 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:25.814605 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal"
Apr 24 21:27:25.838347 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:25.838328 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-55.ec2.internal\" not found"
Apr 24 21:27:25.938880 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:25.938787 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-55.ec2.internal\" not found"
Apr 24 21:27:26.039181 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:26.039138 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-55.ec2.internal\" not found"
Apr 24 21:27:26.104058 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:26.104027 2566 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:27:26.139280 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:26.139254 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-55.ec2.internal\" not found"
Apr 24 21:27:26.148481 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:26.148467 2566 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 21:27:26.148611 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:26.148582 2566 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 21:27:26.148693 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:26.148625 2566 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 21:27:26.148693 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:26.148622 2566 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 21:27:26.230095 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:26.230060 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 21:22:25 +0000 UTC" deadline="2027-10-23 01:28:59.756830465 +0000 UTC"
Apr 24 21:27:26.230095 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:26.230093 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13108h1m33.526740886s"
Apr 24 21:27:26.238233 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:26.238212 2566 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 21:27:26.239375 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:26.239358 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-55.ec2.internal\" not found"
Apr 24 21:27:26.255744 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:26.255724 2566 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:27:26.285603 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:26.285578 2566 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-8vml6"
Apr 24 21:27:26.295186 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:26.295165 2566 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-8vml6"
Apr 24 21:27:26.340175 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:26.340150 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-55.ec2.internal\" not found"
Apr 24 21:27:26.345613 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:26.345575 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dfc4ac8bf9ace5a036e90036a3f5792.slice/crio-6ce8546fea4dc893bb562fd3492f70c6bf22b0044c46ece7cb69dd47b7bdd22c WatchSource:0}: Error finding container 6ce8546fea4dc893bb562fd3492f70c6bf22b0044c46ece7cb69dd47b7bdd22c: Status 404 returned error can't find the container with id 6ce8546fea4dc893bb562fd3492f70c6bf22b0044c46ece7cb69dd47b7bdd22c
Apr 24 21:27:26.345795 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:26.345780 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec74c72a1ae3da2b3b1eef59bb72e15d.slice/crio-04aa55332a7d2c3f1c909f28af655874c475ac7bd7079c1de0fecaaa3451a0ff WatchSource:0}: Error finding container 04aa55332a7d2c3f1c909f28af655874c475ac7bd7079c1de0fecaaa3451a0ff: Status 404 returned error can't find the container with id 04aa55332a7d2c3f1c909f28af655874c475ac7bd7079c1de0fecaaa3451a0ff
Apr 24 21:27:26.349936 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:26.349920 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:27:26.377812 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:26.377772 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal" event={"ID":"3dfc4ac8bf9ace5a036e90036a3f5792","Type":"ContainerStarted","Data":"6ce8546fea4dc893bb562fd3492f70c6bf22b0044c46ece7cb69dd47b7bdd22c"}
Apr 24 21:27:26.378664 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:26.378641 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-55.ec2.internal" event={"ID":"ec74c72a1ae3da2b3b1eef59bb72e15d","Type":"ContainerStarted","Data":"04aa55332a7d2c3f1c909f28af655874c475ac7bd7079c1de0fecaaa3451a0ff"}
Apr 24 21:27:26.440995 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:26.440966 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-55.ec2.internal\" not found"
Apr 24 21:27:26.541505 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:26.541435 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-55.ec2.internal\" not found"
Apr 24 21:27:26.641971 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:26.641914 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-55.ec2.internal\" not found"
Apr 24 21:27:26.670203 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:26.670170 2566 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:27:26.739088 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:26.739054 2566 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal"
Apr 24 21:27:26.749458 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:26.749434 2566 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 21:27:26.751715 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:26.751681 2566 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-55.ec2.internal"
Apr 24 21:27:26.759981 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:26.759943 2566 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 21:27:27.216227 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.216196 2566 apiserver.go:52] "Watching apiserver"
Apr 24 21:27:27.222688 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.222655 2566 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 24 21:27:27.223062 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.223030 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-kwjx6","openshift-ovn-kubernetes/ovnkube-node-7kvjv","kube-system/kube-apiserver-proxy-ip-10-0-131-55.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hv45x","openshift-cluster-node-tuning-operator/tuned-6nn9n","openshift-image-registry/node-ca-7pgjs","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal","openshift-network-operator/iptables-alerter-cqnxb","kube-system/konnectivity-agent-vv85h","openshift-dns/node-resolver-9lr5f","openshift-multus/multus-additional-cni-plugins-64swc","openshift-multus/multus-sslzj","openshift-multus/network-metrics-daemon-5b7z5"]
Apr 24 21:27:27.225614 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.225594 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9lr5f"
Apr 24 21:27:27.227916 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.227605 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 24 21:27:27.227916 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.227747 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 24 21:27:27.227916 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.227841 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-wkwzl\""
Apr 24 21:27:27.230131 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.230113 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.230234 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.230120 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hv45x"
Apr 24 21:27:27.232019 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.231998 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 24 21:27:27.232140 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.232122 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 24 21:27:27.232283 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.232206 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-5jvrh\""
Apr 24 21:27:27.232393 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.232356 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 24 21:27:27.232393 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.232371 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 24 21:27:27.232558 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.232410 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 24 21:27:27.232558 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.232437 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 24 21:27:27.232702 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.232653 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6nn9n"
Apr 24 21:27:27.232702 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.232713 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-m4n82\""
Apr 24 21:27:27.232828 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.232783 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 24 21:27:27.232900 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.232865 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 24 21:27:27.232972 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.232922 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 24 21:27:27.234519 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.234501 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 24 21:27:27.234602 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.234539 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:27:27.234602 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.234554 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-v5lqg\""
Apr 24 21:27:27.234916 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.234899 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7pgjs"
Apr 24 21:27:27.236304 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.236284 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 24 21:27:27.236442 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.236416 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-zq2fn\""
Apr 24 21:27:27.236613 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.236595 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 24 21:27:27.236690 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.236676 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 24 21:27:27.237169 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.237153 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-cqnxb"
Apr 24 21:27:27.238721 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.238698 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:27:27.238835 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.238738 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-2ld2q\""
Apr 24 21:27:27.239029 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.238951 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 24 21:27:27.239113 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.239021 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 24 21:27:27.239524 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.239504 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vv85h"
Apr 24 21:27:27.241060 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.241048 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-m5pdd\""
Apr 24 21:27:27.241264 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.241128 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 24 21:27:27.241350 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.241325 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 24 21:27:27.241813 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.241798 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kwjx6"
Apr 24 21:27:27.241895 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:27.241870 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kwjx6" podUID="7799825c-48bb-465a-adc5-2d2c43a525df"
Apr 24 21:27:27.244263 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.244244 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-64swc"
Apr 24 21:27:27.246189 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.246171 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 24 21:27:27.246289 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.246189 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-wtq92\""
Apr 24 21:27:27.246289 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.246281 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 24 21:27:27.246398 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.246301 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 24 21:27:27.246398 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.246191 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 24 21:27:27.246491 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.246406 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 24 21:27:27.246677 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.246648 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-sslzj"
Apr 24 21:27:27.247050 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.247031 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/34563d27-4e33-4e2d-bf32-c118f5855139-run\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n"
Apr 24 21:27:27.247138 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.247060 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34563d27-4e33-4e2d-bf32-c118f5855139-host\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n"
Apr 24 21:27:27.247138 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.247077 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-systemd-units\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.247138 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.247096 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.247138 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.247122 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cj5j\" (UniqueName: \"kubernetes.io/projected/e5028c4f-ef6b-4051-a2c3-1def0a14889f-kube-api-access-6cj5j\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.247325 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.247166 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnrf2\" (UniqueName: \"kubernetes.io/projected/5be049ef-a2de-4653-8547-eaa092ea4f87-kube-api-access-bnrf2\") pod \"aws-ebs-csi-driver-node-hv45x\" (UID: \"5be049ef-a2de-4653-8547-eaa092ea4f87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hv45x"
Apr 24 21:27:27.247325 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.247215 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpvgt\" (UniqueName: \"kubernetes.io/projected/34563d27-4e33-4e2d-bf32-c118f5855139-kube-api-access-wpvgt\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n"
Apr 24 21:27:27.247325 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.247245 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8btr\" (UniqueName: \"kubernetes.io/projected/6fcca08a-b4ba-4f45-862a-1e503776cfe8-kube-api-access-c8btr\") pod \"iptables-alerter-cqnxb\" (UID: \"6fcca08a-b4ba-4f45-862a-1e503776cfe8\") " pod="openshift-network-operator/iptables-alerter-cqnxb"
Apr 24 21:27:27.247325 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.247271 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/719c9974-f956-4125-bc24-da51ad2c4d61-agent-certs\") pod \"konnectivity-agent-vv85h\" (UID: \"719c9974-f956-4125-bc24-da51ad2c4d61\") " pod="kube-system/konnectivity-agent-vv85h"
Apr 24 21:27:27.247325 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.247295 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-host-slash\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.247325 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.247319 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-host-cni-bin\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.247325 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.247342 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5be049ef-a2de-4653-8547-eaa092ea4f87-sys-fs\") pod \"aws-ebs-csi-driver-node-hv45x\" (UID: \"5be049ef-a2de-4653-8547-eaa092ea4f87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hv45x"
Apr 24 21:27:27.247736 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.247390 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/34563d27-4e33-4e2d-bf32-c118f5855139-etc-modprobe-d\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n"
Apr 24 21:27:27.247736 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.247477 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/34563d27-4e33-4e2d-bf32-c118f5855139-etc-sysconfig\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n"
Apr 24 21:27:27.247736 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.247497 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/34563d27-4e33-4e2d-bf32-c118f5855139-etc-sysctl-conf\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n"
Apr 24 21:27:27.247736 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.247525 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5be049ef-a2de-4653-8547-eaa092ea4f87-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hv45x\" (UID: \"5be049ef-a2de-4653-8547-eaa092ea4f87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hv45x"
Apr 24 21:27:27.247736 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.247548 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/34563d27-4e33-4e2d-bf32-c118f5855139-lib-modules\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n"
Apr 24 21:27:27.248079 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.248061 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/34563d27-4e33-4e2d-bf32-c118f5855139-var-lib-kubelet\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n"
Apr 24 21:27:27.248139 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.248101 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-log-socket\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.248139 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.248125 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/30105400-c97c-4cc8-ac91-9f6cfe32780b-hosts-file\") pod \"node-resolver-9lr5f\" (UID: \"30105400-c97c-4cc8-ac91-9f6cfe32780b\") " pod="openshift-dns/node-resolver-9lr5f"
Apr 24 21:27:27.248261 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.248150 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/30105400-c97c-4cc8-ac91-9f6cfe32780b-tmp-dir\") pod \"node-resolver-9lr5f\" (UID: \"30105400-c97c-4cc8-ac91-9f6cfe32780b\") " pod="openshift-dns/node-resolver-9lr5f"
Apr 24 21:27:27.248261 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.248171 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6fcca08a-b4ba-4f45-862a-1e503776cfe8-host-slash\") pod \"iptables-alerter-cqnxb\" (UID: \"6fcca08a-b4ba-4f45-862a-1e503776cfe8\") " pod="openshift-network-operator/iptables-alerter-cqnxb"
Apr 24 21:27:27.248261 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.248189 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-host-kubelet\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.248261 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.248218 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5be049ef-a2de-4653-8547-eaa092ea4f87-registration-dir\") pod \"aws-ebs-csi-driver-node-hv45x\" (UID: \"5be049ef-a2de-4653-8547-eaa092ea4f87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hv45x"
Apr 24 21:27:27.248261 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.248248 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5be049ef-a2de-4653-8547-eaa092ea4f87-etc-selinux\") pod \"aws-ebs-csi-driver-node-hv45x\" (UID: \"5be049ef-a2de-4653-8547-eaa092ea4f87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hv45x"
Apr 24 21:27:27.248489 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.248281 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d8e7b20-2410-4675-a443-408c37cdef11-host\") pod \"node-ca-7pgjs\" (UID: \"6d8e7b20-2410-4675-a443-408c37cdef11\") " pod="openshift-image-registry/node-ca-7pgjs"
Apr 24 21:27:27.248489 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.248309 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-var-lib-openvswitch\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.248489 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.248341 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-host-run-ovn-kubernetes\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.248489 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.248369 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-host-cni-netd\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.248836 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.248655 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e5028c4f-ef6b-4051-a2c3-1def0a14889f-env-overrides\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.248907 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.248862 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5be049ef-a2de-4653-8547-eaa092ea4f87-socket-dir\") pod \"aws-ebs-csi-driver-node-hv45x\" (UID: \"5be049ef-a2de-4653-8547-eaa092ea4f87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hv45x"
Apr 24 21:27:27.248907 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.248887 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/34563d27-4e33-4e2d-bf32-c118f5855139-etc-sysctl-d\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n"
Apr 24 21:27:27.249020 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.248916 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/34563d27-4e33-4e2d-bf32-c118f5855139-sys\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n"
Apr 24 21:27:27.249020 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.248946 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlmc9\" (UniqueName: \"kubernetes.io/projected/6d8e7b20-2410-4675-a443-408c37cdef11-kube-api-access-tlmc9\") pod \"node-ca-7pgjs\" (UID: \"6d8e7b20-2410-4675-a443-408c37cdef11\") " pod="openshift-image-registry/node-ca-7pgjs"
Apr 24 21:27:27.249157 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.249140 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-run-openvswitch\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.249216 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.249193 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e5028c4f-ef6b-4051-a2c3-1def0a14889f-ovn-node-metrics-cert\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.249332 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.249314 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e5028c4f-ef6b-4051-a2c3-1def0a14889f-ovnkube-script-lib\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.249424 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.249403 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/34563d27-4e33-4e2d-bf32-c118f5855139-etc-tuned\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n"
Apr 24 21:27:27.249487 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.249439 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/34563d27-4e33-4e2d-bf32-c118f5855139-tmp\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n"
Apr 24 21:27:27.249487 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.249473 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6d8e7b20-2410-4675-a443-408c37cdef11-serviceca\") pod \"node-ca-7pgjs\" (UID: \"6d8e7b20-2410-4675-a443-408c37cdef11\") " pod="openshift-image-registry/node-ca-7pgjs"
Apr 24 21:27:27.249688 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.249667 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qtwk\" (UniqueName: \"kubernetes.io/projected/30105400-c97c-4cc8-ac91-9f6cfe32780b-kube-api-access-6qtwk\") pod \"node-resolver-9lr5f\" (UID: \"30105400-c97c-4cc8-ac91-9f6cfe32780b\") " pod="openshift-dns/node-resolver-9lr5f"
Apr 24 21:27:27.249772 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.249722 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6fcca08a-b4ba-4f45-862a-1e503776cfe8-iptables-alerter-script\") pod \"iptables-alerter-cqnxb\" (UID: \"6fcca08a-b4ba-4f45-862a-1e503776cfe8\") " pod="openshift-network-operator/iptables-alerter-cqnxb"
Apr 24 21:27:27.249772 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.249759 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-host-run-netns\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.249911 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.249892 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-etc-openvswitch\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.249990 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.249927 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e5028c4f-ef6b-4051-a2c3-1def0a14889f-ovnkube-config\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.249990 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.249985 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/34563d27-4e33-4e2d-bf32-c118f5855139-etc-systemd\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n"
Apr 24 21:27:27.250107 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.250013 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/719c9974-f956-4125-bc24-da51ad2c4d61-konnectivity-ca\") pod \"konnectivity-agent-vv85h\" (UID: \"719c9974-f956-4125-bc24-da51ad2c4d61\") " pod="kube-system/konnectivity-agent-vv85h"
Apr 24 21:27:27.250107 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.250032 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5b7z5"
Apr 24 21:27:27.250107 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.250042 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-run-systemd\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.250107 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.250081 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-run-ovn\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.250326 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.250109 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-node-log\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.250326 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:27.250111
2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5b7z5" podUID="9f38edd3-fb52-42bc-b164-d84e78cffcc0" Apr 24 21:27:27.250326 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.250144 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5be049ef-a2de-4653-8547-eaa092ea4f87-device-dir\") pod \"aws-ebs-csi-driver-node-hv45x\" (UID: \"5be049ef-a2de-4653-8547-eaa092ea4f87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hv45x" Apr 24 21:27:27.250326 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.250175 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/34563d27-4e33-4e2d-bf32-c118f5855139-etc-kubernetes\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n" Apr 24 21:27:27.250551 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.250468 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 21:27:27.250551 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.250527 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-57sgt\"" Apr 24 21:27:27.296238 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.296210 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:22:26 +0000 UTC" deadline="2027-10-13 08:07:21.653843625 +0000 UTC" Apr 24 21:27:27.296238 
ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.296235 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12874h39m54.357610702s" Apr 24 21:27:27.328888 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.328860 2566 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:27.339535 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.339506 2566 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 21:27:27.350719 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.350691 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/34563d27-4e33-4e2d-bf32-c118f5855139-etc-systemd\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n" Apr 24 21:27:27.350818 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.350714 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/34563d27-4e33-4e2d-bf32-c118f5855139-etc-systemd\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n" Apr 24 21:27:27.350818 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.350737 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-etc-kubernetes\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj" Apr 24 21:27:27.350818 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.350775 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlkv6\" (UniqueName: 
\"kubernetes.io/projected/abed56c4-528e-496e-b85c-a6fe11c4f6e3-kube-api-access-nlkv6\") pod \"multus-additional-cni-plugins-64swc\" (UID: \"abed56c4-528e-496e-b85c-a6fe11c4f6e3\") " pod="openshift-multus/multus-additional-cni-plugins-64swc" Apr 24 21:27:27.350818 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.350794 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-run-systemd\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv" Apr 24 21:27:27.350818 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.350810 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-node-log\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv" Apr 24 21:27:27.351065 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.350859 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-run-systemd\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv" Apr 24 21:27:27.351065 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.350867 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/34563d27-4e33-4e2d-bf32-c118f5855139-etc-kubernetes\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n" Apr 24 21:27:27.351065 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.350874 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-node-log\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv" Apr 24 21:27:27.351065 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.350906 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wpvgt\" (UniqueName: \"kubernetes.io/projected/34563d27-4e33-4e2d-bf32-c118f5855139-kube-api-access-wpvgt\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n" Apr 24 21:27:27.351065 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.350941 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/34563d27-4e33-4e2d-bf32-c118f5855139-etc-kubernetes\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n" Apr 24 21:27:27.351065 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.350946 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/abed56c4-528e-496e-b85c-a6fe11c4f6e3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-64swc\" (UID: \"abed56c4-528e-496e-b85c-a6fe11c4f6e3\") " pod="openshift-multus/multus-additional-cni-plugins-64swc" Apr 24 21:27:27.351065 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.351014 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-279xl\" (UniqueName: \"kubernetes.io/projected/9f38edd3-fb52-42bc-b164-d84e78cffcc0-kube-api-access-279xl\") pod \"network-metrics-daemon-5b7z5\" (UID: \"9f38edd3-fb52-42bc-b164-d84e78cffcc0\") " pod="openshift-multus/network-metrics-daemon-5b7z5" Apr 24 21:27:27.351421 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.351134 2566 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/abed56c4-528e-496e-b85c-a6fe11c4f6e3-cnibin\") pod \"multus-additional-cni-plugins-64swc\" (UID: \"abed56c4-528e-496e-b85c-a6fe11c4f6e3\") " pod="openshift-multus/multus-additional-cni-plugins-64swc" Apr 24 21:27:27.351421 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.351211 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-host-slash\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv" Apr 24 21:27:27.351421 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.351264 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/34563d27-4e33-4e2d-bf32-c118f5855139-etc-modprobe-d\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n" Apr 24 21:27:27.351421 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.351288 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/34563d27-4e33-4e2d-bf32-c118f5855139-etc-sysconfig\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n" Apr 24 21:27:27.351421 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.351291 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-host-slash\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv" Apr 24 21:27:27.351421 ip-10-0-131-55 kubenswrapper[2566]: I0424 
21:27:27.351310 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/34563d27-4e33-4e2d-bf32-c118f5855139-etc-sysctl-conf\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n" Apr 24 21:27:27.351421 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.351342 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/34563d27-4e33-4e2d-bf32-c118f5855139-etc-sysconfig\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n" Apr 24 21:27:27.351421 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.351337 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/34563d27-4e33-4e2d-bf32-c118f5855139-lib-modules\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n" Apr 24 21:27:27.351421 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.351379 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/30105400-c97c-4cc8-ac91-9f6cfe32780b-hosts-file\") pod \"node-resolver-9lr5f\" (UID: \"30105400-c97c-4cc8-ac91-9f6cfe32780b\") " pod="openshift-dns/node-resolver-9lr5f" Apr 24 21:27:27.351421 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.351379 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/34563d27-4e33-4e2d-bf32-c118f5855139-etc-modprobe-d\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n" Apr 24 21:27:27.351421 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.351409 2566 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6fcca08a-b4ba-4f45-862a-1e503776cfe8-host-slash\") pod \"iptables-alerter-cqnxb\" (UID: \"6fcca08a-b4ba-4f45-862a-1e503776cfe8\") " pod="openshift-network-operator/iptables-alerter-cqnxb" Apr 24 21:27:27.351906 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.351433 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5be049ef-a2de-4653-8547-eaa092ea4f87-registration-dir\") pod \"aws-ebs-csi-driver-node-hv45x\" (UID: \"5be049ef-a2de-4653-8547-eaa092ea4f87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hv45x" Apr 24 21:27:27.351906 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.351442 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/34563d27-4e33-4e2d-bf32-c118f5855139-etc-sysctl-conf\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n" Apr 24 21:27:27.351906 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.351446 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/30105400-c97c-4cc8-ac91-9f6cfe32780b-hosts-file\") pod \"node-resolver-9lr5f\" (UID: \"30105400-c97c-4cc8-ac91-9f6cfe32780b\") " pod="openshift-dns/node-resolver-9lr5f" Apr 24 21:27:27.351906 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.351459 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5be049ef-a2de-4653-8547-eaa092ea4f87-etc-selinux\") pod \"aws-ebs-csi-driver-node-hv45x\" (UID: \"5be049ef-a2de-4653-8547-eaa092ea4f87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hv45x" Apr 24 21:27:27.351906 ip-10-0-131-55 
kubenswrapper[2566]: I0424 21:27:27.351483 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d8e7b20-2410-4675-a443-408c37cdef11-host\") pod \"node-ca-7pgjs\" (UID: \"6d8e7b20-2410-4675-a443-408c37cdef11\") " pod="openshift-image-registry/node-ca-7pgjs" Apr 24 21:27:27.351906 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.351497 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5be049ef-a2de-4653-8547-eaa092ea4f87-registration-dir\") pod \"aws-ebs-csi-driver-node-hv45x\" (UID: \"5be049ef-a2de-4653-8547-eaa092ea4f87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hv45x" Apr 24 21:27:27.351906 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.351499 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/34563d27-4e33-4e2d-bf32-c118f5855139-lib-modules\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n" Apr 24 21:27:27.351906 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.351509 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5gvl\" (UniqueName: \"kubernetes.io/projected/c4524175-6c2f-4026-ac93-751748e5a1c4-kube-api-access-b5gvl\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj" Apr 24 21:27:27.351906 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.351530 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6fcca08a-b4ba-4f45-862a-1e503776cfe8-host-slash\") pod \"iptables-alerter-cqnxb\" (UID: \"6fcca08a-b4ba-4f45-862a-1e503776cfe8\") " pod="openshift-network-operator/iptables-alerter-cqnxb" Apr 24 21:27:27.351906 ip-10-0-131-55 
kubenswrapper[2566]: I0424 21:27:27.351550 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/abed56c4-528e-496e-b85c-a6fe11c4f6e3-cni-binary-copy\") pod \"multus-additional-cni-plugins-64swc\" (UID: \"abed56c4-528e-496e-b85c-a6fe11c4f6e3\") " pod="openshift-multus/multus-additional-cni-plugins-64swc" Apr 24 21:27:27.351906 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.351581 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-var-lib-openvswitch\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv" Apr 24 21:27:27.351906 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.351606 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-host-cni-netd\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv" Apr 24 21:27:27.351906 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.351582 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5be049ef-a2de-4653-8547-eaa092ea4f87-etc-selinux\") pod \"aws-ebs-csi-driver-node-hv45x\" (UID: \"5be049ef-a2de-4653-8547-eaa092ea4f87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hv45x" Apr 24 21:27:27.351906 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.351631 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tlmc9\" (UniqueName: \"kubernetes.io/projected/6d8e7b20-2410-4675-a443-408c37cdef11-kube-api-access-tlmc9\") pod \"node-ca-7pgjs\" (UID: \"6d8e7b20-2410-4675-a443-408c37cdef11\") " 
pod="openshift-image-registry/node-ca-7pgjs" Apr 24 21:27:27.351906 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.351557 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d8e7b20-2410-4675-a443-408c37cdef11-host\") pod \"node-ca-7pgjs\" (UID: \"6d8e7b20-2410-4675-a443-408c37cdef11\") " pod="openshift-image-registry/node-ca-7pgjs" Apr 24 21:27:27.351906 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.351648 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-var-lib-openvswitch\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv" Apr 24 21:27:27.351906 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.351700 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-host-cni-netd\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv" Apr 24 21:27:27.352654 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.351718 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-system-cni-dir\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj" Apr 24 21:27:27.352654 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.351758 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-host-var-lib-cni-multus\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " 
pod="openshift-multus/multus-sslzj" Apr 24 21:27:27.352654 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.351794 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-hostroot\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj" Apr 24 21:27:27.352654 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.351833 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/abed56c4-528e-496e-b85c-a6fe11c4f6e3-os-release\") pod \"multus-additional-cni-plugins-64swc\" (UID: \"abed56c4-528e-496e-b85c-a6fe11c4f6e3\") " pod="openshift-multus/multus-additional-cni-plugins-64swc" Apr 24 21:27:27.352654 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.351867 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e5028c4f-ef6b-4051-a2c3-1def0a14889f-ovn-node-metrics-cert\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv" Apr 24 21:27:27.352654 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.351894 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e5028c4f-ef6b-4051-a2c3-1def0a14889f-ovnkube-script-lib\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv" Apr 24 21:27:27.352654 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.351918 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/34563d27-4e33-4e2d-bf32-c118f5855139-etc-tuned\") pod \"tuned-6nn9n\" (UID: 
\"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n" Apr 24 21:27:27.352654 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.351947 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-cnibin\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj" Apr 24 21:27:27.352654 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.351990 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-multus-conf-dir\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj" Apr 24 21:27:27.352654 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352019 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-host-run-multus-certs\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj" Apr 24 21:27:27.352654 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352044 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f38edd3-fb52-42bc-b164-d84e78cffcc0-metrics-certs\") pod \"network-metrics-daemon-5b7z5\" (UID: \"9f38edd3-fb52-42bc-b164-d84e78cffcc0\") " pod="openshift-multus/network-metrics-daemon-5b7z5" Apr 24 21:27:27.352654 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352072 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6qtwk\" (UniqueName: 
\"kubernetes.io/projected/30105400-c97c-4cc8-ac91-9f6cfe32780b-kube-api-access-6qtwk\") pod \"node-resolver-9lr5f\" (UID: \"30105400-c97c-4cc8-ac91-9f6cfe32780b\") " pod="openshift-dns/node-resolver-9lr5f" Apr 24 21:27:27.352654 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352111 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e5028c4f-ef6b-4051-a2c3-1def0a14889f-ovnkube-config\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv" Apr 24 21:27:27.352654 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352141 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/34563d27-4e33-4e2d-bf32-c118f5855139-run\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n" Apr 24 21:27:27.352654 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352167 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-multus-cni-dir\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj" Apr 24 21:27:27.352654 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352193 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-host-run-netns\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj" Apr 24 21:27:27.352654 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352217 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/abed56c4-528e-496e-b85c-a6fe11c4f6e3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-64swc\" (UID: \"abed56c4-528e-496e-b85c-a6fe11c4f6e3\") " pod="openshift-multus/multus-additional-cni-plugins-64swc"
Apr 24 21:27:27.353449 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352200 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/34563d27-4e33-4e2d-bf32-c118f5855139-run\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n"
Apr 24 21:27:27.353449 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352251 2566 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 24 21:27:27.353449 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352267 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/719c9974-f956-4125-bc24-da51ad2c4d61-konnectivity-ca\") pod \"konnectivity-agent-vv85h\" (UID: \"719c9974-f956-4125-bc24-da51ad2c4d61\") " pod="kube-system/konnectivity-agent-vv85h"
Apr 24 21:27:27.353449 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352300 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-run-ovn\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.353449 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352335 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5be049ef-a2de-4653-8547-eaa092ea4f87-device-dir\") pod \"aws-ebs-csi-driver-node-hv45x\" (UID: \"5be049ef-a2de-4653-8547-eaa092ea4f87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hv45x"
Apr 24 21:27:27.353449 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352359 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34563d27-4e33-4e2d-bf32-c118f5855139-host\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n"
Apr 24 21:27:27.353449 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352388 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/abed56c4-528e-496e-b85c-a6fe11c4f6e3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-64swc\" (UID: \"abed56c4-528e-496e-b85c-a6fe11c4f6e3\") " pod="openshift-multus/multus-additional-cni-plugins-64swc"
Apr 24 21:27:27.353449 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352416 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-systemd-units\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.353449 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352443 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.353449 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352470 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6cj5j\" (UniqueName: \"kubernetes.io/projected/e5028c4f-ef6b-4051-a2c3-1def0a14889f-kube-api-access-6cj5j\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.353449 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352495 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bnrf2\" (UniqueName: \"kubernetes.io/projected/5be049ef-a2de-4653-8547-eaa092ea4f87-kube-api-access-bnrf2\") pod \"aws-ebs-csi-driver-node-hv45x\" (UID: \"5be049ef-a2de-4653-8547-eaa092ea4f87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hv45x"
Apr 24 21:27:27.353449 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352550 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-multus-socket-dir-parent\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj"
Apr 24 21:27:27.353449 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352590 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-host-run-k8s-cni-cncf-io\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj"
Apr 24 21:27:27.353449 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352614 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjkbt\" (UniqueName: \"kubernetes.io/projected/7799825c-48bb-465a-adc5-2d2c43a525df-kube-api-access-xjkbt\") pod \"network-check-target-kwjx6\" (UID: \"7799825c-48bb-465a-adc5-2d2c43a525df\") " pod="openshift-network-diagnostics/network-check-target-kwjx6"
Apr 24 21:27:27.353449 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352658 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8btr\" (UniqueName: \"kubernetes.io/projected/6fcca08a-b4ba-4f45-862a-1e503776cfe8-kube-api-access-c8btr\") pod \"iptables-alerter-cqnxb\" (UID: \"6fcca08a-b4ba-4f45-862a-1e503776cfe8\") " pod="openshift-network-operator/iptables-alerter-cqnxb"
Apr 24 21:27:27.353449 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352701 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/719c9974-f956-4125-bc24-da51ad2c4d61-agent-certs\") pod \"konnectivity-agent-vv85h\" (UID: \"719c9974-f956-4125-bc24-da51ad2c4d61\") " pod="kube-system/konnectivity-agent-vv85h"
Apr 24 21:27:27.353449 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352723 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-host-cni-bin\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.354264 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352726 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-systemd-units\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.354264 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352746 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5be049ef-a2de-4653-8547-eaa092ea4f87-sys-fs\") pod \"aws-ebs-csi-driver-node-hv45x\" (UID: \"5be049ef-a2de-4653-8547-eaa092ea4f87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hv45x"
Apr 24 21:27:27.354264 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352770 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/34563d27-4e33-4e2d-bf32-c118f5855139-var-lib-kubelet\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n"
Apr 24 21:27:27.354264 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352774 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.354264 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352792 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/abed56c4-528e-496e-b85c-a6fe11c4f6e3-system-cni-dir\") pod \"multus-additional-cni-plugins-64swc\" (UID: \"abed56c4-528e-496e-b85c-a6fe11c4f6e3\") " pod="openshift-multus/multus-additional-cni-plugins-64swc"
Apr 24 21:27:27.354264 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352810 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/719c9974-f956-4125-bc24-da51ad2c4d61-konnectivity-ca\") pod \"konnectivity-agent-vv85h\" (UID: \"719c9974-f956-4125-bc24-da51ad2c4d61\") " pod="kube-system/konnectivity-agent-vv85h"
Apr 24 21:27:27.354264 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352821 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e5028c4f-ef6b-4051-a2c3-1def0a14889f-ovnkube-config\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.354264 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352862 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5be049ef-a2de-4653-8547-eaa092ea4f87-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hv45x\" (UID: \"5be049ef-a2de-4653-8547-eaa092ea4f87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hv45x"
Apr 24 21:27:27.354264 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352816 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5be049ef-a2de-4653-8547-eaa092ea4f87-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hv45x\" (UID: \"5be049ef-a2de-4653-8547-eaa092ea4f87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hv45x"
Apr 24 21:27:27.354264 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352920 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34563d27-4e33-4e2d-bf32-c118f5855139-host\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n"
Apr 24 21:27:27.354264 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352919 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-os-release\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj"
Apr 24 21:27:27.354264 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352945 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-log-socket\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.354264 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352983 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-run-ovn\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.354264 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.352998 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/30105400-c97c-4cc8-ac91-9f6cfe32780b-tmp-dir\") pod \"node-resolver-9lr5f\" (UID: \"30105400-c97c-4cc8-ac91-9f6cfe32780b\") " pod="openshift-dns/node-resolver-9lr5f"
Apr 24 21:27:27.354264 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.353012 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-log-socket\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.354264 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.353041 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/34563d27-4e33-4e2d-bf32-c118f5855139-var-lib-kubelet\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n"
Apr 24 21:27:27.354264 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.353043 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-host-kubelet\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.355073 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.353072 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-host-kubelet\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.355073 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.353081 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-host-var-lib-kubelet\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj"
Apr 24 21:27:27.355073 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.353091 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5be049ef-a2de-4653-8547-eaa092ea4f87-sys-fs\") pod \"aws-ebs-csi-driver-node-hv45x\" (UID: \"5be049ef-a2de-4653-8547-eaa092ea4f87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hv45x"
Apr 24 21:27:27.355073 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.353108 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-host-run-ovn-kubernetes\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.355073 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.353133 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e5028c4f-ef6b-4051-a2c3-1def0a14889f-env-overrides\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.355073 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.353134 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-host-cni-bin\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.355073 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.353173 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-host-run-ovn-kubernetes\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.355073 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.353198 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5be049ef-a2de-4653-8547-eaa092ea4f87-socket-dir\") pod \"aws-ebs-csi-driver-node-hv45x\" (UID: \"5be049ef-a2de-4653-8547-eaa092ea4f87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hv45x"
Apr 24 21:27:27.355073 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.353222 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/34563d27-4e33-4e2d-bf32-c118f5855139-etc-sysctl-d\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n"
Apr 24 21:27:27.355073 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.353246 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/34563d27-4e33-4e2d-bf32-c118f5855139-sys\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n"
Apr 24 21:27:27.355073 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.353269 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e5028c4f-ef6b-4051-a2c3-1def0a14889f-ovnkube-script-lib\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.355073 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.353288 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-host-var-lib-cni-bin\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj"
Apr 24 21:27:27.355073 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.353319 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c4524175-6c2f-4026-ac93-751748e5a1c4-multus-daemon-config\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj"
Apr 24 21:27:27.355073 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.353346 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-run-openvswitch\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.355073 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.353371 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/34563d27-4e33-4e2d-bf32-c118f5855139-tmp\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n"
Apr 24 21:27:27.355073 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.353395 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6d8e7b20-2410-4675-a443-408c37cdef11-serviceca\") pod \"node-ca-7pgjs\" (UID: \"6d8e7b20-2410-4675-a443-408c37cdef11\") " pod="openshift-image-registry/node-ca-7pgjs"
Apr 24 21:27:27.355073 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.353397 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/34563d27-4e33-4e2d-bf32-c118f5855139-etc-sysctl-d\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n"
Apr 24 21:27:27.355822 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.353410 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/30105400-c97c-4cc8-ac91-9f6cfe32780b-tmp-dir\") pod \"node-resolver-9lr5f\" (UID: \"30105400-c97c-4cc8-ac91-9f6cfe32780b\") " pod="openshift-dns/node-resolver-9lr5f"
Apr 24 21:27:27.355822 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.353421 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c4524175-6c2f-4026-ac93-751748e5a1c4-cni-binary-copy\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj"
Apr 24 21:27:27.355822 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.353453 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6fcca08a-b4ba-4f45-862a-1e503776cfe8-iptables-alerter-script\") pod \"iptables-alerter-cqnxb\" (UID: \"6fcca08a-b4ba-4f45-862a-1e503776cfe8\") " pod="openshift-network-operator/iptables-alerter-cqnxb"
Apr 24 21:27:27.355822 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.353495 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-run-openvswitch\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.355822 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.353497 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5be049ef-a2de-4653-8547-eaa092ea4f87-socket-dir\") pod \"aws-ebs-csi-driver-node-hv45x\" (UID: \"5be049ef-a2de-4653-8547-eaa092ea4f87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hv45x"
Apr 24 21:27:27.355822 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.353533 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5be049ef-a2de-4653-8547-eaa092ea4f87-device-dir\") pod \"aws-ebs-csi-driver-node-hv45x\" (UID: \"5be049ef-a2de-4653-8547-eaa092ea4f87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hv45x"
Apr 24 21:27:27.355822 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.353559 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/34563d27-4e33-4e2d-bf32-c118f5855139-sys\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n"
Apr 24 21:27:27.355822 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.353579 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-host-run-netns\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.355822 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.353602 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e5028c4f-ef6b-4051-a2c3-1def0a14889f-env-overrides\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.355822 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.353608 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-etc-openvswitch\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.355822 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.353638 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-etc-openvswitch\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.355822 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.353684 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e5028c4f-ef6b-4051-a2c3-1def0a14889f-host-run-netns\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.355822 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.353973 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6d8e7b20-2410-4675-a443-408c37cdef11-serviceca\") pod \"node-ca-7pgjs\" (UID: \"6d8e7b20-2410-4675-a443-408c37cdef11\") " pod="openshift-image-registry/node-ca-7pgjs"
Apr 24 21:27:27.355822 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.354455 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6fcca08a-b4ba-4f45-862a-1e503776cfe8-iptables-alerter-script\") pod \"iptables-alerter-cqnxb\" (UID: \"6fcca08a-b4ba-4f45-862a-1e503776cfe8\") " pod="openshift-network-operator/iptables-alerter-cqnxb"
Apr 24 21:27:27.355822 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.355746 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e5028c4f-ef6b-4051-a2c3-1def0a14889f-ovn-node-metrics-cert\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.355822 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.355810 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/719c9974-f956-4125-bc24-da51ad2c4d61-agent-certs\") pod \"konnectivity-agent-vv85h\" (UID: \"719c9974-f956-4125-bc24-da51ad2c4d61\") " pod="kube-system/konnectivity-agent-vv85h"
Apr 24 21:27:27.356585 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.356128 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/34563d27-4e33-4e2d-bf32-c118f5855139-tmp\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n"
Apr 24 21:27:27.356697 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.356680 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/34563d27-4e33-4e2d-bf32-c118f5855139-etc-tuned\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n"
Apr 24 21:27:27.360636 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.360614 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qtwk\" (UniqueName: \"kubernetes.io/projected/30105400-c97c-4cc8-ac91-9f6cfe32780b-kube-api-access-6qtwk\") pod \"node-resolver-9lr5f\" (UID: \"30105400-c97c-4cc8-ac91-9f6cfe32780b\") " pod="openshift-dns/node-resolver-9lr5f"
Apr 24 21:27:27.360728 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.360696 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpvgt\" (UniqueName: \"kubernetes.io/projected/34563d27-4e33-4e2d-bf32-c118f5855139-kube-api-access-wpvgt\") pod \"tuned-6nn9n\" (UID: \"34563d27-4e33-4e2d-bf32-c118f5855139\") " pod="openshift-cluster-node-tuning-operator/tuned-6nn9n"
Apr 24 21:27:27.360774 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.360725 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlmc9\" (UniqueName: \"kubernetes.io/projected/6d8e7b20-2410-4675-a443-408c37cdef11-kube-api-access-tlmc9\") pod \"node-ca-7pgjs\" (UID: \"6d8e7b20-2410-4675-a443-408c37cdef11\") " pod="openshift-image-registry/node-ca-7pgjs"
Apr 24 21:27:27.374076 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.374049 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnrf2\" (UniqueName: \"kubernetes.io/projected/5be049ef-a2de-4653-8547-eaa092ea4f87-kube-api-access-bnrf2\") pod \"aws-ebs-csi-driver-node-hv45x\" (UID: \"5be049ef-a2de-4653-8547-eaa092ea4f87\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hv45x"
Apr 24 21:27:27.374190 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.374171 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cj5j\" (UniqueName: \"kubernetes.io/projected/e5028c4f-ef6b-4051-a2c3-1def0a14889f-kube-api-access-6cj5j\") pod \"ovnkube-node-7kvjv\" (UID: \"e5028c4f-ef6b-4051-a2c3-1def0a14889f\") " pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv"
Apr 24 21:27:27.374244 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.374179 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8btr\" (UniqueName: \"kubernetes.io/projected/6fcca08a-b4ba-4f45-862a-1e503776cfe8-kube-api-access-c8btr\") pod \"iptables-alerter-cqnxb\" (UID: \"6fcca08a-b4ba-4f45-862a-1e503776cfe8\") " pod="openshift-network-operator/iptables-alerter-cqnxb"
Apr 24 21:27:27.454583 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.454548 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b5gvl\" (UniqueName: \"kubernetes.io/projected/c4524175-6c2f-4026-ac93-751748e5a1c4-kube-api-access-b5gvl\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj"
Apr 24 21:27:27.454583 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.454591 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/abed56c4-528e-496e-b85c-a6fe11c4f6e3-cni-binary-copy\") pod \"multus-additional-cni-plugins-64swc\" (UID: \"abed56c4-528e-496e-b85c-a6fe11c4f6e3\") " pod="openshift-multus/multus-additional-cni-plugins-64swc"
Apr 24 21:27:27.454837 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.454675 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-system-cni-dir\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj"
Apr 24 21:27:27.454837 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.454709 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-host-var-lib-cni-multus\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj"
Apr 24 21:27:27.454837 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.454733 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-hostroot\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj"
Apr 24 21:27:27.454837 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.454759 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/abed56c4-528e-496e-b85c-a6fe11c4f6e3-os-release\") pod \"multus-additional-cni-plugins-64swc\" (UID: \"abed56c4-528e-496e-b85c-a6fe11c4f6e3\") " pod="openshift-multus/multus-additional-cni-plugins-64swc"
Apr 24 21:27:27.454837 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.454788 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-cnibin\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj"
Apr 24 21:27:27.454837 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.454814 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-multus-conf-dir\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj"
Apr 24 21:27:27.454837 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.454839 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-host-run-multus-certs\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj"
Apr 24 21:27:27.455212 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.454841 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-system-cni-dir\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj"
Apr 24 21:27:27.455212 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.454881 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f38edd3-fb52-42bc-b164-d84e78cffcc0-metrics-certs\") pod \"network-metrics-daemon-5b7z5\" (UID: \"9f38edd3-fb52-42bc-b164-d84e78cffcc0\") " pod="openshift-multus/network-metrics-daemon-5b7z5"
Apr 24 21:27:27.455212 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.454887 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-host-run-multus-certs\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj"
Apr 24 21:27:27.455212 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.454919 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-multus-cni-dir\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj"
Apr 24 21:27:27.455212 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.454945 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-host-run-netns\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj"
Apr 24 21:27:27.455212 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:27.454987 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:27.455212 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.454989 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/abed56c4-528e-496e-b85c-a6fe11c4f6e3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-64swc\" (UID: \"abed56c4-528e-496e-b85c-a6fe11c4f6e3\") " pod="openshift-multus/multus-additional-cni-plugins-64swc"
Apr 24 21:27:27.455212 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.455023 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/abed56c4-528e-496e-b85c-a6fe11c4f6e3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-64swc\" (UID: \"abed56c4-528e-496e-b85c-a6fe11c4f6e3\") " pod="openshift-multus/multus-additional-cni-plugins-64swc"
Apr 24 21:27:27.455212 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:27.455049 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f38edd3-fb52-42bc-b164-d84e78cffcc0-metrics-certs podName:9f38edd3-fb52-42bc-b164-d84e78cffcc0 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:27.955034162 +0000 UTC m=+3.140322497 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f38edd3-fb52-42bc-b164-d84e78cffcc0-metrics-certs") pod "network-metrics-daemon-5b7z5" (UID: "9f38edd3-fb52-42bc-b164-d84e78cffcc0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:27.455212 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.455070 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-multus-socket-dir-parent\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj"
Apr 24 21:27:27.455212 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.455088 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-host-run-k8s-cni-cncf-io\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj"
Apr 24 21:27:27.455212 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.455104 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xjkbt\" (UniqueName: \"kubernetes.io/projected/7799825c-48bb-465a-adc5-2d2c43a525df-kube-api-access-xjkbt\") pod \"network-check-target-kwjx6\" (UID: \"7799825c-48bb-465a-adc5-2d2c43a525df\") " pod="openshift-network-diagnostics/network-check-target-kwjx6"
Apr 24 21:27:27.455212 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.455125 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/abed56c4-528e-496e-b85c-a6fe11c4f6e3-system-cni-dir\") pod \"multus-additional-cni-plugins-64swc\" (UID: \"abed56c4-528e-496e-b85c-a6fe11c4f6e3\") " pod="openshift-multus/multus-additional-cni-plugins-64swc"
Apr 24 21:27:27.455212 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.455140 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-os-release\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj"
Apr 24 21:27:27.455212 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.455152 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/abed56c4-528e-496e-b85c-a6fe11c4f6e3-cni-binary-copy\") pod \"multus-additional-cni-plugins-64swc\" (UID: \"abed56c4-528e-496e-b85c-a6fe11c4f6e3\") " pod="openshift-multus/multus-additional-cni-plugins-64swc"
Apr 24 21:27:27.455212 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.455156 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-host-var-lib-kubelet\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj"
Apr 24 21:27:27.455212 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.455210 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-host-var-lib-cni-bin\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj"
Apr 24 21:27:27.456099 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.455237 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c4524175-6c2f-4026-ac93-751748e5a1c4-multus-daemon-config\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj"
Apr 24 21:27:27.456099 ip-10-0-131-55
kubenswrapper[2566]: I0424 21:27:27.455266 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c4524175-6c2f-4026-ac93-751748e5a1c4-cni-binary-copy\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj" Apr 24 21:27:27.456099 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.455301 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-etc-kubernetes\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj" Apr 24 21:27:27.456099 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.455318 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-multus-socket-dir-parent\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj" Apr 24 21:27:27.456099 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.455327 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nlkv6\" (UniqueName: \"kubernetes.io/projected/abed56c4-528e-496e-b85c-a6fe11c4f6e3-kube-api-access-nlkv6\") pod \"multus-additional-cni-plugins-64swc\" (UID: \"abed56c4-528e-496e-b85c-a6fe11c4f6e3\") " pod="openshift-multus/multus-additional-cni-plugins-64swc" Apr 24 21:27:27.456099 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.455345 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-host-run-k8s-cni-cncf-io\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj" Apr 24 21:27:27.456099 ip-10-0-131-55 
kubenswrapper[2566]: I0424 21:27:27.455358 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/abed56c4-528e-496e-b85c-a6fe11c4f6e3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-64swc\" (UID: \"abed56c4-528e-496e-b85c-a6fe11c4f6e3\") " pod="openshift-multus/multus-additional-cni-plugins-64swc" Apr 24 21:27:27.456099 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.455383 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-279xl\" (UniqueName: \"kubernetes.io/projected/9f38edd3-fb52-42bc-b164-d84e78cffcc0-kube-api-access-279xl\") pod \"network-metrics-daemon-5b7z5\" (UID: \"9f38edd3-fb52-42bc-b164-d84e78cffcc0\") " pod="openshift-multus/network-metrics-daemon-5b7z5" Apr 24 21:27:27.456099 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.455409 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/abed56c4-528e-496e-b85c-a6fe11c4f6e3-cnibin\") pod \"multus-additional-cni-plugins-64swc\" (UID: \"abed56c4-528e-496e-b85c-a6fe11c4f6e3\") " pod="openshift-multus/multus-additional-cni-plugins-64swc" Apr 24 21:27:27.456099 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.455453 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/abed56c4-528e-496e-b85c-a6fe11c4f6e3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-64swc\" (UID: \"abed56c4-528e-496e-b85c-a6fe11c4f6e3\") " pod="openshift-multus/multus-additional-cni-plugins-64swc" Apr 24 21:27:27.456099 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.455177 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-host-var-lib-kubelet\") pod \"multus-sslzj\" (UID: 
\"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj" Apr 24 21:27:27.456099 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.455485 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/abed56c4-528e-496e-b85c-a6fe11c4f6e3-system-cni-dir\") pod \"multus-additional-cni-plugins-64swc\" (UID: \"abed56c4-528e-496e-b85c-a6fe11c4f6e3\") " pod="openshift-multus/multus-additional-cni-plugins-64swc" Apr 24 21:27:27.456099 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.455522 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-host-run-netns\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj" Apr 24 21:27:27.456099 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.455524 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-cnibin\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj" Apr 24 21:27:27.456099 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.455537 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-multus-cni-dir\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj" Apr 24 21:27:27.456099 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.455575 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-host-var-lib-cni-multus\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj" 
Apr 24 21:27:27.456099 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.455587 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-host-var-lib-cni-bin\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj" Apr 24 21:27:27.456099 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.455638 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-os-release\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj" Apr 24 21:27:27.457058 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.455673 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-hostroot\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj" Apr 24 21:27:27.457058 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.455702 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/abed56c4-528e-496e-b85c-a6fe11c4f6e3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-64swc\" (UID: \"abed56c4-528e-496e-b85c-a6fe11c4f6e3\") " pod="openshift-multus/multus-additional-cni-plugins-64swc" Apr 24 21:27:27.457058 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.455702 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-etc-kubernetes\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj" Apr 24 21:27:27.457058 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.455717 2566 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/abed56c4-528e-496e-b85c-a6fe11c4f6e3-os-release\") pod \"multus-additional-cni-plugins-64swc\" (UID: \"abed56c4-528e-496e-b85c-a6fe11c4f6e3\") " pod="openshift-multus/multus-additional-cni-plugins-64swc" Apr 24 21:27:27.457058 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.455770 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c4524175-6c2f-4026-ac93-751748e5a1c4-multus-conf-dir\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj" Apr 24 21:27:27.457058 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.455787 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/abed56c4-528e-496e-b85c-a6fe11c4f6e3-cnibin\") pod \"multus-additional-cni-plugins-64swc\" (UID: \"abed56c4-528e-496e-b85c-a6fe11c4f6e3\") " pod="openshift-multus/multus-additional-cni-plugins-64swc" Apr 24 21:27:27.457058 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.455919 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c4524175-6c2f-4026-ac93-751748e5a1c4-multus-daemon-config\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj" Apr 24 21:27:27.457058 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.456215 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c4524175-6c2f-4026-ac93-751748e5a1c4-cni-binary-copy\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj" Apr 24 21:27:27.457058 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.456352 2566 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/abed56c4-528e-496e-b85c-a6fe11c4f6e3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-64swc\" (UID: \"abed56c4-528e-496e-b85c-a6fe11c4f6e3\") " pod="openshift-multus/multus-additional-cni-plugins-64swc" Apr 24 21:27:27.469755 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:27.469724 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:27.469755 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:27.469745 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:27.469755 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:27.469754 2566 projected.go:194] Error preparing data for projected volume kube-api-access-xjkbt for pod openshift-network-diagnostics/network-check-target-kwjx6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:27.470048 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:27.469820 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7799825c-48bb-465a-adc5-2d2c43a525df-kube-api-access-xjkbt podName:7799825c-48bb-465a-adc5-2d2c43a525df nodeName:}" failed. No retries permitted until 2026-04-24 21:27:27.969801637 +0000 UTC m=+3.155089977 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-xjkbt" (UniqueName: "kubernetes.io/projected/7799825c-48bb-465a-adc5-2d2c43a525df-kube-api-access-xjkbt") pod "network-check-target-kwjx6" (UID: "7799825c-48bb-465a-adc5-2d2c43a525df") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:27.470748 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.470724 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlkv6\" (UniqueName: \"kubernetes.io/projected/abed56c4-528e-496e-b85c-a6fe11c4f6e3-kube-api-access-nlkv6\") pod \"multus-additional-cni-plugins-64swc\" (UID: \"abed56c4-528e-496e-b85c-a6fe11c4f6e3\") " pod="openshift-multus/multus-additional-cni-plugins-64swc" Apr 24 21:27:27.471482 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.471462 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5gvl\" (UniqueName: \"kubernetes.io/projected/c4524175-6c2f-4026-ac93-751748e5a1c4-kube-api-access-b5gvl\") pod \"multus-sslzj\" (UID: \"c4524175-6c2f-4026-ac93-751748e5a1c4\") " pod="openshift-multus/multus-sslzj" Apr 24 21:27:27.471558 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.471536 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-279xl\" (UniqueName: \"kubernetes.io/projected/9f38edd3-fb52-42bc-b164-d84e78cffcc0-kube-api-access-279xl\") pod \"network-metrics-daemon-5b7z5\" (UID: \"9f38edd3-fb52-42bc-b164-d84e78cffcc0\") " pod="openshift-multus/network-metrics-daemon-5b7z5" Apr 24 21:27:27.537539 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.537438 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9lr5f" Apr 24 21:27:27.545443 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.545416 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv" Apr 24 21:27:27.555236 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.555204 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hv45x" Apr 24 21:27:27.560785 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.560761 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6nn9n" Apr 24 21:27:27.569403 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.569383 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7pgjs" Apr 24 21:27:27.575914 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.575895 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-cqnxb" Apr 24 21:27:27.582510 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.582491 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vv85h" Apr 24 21:27:27.590012 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.589996 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-64swc" Apr 24 21:27:27.594377 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.594358 2566 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:27.594491 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.594469 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-sslzj" Apr 24 21:27:27.958983 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:27.958897 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f38edd3-fb52-42bc-b164-d84e78cffcc0-metrics-certs\") pod \"network-metrics-daemon-5b7z5\" (UID: \"9f38edd3-fb52-42bc-b164-d84e78cffcc0\") " pod="openshift-multus/network-metrics-daemon-5b7z5" Apr 24 21:27:27.959117 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:27.959061 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:27.959156 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:27.959124 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f38edd3-fb52-42bc-b164-d84e78cffcc0-metrics-certs podName:9f38edd3-fb52-42bc-b164-d84e78cffcc0 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:28.95911003 +0000 UTC m=+4.144398352 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f38edd3-fb52-42bc-b164-d84e78cffcc0-metrics-certs") pod "network-metrics-daemon-5b7z5" (UID: "9f38edd3-fb52-42bc-b164-d84e78cffcc0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:27.978149 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:27.977921 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d8e7b20_2410_4675_a443_408c37cdef11.slice/crio-65ba632ec9e314a99822ceb4143237ec4022ebd7b0fe28c9b73e1858fcf65e63 WatchSource:0}: Error finding container 65ba632ec9e314a99822ceb4143237ec4022ebd7b0fe28c9b73e1858fcf65e63: Status 404 returned error can't find the container with id 65ba632ec9e314a99822ceb4143237ec4022ebd7b0fe28c9b73e1858fcf65e63 Apr 24 21:27:27.980260 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:27.980217 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34563d27_4e33_4e2d_bf32_c118f5855139.slice/crio-905d3ccddcf350565775519064127f407380d5723abe3d3dc287a149a844fca0 WatchSource:0}: Error finding container 905d3ccddcf350565775519064127f407380d5723abe3d3dc287a149a844fca0: Status 404 returned error can't find the container with id 905d3ccddcf350565775519064127f407380d5723abe3d3dc287a149a844fca0 Apr 24 21:27:27.981770 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:27.981681 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5028c4f_ef6b_4051_a2c3_1def0a14889f.slice/crio-7cb067560c59170524c3b75fa325bbce2ae20d2165ecbb90a4d21c1e7336b602 WatchSource:0}: Error finding container 7cb067560c59170524c3b75fa325bbce2ae20d2165ecbb90a4d21c1e7336b602: Status 404 returned error can't find the container with id 7cb067560c59170524c3b75fa325bbce2ae20d2165ecbb90a4d21c1e7336b602 Apr 24 21:27:27.983774 
ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:27.983755 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30105400_c97c_4cc8_ac91_9f6cfe32780b.slice/crio-3936d786bf44e925d3d7ca2617024458a307c9ae599aa3a2ad62ae65411d5f5b WatchSource:0}: Error finding container 3936d786bf44e925d3d7ca2617024458a307c9ae599aa3a2ad62ae65411d5f5b: Status 404 returned error can't find the container with id 3936d786bf44e925d3d7ca2617024458a307c9ae599aa3a2ad62ae65411d5f5b Apr 24 21:27:28.004925 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:28.004894 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fcca08a_b4ba_4f45_862a_1e503776cfe8.slice/crio-51709512318c23d0b2467023499a2e9ae504004b993acf18d167275c6cb48662 WatchSource:0}: Error finding container 51709512318c23d0b2467023499a2e9ae504004b993acf18d167275c6cb48662: Status 404 returned error can't find the container with id 51709512318c23d0b2467023499a2e9ae504004b993acf18d167275c6cb48662 Apr 24 21:27:28.005847 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:28.005826 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabed56c4_528e_496e_b85c_a6fe11c4f6e3.slice/crio-d4d00674fb5ecbd1c4c3d2f5b1ab35270ee7d077af5c23df15e5e018e060c8ab WatchSource:0}: Error finding container d4d00674fb5ecbd1c4c3d2f5b1ab35270ee7d077af5c23df15e5e018e060c8ab: Status 404 returned error can't find the container with id d4d00674fb5ecbd1c4c3d2f5b1ab35270ee7d077af5c23df15e5e018e060c8ab Apr 24 21:27:28.006330 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:28.006316 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4524175_6c2f_4026_ac93_751748e5a1c4.slice/crio-6a4e395c7eade5eb805c5f19a151919149ee413860dc5fce1e41baf442e367c4 WatchSource:0}: Error 
finding container 6a4e395c7eade5eb805c5f19a151919149ee413860dc5fce1e41baf442e367c4: Status 404 returned error can't find the container with id 6a4e395c7eade5eb805c5f19a151919149ee413860dc5fce1e41baf442e367c4 Apr 24 21:27:28.007997 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:28.007831 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5be049ef_a2de_4653_8547_eaa092ea4f87.slice/crio-4f49666de9f8da0bb405183f4fdafa27b48a0bc94cf333062906da831b00ff2e WatchSource:0}: Error finding container 4f49666de9f8da0bb405183f4fdafa27b48a0bc94cf333062906da831b00ff2e: Status 404 returned error can't find the container with id 4f49666de9f8da0bb405183f4fdafa27b48a0bc94cf333062906da831b00ff2e Apr 24 21:27:28.008764 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:27:28.008742 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod719c9974_f956_4125_bc24_da51ad2c4d61.slice/crio-4004643d0b1d0b8bd53cba1353370d5f0744370896be3e4ca73d65312ec03110 WatchSource:0}: Error finding container 4004643d0b1d0b8bd53cba1353370d5f0744370896be3e4ca73d65312ec03110: Status 404 returned error can't find the container with id 4004643d0b1d0b8bd53cba1353370d5f0744370896be3e4ca73d65312ec03110 Apr 24 21:27:28.060105 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:28.060081 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xjkbt\" (UniqueName: \"kubernetes.io/projected/7799825c-48bb-465a-adc5-2d2c43a525df-kube-api-access-xjkbt\") pod \"network-check-target-kwjx6\" (UID: \"7799825c-48bb-465a-adc5-2d2c43a525df\") " pod="openshift-network-diagnostics/network-check-target-kwjx6" Apr 24 21:27:28.060224 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:28.060209 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Apr 24 21:27:28.060266 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:28.060232 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:28.060266 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:28.060257 2566 projected.go:194] Error preparing data for projected volume kube-api-access-xjkbt for pod openshift-network-diagnostics/network-check-target-kwjx6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:28.060352 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:28.060311 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7799825c-48bb-465a-adc5-2d2c43a525df-kube-api-access-xjkbt podName:7799825c-48bb-465a-adc5-2d2c43a525df nodeName:}" failed. No retries permitted until 2026-04-24 21:27:29.060297558 +0000 UTC m=+4.245585875 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-xjkbt" (UniqueName: "kubernetes.io/projected/7799825c-48bb-465a-adc5-2d2c43a525df-kube-api-access-xjkbt") pod "network-check-target-kwjx6" (UID: "7799825c-48bb-465a-adc5-2d2c43a525df") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:28.230680 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:28.230584 2566 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:28.297263 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:28.297223 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:22:26 +0000 UTC" deadline="2028-01-09 16:30:08.69233156 +0000 UTC" Apr 24 21:27:28.297263 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:28.297260 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14995h2m40.395078616s" Apr 24 21:27:28.386041 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:28.385945 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-64swc" event={"ID":"abed56c4-528e-496e-b85c-a6fe11c4f6e3","Type":"ContainerStarted","Data":"d4d00674fb5ecbd1c4c3d2f5b1ab35270ee7d077af5c23df15e5e018e060c8ab"} Apr 24 21:27:28.388169 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:28.388126 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-cqnxb" event={"ID":"6fcca08a-b4ba-4f45-862a-1e503776cfe8","Type":"ContainerStarted","Data":"51709512318c23d0b2467023499a2e9ae504004b993acf18d167275c6cb48662"} Apr 24 21:27:28.394049 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:28.394009 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-131-55.ec2.internal" event={"ID":"ec74c72a1ae3da2b3b1eef59bb72e15d","Type":"ContainerStarted","Data":"758906b5d403b397356cf4b1a33ae3f167ae29f5d981503c202d8885cdb4f1cf"}
Apr 24 21:27:28.398040 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:28.398016 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sslzj" event={"ID":"c4524175-6c2f-4026-ac93-751748e5a1c4","Type":"ContainerStarted","Data":"6a4e395c7eade5eb805c5f19a151919149ee413860dc5fce1e41baf442e367c4"}
Apr 24 21:27:28.406055 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:28.406032 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9lr5f" event={"ID":"30105400-c97c-4cc8-ac91-9f6cfe32780b","Type":"ContainerStarted","Data":"3936d786bf44e925d3d7ca2617024458a307c9ae599aa3a2ad62ae65411d5f5b"}
Apr 24 21:27:28.409671 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:28.408336 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv" event={"ID":"e5028c4f-ef6b-4051-a2c3-1def0a14889f","Type":"ContainerStarted","Data":"7cb067560c59170524c3b75fa325bbce2ae20d2165ecbb90a4d21c1e7336b602"}
Apr 24 21:27:28.413875 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:28.413830 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6nn9n" event={"ID":"34563d27-4e33-4e2d-bf32-c118f5855139","Type":"ContainerStarted","Data":"905d3ccddcf350565775519064127f407380d5723abe3d3dc287a149a844fca0"}
Apr 24 21:27:28.417728 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:28.417662 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7pgjs" event={"ID":"6d8e7b20-2410-4675-a443-408c37cdef11","Type":"ContainerStarted","Data":"65ba632ec9e314a99822ceb4143237ec4022ebd7b0fe28c9b73e1858fcf65e63"}
Apr 24 21:27:28.422323 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:28.422281 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vv85h" event={"ID":"719c9974-f956-4125-bc24-da51ad2c4d61","Type":"ContainerStarted","Data":"4004643d0b1d0b8bd53cba1353370d5f0744370896be3e4ca73d65312ec03110"}
Apr 24 21:27:28.435544 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:28.435108 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hv45x" event={"ID":"5be049ef-a2de-4653-8547-eaa092ea4f87","Type":"ContainerStarted","Data":"4f49666de9f8da0bb405183f4fdafa27b48a0bc94cf333062906da831b00ff2e"}
Apr 24 21:27:28.967222 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:28.967192 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f38edd3-fb52-42bc-b164-d84e78cffcc0-metrics-certs\") pod \"network-metrics-daemon-5b7z5\" (UID: \"9f38edd3-fb52-42bc-b164-d84e78cffcc0\") " pod="openshift-multus/network-metrics-daemon-5b7z5"
Apr 24 21:27:28.967378 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:28.967358 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:28.967444 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:28.967425 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f38edd3-fb52-42bc-b164-d84e78cffcc0-metrics-certs podName:9f38edd3-fb52-42bc-b164-d84e78cffcc0 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:30.967404761 +0000 UTC m=+6.152693084 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f38edd3-fb52-42bc-b164-d84e78cffcc0-metrics-certs") pod "network-metrics-daemon-5b7z5" (UID: "9f38edd3-fb52-42bc-b164-d84e78cffcc0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:29.068868 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:29.068296 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xjkbt\" (UniqueName: \"kubernetes.io/projected/7799825c-48bb-465a-adc5-2d2c43a525df-kube-api-access-xjkbt\") pod \"network-check-target-kwjx6\" (UID: \"7799825c-48bb-465a-adc5-2d2c43a525df\") " pod="openshift-network-diagnostics/network-check-target-kwjx6"
Apr 24 21:27:29.068868 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:29.068459 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:27:29.068868 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:29.068477 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:27:29.068868 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:29.068489 2566 projected.go:194] Error preparing data for projected volume kube-api-access-xjkbt for pod openshift-network-diagnostics/network-check-target-kwjx6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:29.068868 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:29.068545 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7799825c-48bb-465a-adc5-2d2c43a525df-kube-api-access-xjkbt podName:7799825c-48bb-465a-adc5-2d2c43a525df nodeName:}" failed. No retries permitted until 2026-04-24 21:27:31.068528038 +0000 UTC m=+6.253816368 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-xjkbt" (UniqueName: "kubernetes.io/projected/7799825c-48bb-465a-adc5-2d2c43a525df-kube-api-access-xjkbt") pod "network-check-target-kwjx6" (UID: "7799825c-48bb-465a-adc5-2d2c43a525df") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:29.377041 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:29.376332 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kwjx6"
Apr 24 21:27:29.377041 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:29.376460 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kwjx6" podUID="7799825c-48bb-465a-adc5-2d2c43a525df"
Apr 24 21:27:29.377041 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:29.376876 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5b7z5"
Apr 24 21:27:29.377041 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:29.377002 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5b7z5" podUID="9f38edd3-fb52-42bc-b164-d84e78cffcc0"
Apr 24 21:27:29.446420 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:29.446341 2566 generic.go:358] "Generic (PLEG): container finished" podID="3dfc4ac8bf9ace5a036e90036a3f5792" containerID="ff3b480f529afbe630da3539ed6c3a8b4253607efe8a173a22a7c77eb8857cda" exitCode=0
Apr 24 21:27:29.447039 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:29.447015 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal" event={"ID":"3dfc4ac8bf9ace5a036e90036a3f5792","Type":"ContainerDied","Data":"ff3b480f529afbe630da3539ed6c3a8b4253607efe8a173a22a7c77eb8857cda"}
Apr 24 21:27:29.465641 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:29.465593 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-55.ec2.internal" podStartSLOduration=3.465578925 podStartE2EDuration="3.465578925s" podCreationTimestamp="2026-04-24 21:27:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:27:28.409756853 +0000 UTC m=+3.595045193" watchObservedRunningTime="2026-04-24 21:27:29.465578925 +0000 UTC m=+4.650867299"
Apr 24 21:27:30.466987 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:30.466335 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal" event={"ID":"3dfc4ac8bf9ace5a036e90036a3f5792","Type":"ContainerStarted","Data":"23861db58b0815ecb07d005021bc66c2ca6a36376623b9f486835c0ebdf938e2"}
Apr 24 21:27:30.985394 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:30.983030 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f38edd3-fb52-42bc-b164-d84e78cffcc0-metrics-certs\") pod \"network-metrics-daemon-5b7z5\" (UID: \"9f38edd3-fb52-42bc-b164-d84e78cffcc0\") " pod="openshift-multus/network-metrics-daemon-5b7z5"
Apr 24 21:27:30.985394 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:30.983174 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:30.985394 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:30.983239 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f38edd3-fb52-42bc-b164-d84e78cffcc0-metrics-certs podName:9f38edd3-fb52-42bc-b164-d84e78cffcc0 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:34.983220451 +0000 UTC m=+10.168508780 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f38edd3-fb52-42bc-b164-d84e78cffcc0-metrics-certs") pod "network-metrics-daemon-5b7z5" (UID: "9f38edd3-fb52-42bc-b164-d84e78cffcc0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:31.083935 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:31.083895 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xjkbt\" (UniqueName: \"kubernetes.io/projected/7799825c-48bb-465a-adc5-2d2c43a525df-kube-api-access-xjkbt\") pod \"network-check-target-kwjx6\" (UID: \"7799825c-48bb-465a-adc5-2d2c43a525df\") " pod="openshift-network-diagnostics/network-check-target-kwjx6"
Apr 24 21:27:31.084118 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:31.084060 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:27:31.084118 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:31.084084 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:27:31.084118 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:31.084097 2566 projected.go:194] Error preparing data for projected volume kube-api-access-xjkbt for pod openshift-network-diagnostics/network-check-target-kwjx6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:31.084288 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:31.084158 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7799825c-48bb-465a-adc5-2d2c43a525df-kube-api-access-xjkbt podName:7799825c-48bb-465a-adc5-2d2c43a525df nodeName:}" failed. No retries permitted until 2026-04-24 21:27:35.084140116 +0000 UTC m=+10.269428444 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-xjkbt" (UniqueName: "kubernetes.io/projected/7799825c-48bb-465a-adc5-2d2c43a525df-kube-api-access-xjkbt") pod "network-check-target-kwjx6" (UID: "7799825c-48bb-465a-adc5-2d2c43a525df") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:31.375865 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:31.375779 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kwjx6"
Apr 24 21:27:31.375865 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:31.375802 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5b7z5"
Apr 24 21:27:31.376105 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:31.375915 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kwjx6" podUID="7799825c-48bb-465a-adc5-2d2c43a525df"
Apr 24 21:27:31.376105 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:31.376067 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5b7z5" podUID="9f38edd3-fb52-42bc-b164-d84e78cffcc0"
Apr 24 21:27:33.376353 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:33.376320 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5b7z5"
Apr 24 21:27:33.376820 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:33.376317 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kwjx6"
Apr 24 21:27:33.376820 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:33.376462 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5b7z5" podUID="9f38edd3-fb52-42bc-b164-d84e78cffcc0"
Apr 24 21:27:33.376820 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:33.376573 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kwjx6" podUID="7799825c-48bb-465a-adc5-2d2c43a525df"
Apr 24 21:27:35.019095 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:35.019053 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f38edd3-fb52-42bc-b164-d84e78cffcc0-metrics-certs\") pod \"network-metrics-daemon-5b7z5\" (UID: \"9f38edd3-fb52-42bc-b164-d84e78cffcc0\") " pod="openshift-multus/network-metrics-daemon-5b7z5"
Apr 24 21:27:35.019530 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:35.019203 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:35.019530 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:35.019279 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f38edd3-fb52-42bc-b164-d84e78cffcc0-metrics-certs podName:9f38edd3-fb52-42bc-b164-d84e78cffcc0 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:43.019259113 +0000 UTC m=+18.204547445 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f38edd3-fb52-42bc-b164-d84e78cffcc0-metrics-certs") pod "network-metrics-daemon-5b7z5" (UID: "9f38edd3-fb52-42bc-b164-d84e78cffcc0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:35.119907 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:35.119868 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xjkbt\" (UniqueName: \"kubernetes.io/projected/7799825c-48bb-465a-adc5-2d2c43a525df-kube-api-access-xjkbt\") pod \"network-check-target-kwjx6\" (UID: \"7799825c-48bb-465a-adc5-2d2c43a525df\") " pod="openshift-network-diagnostics/network-check-target-kwjx6"
Apr 24 21:27:35.120109 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:35.120066 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:27:35.120109 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:35.120091 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:27:35.120109 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:35.120104 2566 projected.go:194] Error preparing data for projected volume kube-api-access-xjkbt for pod openshift-network-diagnostics/network-check-target-kwjx6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:35.120259 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:35.120172 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7799825c-48bb-465a-adc5-2d2c43a525df-kube-api-access-xjkbt podName:7799825c-48bb-465a-adc5-2d2c43a525df nodeName:}" failed. No retries permitted until 2026-04-24 21:27:43.120153129 +0000 UTC m=+18.305441465 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-xjkbt" (UniqueName: "kubernetes.io/projected/7799825c-48bb-465a-adc5-2d2c43a525df-kube-api-access-xjkbt") pod "network-check-target-kwjx6" (UID: "7799825c-48bb-465a-adc5-2d2c43a525df") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:35.377350 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:35.376811 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kwjx6"
Apr 24 21:27:35.377350 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:35.376902 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kwjx6" podUID="7799825c-48bb-465a-adc5-2d2c43a525df"
Apr 24 21:27:35.377350 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:35.377274 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5b7z5"
Apr 24 21:27:35.377606 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:35.377395 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5b7z5" podUID="9f38edd3-fb52-42bc-b164-d84e78cffcc0"
Apr 24 21:27:37.375930 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:37.375888 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kwjx6"
Apr 24 21:27:37.376355 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:37.375941 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5b7z5"
Apr 24 21:27:37.376355 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:37.376035 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kwjx6" podUID="7799825c-48bb-465a-adc5-2d2c43a525df"
Apr 24 21:27:37.376355 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:37.376160 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5b7z5" podUID="9f38edd3-fb52-42bc-b164-d84e78cffcc0"
Apr 24 21:27:39.375512 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:39.375475 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kwjx6"
Apr 24 21:27:39.375994 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:39.375523 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5b7z5"
Apr 24 21:27:39.375994 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:39.375613 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kwjx6" podUID="7799825c-48bb-465a-adc5-2d2c43a525df"
Apr 24 21:27:39.375994 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:39.375714 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5b7z5" podUID="9f38edd3-fb52-42bc-b164-d84e78cffcc0"
Apr 24 21:27:41.378636 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:41.378600 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kwjx6"
Apr 24 21:27:41.379129 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:41.378603 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5b7z5"
Apr 24 21:27:41.379129 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:41.378721 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kwjx6" podUID="7799825c-48bb-465a-adc5-2d2c43a525df"
Apr 24 21:27:41.379129 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:41.378822 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5b7z5" podUID="9f38edd3-fb52-42bc-b164-d84e78cffcc0"
Apr 24 21:27:43.076310 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:43.076262 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f38edd3-fb52-42bc-b164-d84e78cffcc0-metrics-certs\") pod \"network-metrics-daemon-5b7z5\" (UID: \"9f38edd3-fb52-42bc-b164-d84e78cffcc0\") " pod="openshift-multus/network-metrics-daemon-5b7z5"
Apr 24 21:27:43.076988 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:43.076390 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:43.076988 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:43.076457 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f38edd3-fb52-42bc-b164-d84e78cffcc0-metrics-certs podName:9f38edd3-fb52-42bc-b164-d84e78cffcc0 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:59.076436961 +0000 UTC m=+34.261725279 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f38edd3-fb52-42bc-b164-d84e78cffcc0-metrics-certs") pod "network-metrics-daemon-5b7z5" (UID: "9f38edd3-fb52-42bc-b164-d84e78cffcc0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:43.177172 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:43.177125 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xjkbt\" (UniqueName: \"kubernetes.io/projected/7799825c-48bb-465a-adc5-2d2c43a525df-kube-api-access-xjkbt\") pod \"network-check-target-kwjx6\" (UID: \"7799825c-48bb-465a-adc5-2d2c43a525df\") " pod="openshift-network-diagnostics/network-check-target-kwjx6"
Apr 24 21:27:43.177371 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:43.177347 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:27:43.177437 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:43.177374 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:27:43.177437 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:43.177387 2566 projected.go:194] Error preparing data for projected volume kube-api-access-xjkbt for pod openshift-network-diagnostics/network-check-target-kwjx6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:43.177530 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:43.177456 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7799825c-48bb-465a-adc5-2d2c43a525df-kube-api-access-xjkbt podName:7799825c-48bb-465a-adc5-2d2c43a525df nodeName:}" failed. No retries permitted until 2026-04-24 21:27:59.177435612 +0000 UTC m=+34.362723948 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-xjkbt" (UniqueName: "kubernetes.io/projected/7799825c-48bb-465a-adc5-2d2c43a525df-kube-api-access-xjkbt") pod "network-check-target-kwjx6" (UID: "7799825c-48bb-465a-adc5-2d2c43a525df") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:43.375778 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:43.375685 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kwjx6"
Apr 24 21:27:43.375940 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:43.375818 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kwjx6" podUID="7799825c-48bb-465a-adc5-2d2c43a525df"
Apr 24 21:27:43.375940 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:43.375881 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5b7z5"
Apr 24 21:27:43.376074 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:43.376013 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5b7z5" podUID="9f38edd3-fb52-42bc-b164-d84e78cffcc0"
Apr 24 21:27:45.380085 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:45.379792 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5b7z5"
Apr 24 21:27:45.380733 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:45.379809 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kwjx6"
Apr 24 21:27:45.380733 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:45.380206 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5b7z5" podUID="9f38edd3-fb52-42bc-b164-d84e78cffcc0"
Apr 24 21:27:45.380733 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:45.380233 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kwjx6" podUID="7799825c-48bb-465a-adc5-2d2c43a525df"
Apr 24 21:27:45.494265 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:45.494234 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vv85h" event={"ID":"719c9974-f956-4125-bc24-da51ad2c4d61","Type":"ContainerStarted","Data":"7836794c793d7fa0eaf91b1fe6d7c6bf12873b4fe48b00ee5351dabe9ae9276b"}
Apr 24 21:27:45.495536 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:45.495508 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hv45x" event={"ID":"5be049ef-a2de-4653-8547-eaa092ea4f87","Type":"ContainerStarted","Data":"1c358be7067572eaae8f98390bd419bc1059bdcf1fedf8e55068aa5cd69854b9"}
Apr 24 21:27:45.496796 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:45.496771 2566 generic.go:358] "Generic (PLEG): container finished" podID="abed56c4-528e-496e-b85c-a6fe11c4f6e3" containerID="45a4989f8c7d67b172e9ae5e9b3321ec7c1d519e42b014eef6d131b5b9ef96dd" exitCode=0
Apr 24 21:27:45.496878 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:45.496853 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-64swc" event={"ID":"abed56c4-528e-496e-b85c-a6fe11c4f6e3","Type":"ContainerDied","Data":"45a4989f8c7d67b172e9ae5e9b3321ec7c1d519e42b014eef6d131b5b9ef96dd"}
Apr 24 21:27:45.501578 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:45.498893 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sslzj" event={"ID":"c4524175-6c2f-4026-ac93-751748e5a1c4","Type":"ContainerStarted","Data":"2cc6c25d2b778ad71cfb7729095831db390f36248b5ff04798dcfd3c56d343d8"}
Apr 24 21:27:45.503047 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:45.502994 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9lr5f" event={"ID":"30105400-c97c-4cc8-ac91-9f6cfe32780b","Type":"ContainerStarted","Data":"2b8642723d0b9eb8bceff4351c40123580a196a973c5bfeb18e7c904c63d1799"}
Apr 24 21:27:45.505627 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:45.505607 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kvjv_e5028c4f-ef6b-4051-a2c3-1def0a14889f/ovn-acl-logging/0.log"
Apr 24 21:27:45.505924 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:45.505905 2566 generic.go:358] "Generic (PLEG): container finished" podID="e5028c4f-ef6b-4051-a2c3-1def0a14889f" containerID="1befae1701670f7f9d578a29a6f114e2a3c6e07dc22871ae405157ac463cb74b" exitCode=1
Apr 24 21:27:45.506009 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:45.505979 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv" event={"ID":"e5028c4f-ef6b-4051-a2c3-1def0a14889f","Type":"ContainerStarted","Data":"8c17cd5d48d667ed0b19adfad8a8e696d56ba3931a5c3c7b5a08c772cf640a51"}
Apr 24 21:27:45.506048 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:45.506007 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv" event={"ID":"e5028c4f-ef6b-4051-a2c3-1def0a14889f","Type":"ContainerStarted","Data":"ec8762b156c2c95967d9689f8d479fbe1fac47ec6400be20634a2e301974d381"}
Apr 24 21:27:45.506048 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:45.506020 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv" event={"ID":"e5028c4f-ef6b-4051-a2c3-1def0a14889f","Type":"ContainerStarted","Data":"4aea2cd712a16dd6fc539216c3e33833709fcfaff2bcd2aa8742668a4dad4d6b"}
Apr 24 21:27:45.506048 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:45.506032 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv" event={"ID":"e5028c4f-ef6b-4051-a2c3-1def0a14889f","Type":"ContainerDied","Data":"1befae1701670f7f9d578a29a6f114e2a3c6e07dc22871ae405157ac463cb74b"}
Apr 24 21:27:45.506140 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:45.506048 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv" event={"ID":"e5028c4f-ef6b-4051-a2c3-1def0a14889f","Type":"ContainerStarted","Data":"b78537e531066f4c783a8268c0f2c9a3edec8b9c99966510278145cb89837c51"}
Apr 24 21:27:45.507572 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:45.507555 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6nn9n" event={"ID":"34563d27-4e33-4e2d-bf32-c118f5855139","Type":"ContainerStarted","Data":"53f4093f6bb9e3a1d2b50074a4b243cc7bb2c6f4f0b59637459e904b9784b019"}
Apr 24 21:27:45.508670 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:45.508634 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal" podStartSLOduration=19.508619663 podStartE2EDuration="19.508619663s" podCreationTimestamp="2026-04-24 21:27:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:27:30.482108859 +0000 UTC m=+5.667397200" watchObservedRunningTime="2026-04-24 21:27:45.508619663 +0000 UTC m=+20.693908006"
Apr 24 21:27:45.508759 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:45.508741 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-vv85h" podStartSLOduration=11.700167328 podStartE2EDuration="20.508737137s" podCreationTimestamp="2026-04-24 21:27:25 +0000 UTC" firstStartedPulling="2026-04-24 21:27:28.011718316 +0000 UTC m=+3.197006638" lastFinishedPulling="2026-04-24 21:27:36.820288117 +0000 UTC m=+12.005576447" observedRunningTime="2026-04-24 21:27:45.508180125 +0000 UTC m=+20.693468465" watchObservedRunningTime="2026-04-24 21:27:45.508737137 +0000 UTC m=+20.694025475"
Apr 24 21:27:45.508873 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:45.508854 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7pgjs" event={"ID":"6d8e7b20-2410-4675-a443-408c37cdef11","Type":"ContainerStarted","Data":"cd3add1c6643a70d8a12c74b181399179bb000e21cf0c3ae12e1378fb1705764"}
Apr 24 21:27:45.542718 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:45.542612 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-6nn9n" podStartSLOduration=3.6674770260000003 podStartE2EDuration="20.542597523s" podCreationTimestamp="2026-04-24 21:27:25 +0000 UTC" firstStartedPulling="2026-04-24 21:27:27.982584155 +0000 UTC m=+3.167872475" lastFinishedPulling="2026-04-24 21:27:44.85770464 +0000 UTC m=+20.042992972" observedRunningTime="2026-04-24 21:27:45.542350105 +0000 UTC m=+20.727638445" watchObservedRunningTime="2026-04-24 21:27:45.542597523 +0000 UTC m=+20.727885861"
Apr 24 21:27:45.558974 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:45.558917 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-sslzj" podStartSLOduration=3.624593217 podStartE2EDuration="20.558902442s" podCreationTimestamp="2026-04-24 21:27:25 +0000 UTC" firstStartedPulling="2026-04-24 21:27:28.008160501 +0000 UTC m=+3.193448823" lastFinishedPulling="2026-04-24 21:27:44.942469717 +0000 UTC m=+20.127758048" observedRunningTime="2026-04-24 21:27:45.558493667 +0000 UTC m=+20.743782021" watchObservedRunningTime="2026-04-24 21:27:45.558902442 +0000 UTC m=+20.744190808"
Apr 24 21:27:45.575039 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:45.574996 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9lr5f" podStartSLOduration=3.7226887939999997 podStartE2EDuration="20.574982553s" podCreationTimestamp="2026-04-24 21:27:25 +0000 UTC" firstStartedPulling="2026-04-24 21:27:28.003949382 +0000 UTC m=+3.189237699" lastFinishedPulling="2026-04-24 21:27:44.856243125 +0000 UTC m=+20.041531458" observedRunningTime="2026-04-24 21:27:45.574793464 +0000 UTC m=+20.760081815" watchObservedRunningTime="2026-04-24 21:27:45.574982553 +0000 UTC m=+20.760270888"
Apr 24 21:27:45.591705 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:45.591659 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-7pgjs" podStartSLOduration=3.715488842 podStartE2EDuration="20.591645436s" podCreationTimestamp="2026-04-24 21:27:25 +0000 UTC" firstStartedPulling="2026-04-24 21:27:27.980112256 +0000 UTC m=+3.165400589" lastFinishedPulling="2026-04-24 21:27:44.856268865 +0000 UTC m=+20.041557183" observedRunningTime="2026-04-24 21:27:45.591232002 +0000 UTC m=+20.776520341" watchObservedRunningTime="2026-04-24 21:27:45.591645436 +0000 UTC m=+20.776933774"
Apr 24 21:27:46.196161 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:46.195970 2566 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 24 21:27:46.307503 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:46.307402 2566 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T21:27:46.196158144Z","UUID":"332395af-2268-4748-9efb-efe9a2414cdd","Handler":null,"Name":"","Endpoint":""}
Apr 24 21:27:46.309228 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:46.309203 2566 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 24 21:27:46.309228 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:46.309234 2566 csi_plugin.go:119] kubernetes.io/csi:
Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 21:27:46.512405 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:46.512370 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hv45x" event={"ID":"5be049ef-a2de-4653-8547-eaa092ea4f87","Type":"ContainerStarted","Data":"671bf3455d6097e7cf9e71d44ee9f91c418553f74828d9fbe9d92f42dc3759de"} Apr 24 21:27:46.513782 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:46.513752 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-cqnxb" event={"ID":"6fcca08a-b4ba-4f45-862a-1e503776cfe8","Type":"ContainerStarted","Data":"effb42fc11805ba989a3803b330fbeb6483fec1d11aefc6f012c02fb27d577e5"} Apr 24 21:27:46.516767 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:46.516745 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kvjv_e5028c4f-ef6b-4051-a2c3-1def0a14889f/ovn-acl-logging/0.log" Apr 24 21:27:46.517176 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:46.517150 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv" event={"ID":"e5028c4f-ef6b-4051-a2c3-1def0a14889f","Type":"ContainerStarted","Data":"81dfb2d30191bbd8fc3c32ce2de52da612d3c85bf0287a2521d8848b7a468391"} Apr 24 21:27:47.376042 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:47.375951 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kwjx6" Apr 24 21:27:47.376200 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:47.376086 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kwjx6" podUID="7799825c-48bb-465a-adc5-2d2c43a525df" Apr 24 21:27:47.376200 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:47.376144 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5b7z5" Apr 24 21:27:47.376318 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:47.376267 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5b7z5" podUID="9f38edd3-fb52-42bc-b164-d84e78cffcc0" Apr 24 21:27:47.521893 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:47.521830 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hv45x" event={"ID":"5be049ef-a2de-4653-8547-eaa092ea4f87","Type":"ContainerStarted","Data":"8ea25009e6cf5496e33a5698c61db445c55703d0017fe872f3420232316c4384"} Apr 24 21:27:47.539654 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:47.539603 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hv45x" podStartSLOduration=3.5843049479999998 podStartE2EDuration="22.539588627s" podCreationTimestamp="2026-04-24 21:27:25 +0000 UTC" firstStartedPulling="2026-04-24 21:27:28.011843016 +0000 UTC m=+3.197131339" lastFinishedPulling="2026-04-24 21:27:46.967126688 +0000 UTC m=+22.152415018" observedRunningTime="2026-04-24 21:27:47.539095177 +0000 UTC m=+22.724383518" watchObservedRunningTime="2026-04-24 21:27:47.539588627 +0000 UTC m=+22.724876965" Apr 24 21:27:47.539926 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:47.539901 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-operator/iptables-alerter-cqnxb" podStartSLOduration=5.6901643360000005 podStartE2EDuration="22.539894464s" podCreationTimestamp="2026-04-24 21:27:25 +0000 UTC" firstStartedPulling="2026-04-24 21:27:28.006616732 +0000 UTC m=+3.191905068" lastFinishedPulling="2026-04-24 21:27:44.856346865 +0000 UTC m=+20.041635196" observedRunningTime="2026-04-24 21:27:46.527797649 +0000 UTC m=+21.713085987" watchObservedRunningTime="2026-04-24 21:27:47.539894464 +0000 UTC m=+22.725182806" Apr 24 21:27:48.527089 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:48.527064 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kvjv_e5028c4f-ef6b-4051-a2c3-1def0a14889f/ovn-acl-logging/0.log" Apr 24 21:27:48.527698 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:48.527421 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv" event={"ID":"e5028c4f-ef6b-4051-a2c3-1def0a14889f","Type":"ContainerStarted","Data":"fcbaf197987ac56306bbb0eb94a2802e5fce407c5d81094843bbe40f10ac4b48"} Apr 24 21:27:48.561036 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:48.560971 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-vv85h" Apr 24 21:27:48.561708 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:48.561688 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-vv85h" Apr 24 21:27:49.375760 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:49.375720 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kwjx6" Apr 24 21:27:49.375925 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:49.375847 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kwjx6" podUID="7799825c-48bb-465a-adc5-2d2c43a525df" Apr 24 21:27:49.376025 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:49.375924 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5b7z5" Apr 24 21:27:49.376080 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:49.376029 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5b7z5" podUID="9f38edd3-fb52-42bc-b164-d84e78cffcc0" Apr 24 21:27:49.528841 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:49.528816 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-vv85h" Apr 24 21:27:49.529463 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:49.529445 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-vv85h" Apr 24 21:27:50.536020 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:50.535830 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kvjv_e5028c4f-ef6b-4051-a2c3-1def0a14889f/ovn-acl-logging/0.log" Apr 24 21:27:50.536592 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:50.536312 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv" event={"ID":"e5028c4f-ef6b-4051-a2c3-1def0a14889f","Type":"ContainerStarted","Data":"7333ed9159f970dd9596f0773ea9b8929d82ea2154fcca56ca61c9dd61893752"} Apr 24 21:27:50.536659 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:50.536604 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv" Apr 24 21:27:50.536659 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:50.536628 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv" Apr 24 21:27:50.536873 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:50.536851 2566 scope.go:117] "RemoveContainer" containerID="1befae1701670f7f9d578a29a6f114e2a3c6e07dc22871ae405157ac463cb74b" Apr 24 21:27:50.538011 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:50.537989 2566 generic.go:358] "Generic (PLEG): container finished" podID="abed56c4-528e-496e-b85c-a6fe11c4f6e3" containerID="08791d45bbe71d52692c959beaecfe6513c71a3e7220d0bce6205e894fe67748" exitCode=0 Apr 24 21:27:50.538087 
ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:50.538062 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-64swc" event={"ID":"abed56c4-528e-496e-b85c-a6fe11c4f6e3","Type":"ContainerDied","Data":"08791d45bbe71d52692c959beaecfe6513c71a3e7220d0bce6205e894fe67748"} Apr 24 21:27:50.554226 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:50.554204 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv" Apr 24 21:27:51.375830 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:51.375795 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5b7z5" Apr 24 21:27:51.375994 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:51.375807 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kwjx6" Apr 24 21:27:51.375994 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:51.375911 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5b7z5" podUID="9f38edd3-fb52-42bc-b164-d84e78cffcc0" Apr 24 21:27:51.375994 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:51.375972 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kwjx6" podUID="7799825c-48bb-465a-adc5-2d2c43a525df" Apr 24 21:27:51.542233 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:51.542210 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kvjv_e5028c4f-ef6b-4051-a2c3-1def0a14889f/ovn-acl-logging/0.log" Apr 24 21:27:51.542588 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:51.542504 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv" event={"ID":"e5028c4f-ef6b-4051-a2c3-1def0a14889f","Type":"ContainerStarted","Data":"7c724897c44e6882f4ea357eae80c7cba54b6bd70afa66f93f1f932636a87dd9"} Apr 24 21:27:51.543006 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:51.542987 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv" Apr 24 21:27:51.557505 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:51.557481 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv" Apr 24 21:27:51.570688 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:51.570647 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv" podStartSLOduration=9.6565105 podStartE2EDuration="26.570633002s" podCreationTimestamp="2026-04-24 21:27:25 +0000 UTC" firstStartedPulling="2026-04-24 21:27:28.004049381 +0000 UTC m=+3.189337702" lastFinishedPulling="2026-04-24 21:27:44.918171883 +0000 UTC m=+20.103460204" observedRunningTime="2026-04-24 21:27:51.569685699 +0000 UTC m=+26.754974038" watchObservedRunningTime="2026-04-24 21:27:51.570633002 +0000 UTC m=+26.755921341" Apr 24 21:27:51.725668 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:51.725497 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-kwjx6"] Apr 24 21:27:51.725808 ip-10-0-131-55 
kubenswrapper[2566]: I0424 21:27:51.725745 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kwjx6" Apr 24 21:27:51.725864 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:51.725842 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kwjx6" podUID="7799825c-48bb-465a-adc5-2d2c43a525df" Apr 24 21:27:51.729737 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:51.729713 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5b7z5"] Apr 24 21:27:51.729836 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:51.729823 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5b7z5" Apr 24 21:27:51.729917 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:51.729901 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5b7z5" podUID="9f38edd3-fb52-42bc-b164-d84e78cffcc0" Apr 24 21:27:52.546005 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:52.545951 2566 generic.go:358] "Generic (PLEG): container finished" podID="abed56c4-528e-496e-b85c-a6fe11c4f6e3" containerID="7df6fa9f0e0c0a371b1592416512d9d9623420913187bda2fa1dc3d2f89fcc48" exitCode=0 Apr 24 21:27:52.546005 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:52.545984 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-64swc" event={"ID":"abed56c4-528e-496e-b85c-a6fe11c4f6e3","Type":"ContainerDied","Data":"7df6fa9f0e0c0a371b1592416512d9d9623420913187bda2fa1dc3d2f89fcc48"} Apr 24 21:27:52.905300 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:52.905229 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9lr5f_30105400-c97c-4cc8-ac91-9f6cfe32780b/dns-node-resolver/0.log" Apr 24 21:27:53.376129 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:53.376093 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kwjx6" Apr 24 21:27:53.376280 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:53.376200 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kwjx6" podUID="7799825c-48bb-465a-adc5-2d2c43a525df" Apr 24 21:27:53.376280 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:53.376229 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5b7z5" Apr 24 21:27:53.376378 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:53.376286 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5b7z5" podUID="9f38edd3-fb52-42bc-b164-d84e78cffcc0" Apr 24 21:27:54.092486 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:54.092463 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7pgjs_6d8e7b20-2410-4675-a443-408c37cdef11/node-ca/0.log" Apr 24 21:27:54.551747 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:54.551715 2566 generic.go:358] "Generic (PLEG): container finished" podID="abed56c4-528e-496e-b85c-a6fe11c4f6e3" containerID="9bf2d4b29795b534e5631cad8ca0711d6be476802deda1740919592c17b39e42" exitCode=0 Apr 24 21:27:54.551747 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:54.551755 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-64swc" event={"ID":"abed56c4-528e-496e-b85c-a6fe11c4f6e3","Type":"ContainerDied","Data":"9bf2d4b29795b534e5631cad8ca0711d6be476802deda1740919592c17b39e42"} Apr 24 21:27:55.376429 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:55.376399 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kwjx6" Apr 24 21:27:55.377116 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:55.376511 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kwjx6" podUID="7799825c-48bb-465a-adc5-2d2c43a525df" Apr 24 21:27:55.377116 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:55.376593 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5b7z5" Apr 24 21:27:55.377116 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:55.376739 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5b7z5" podUID="9f38edd3-fb52-42bc-b164-d84e78cffcc0" Apr 24 21:27:57.376259 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:57.376199 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kwjx6" Apr 24 21:27:57.376732 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:57.376209 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5b7z5" Apr 24 21:27:57.376732 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:57.376330 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kwjx6" podUID="7799825c-48bb-465a-adc5-2d2c43a525df" Apr 24 21:27:57.376732 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:57.376406 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5b7z5" podUID="9f38edd3-fb52-42bc-b164-d84e78cffcc0" Apr 24 21:27:59.098916 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:59.098880 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f38edd3-fb52-42bc-b164-d84e78cffcc0-metrics-certs\") pod \"network-metrics-daemon-5b7z5\" (UID: \"9f38edd3-fb52-42bc-b164-d84e78cffcc0\") " pod="openshift-multus/network-metrics-daemon-5b7z5" Apr 24 21:27:59.099407 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:59.099039 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:59.099407 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:59.099120 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f38edd3-fb52-42bc-b164-d84e78cffcc0-metrics-certs podName:9f38edd3-fb52-42bc-b164-d84e78cffcc0 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:31.099102257 +0000 UTC m=+66.284390575 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f38edd3-fb52-42bc-b164-d84e78cffcc0-metrics-certs") pod "network-metrics-daemon-5b7z5" (UID: "9f38edd3-fb52-42bc-b164-d84e78cffcc0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:59.199782 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:59.199738 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xjkbt\" (UniqueName: \"kubernetes.io/projected/7799825c-48bb-465a-adc5-2d2c43a525df-kube-api-access-xjkbt\") pod \"network-check-target-kwjx6\" (UID: \"7799825c-48bb-465a-adc5-2d2c43a525df\") " pod="openshift-network-diagnostics/network-check-target-kwjx6" Apr 24 21:27:59.199979 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:59.199903 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:59.199979 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:59.199924 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:59.199979 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:59.199934 2566 projected.go:194] Error preparing data for projected volume kube-api-access-xjkbt for pod openshift-network-diagnostics/network-check-target-kwjx6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:59.200140 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:59.200000 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7799825c-48bb-465a-adc5-2d2c43a525df-kube-api-access-xjkbt podName:7799825c-48bb-465a-adc5-2d2c43a525df nodeName:}" failed. 
No retries permitted until 2026-04-24 21:28:31.199984064 +0000 UTC m=+66.385272381 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-xjkbt" (UniqueName: "kubernetes.io/projected/7799825c-48bb-465a-adc5-2d2c43a525df-kube-api-access-xjkbt") pod "network-check-target-kwjx6" (UID: "7799825c-48bb-465a-adc5-2d2c43a525df") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:59.375849 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:59.375768 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5b7z5" Apr 24 21:27:59.376040 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:27:59.375768 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kwjx6" Apr 24 21:27:59.376040 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:59.375914 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5b7z5" podUID="9f38edd3-fb52-42bc-b164-d84e78cffcc0" Apr 24 21:27:59.376040 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:27:59.375943 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kwjx6" podUID="7799825c-48bb-465a-adc5-2d2c43a525df" Apr 24 21:28:01.375810 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:01.375630 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kwjx6" Apr 24 21:28:01.376176 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:01.375645 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5b7z5" Apr 24 21:28:01.376176 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:28:01.375892 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kwjx6" podUID="7799825c-48bb-465a-adc5-2d2c43a525df" Apr 24 21:28:01.376176 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:28:01.375978 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5b7z5" podUID="9f38edd3-fb52-42bc-b164-d84e78cffcc0"
Apr 24 21:28:01.566598 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:01.566517 2566 generic.go:358] "Generic (PLEG): container finished" podID="abed56c4-528e-496e-b85c-a6fe11c4f6e3" containerID="0c3223e1f5afc80b302b17e95edff8d53202ec54e1fd175a1c3ef19ae6d89e7b" exitCode=0
Apr 24 21:28:01.566598 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:01.566577 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-64swc" event={"ID":"abed56c4-528e-496e-b85c-a6fe11c4f6e3","Type":"ContainerDied","Data":"0c3223e1f5afc80b302b17e95edff8d53202ec54e1fd175a1c3ef19ae6d89e7b"}
Apr 24 21:28:02.571805 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:02.571767 2566 generic.go:358] "Generic (PLEG): container finished" podID="abed56c4-528e-496e-b85c-a6fe11c4f6e3" containerID="2097778dace08b71cd783690494ba096c64e33d1fe7057a4c4135a5889ac0e4b" exitCode=0
Apr 24 21:28:02.572290 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:02.571817 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-64swc" event={"ID":"abed56c4-528e-496e-b85c-a6fe11c4f6e3","Type":"ContainerDied","Data":"2097778dace08b71cd783690494ba096c64e33d1fe7057a4c4135a5889ac0e4b"}
Apr 24 21:28:03.375773 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:03.375739 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5b7z5"
Apr 24 21:28:03.375998 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:03.375739 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kwjx6"
Apr 24 21:28:03.375998 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:28:03.375853 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5b7z5" podUID="9f38edd3-fb52-42bc-b164-d84e78cffcc0"
Apr 24 21:28:03.375998 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:28:03.375909 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kwjx6" podUID="7799825c-48bb-465a-adc5-2d2c43a525df"
Apr 24 21:28:03.576711 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:03.576680 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-64swc" event={"ID":"abed56c4-528e-496e-b85c-a6fe11c4f6e3","Type":"ContainerStarted","Data":"3f8266b126fe7d7020c49f6b21d008d6485522fdd3f31035a9d56781a6d7e3a2"}
Apr 24 21:28:03.599898 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:03.599849 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-64swc" podStartSLOduration=6.158595822 podStartE2EDuration="38.599834915s" podCreationTimestamp="2026-04-24 21:27:25 +0000 UTC" firstStartedPulling="2026-04-24 21:27:28.007644476 +0000 UTC m=+3.192932793" lastFinishedPulling="2026-04-24 21:28:00.448883556 +0000 UTC m=+35.634171886" observedRunningTime="2026-04-24 21:28:03.59901972 +0000 UTC m=+38.784308058" watchObservedRunningTime="2026-04-24 21:28:03.599834915 +0000 UTC m=+38.785123254"
Apr 24 21:28:05.376389 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:05.376349 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kwjx6"
Apr 24 21:28:05.376837 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:28:05.376451 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kwjx6" podUID="7799825c-48bb-465a-adc5-2d2c43a525df"
Apr 24 21:28:05.376837 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:05.376514 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5b7z5"
Apr 24 21:28:05.376837 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:28:05.376599 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5b7z5" podUID="9f38edd3-fb52-42bc-b164-d84e78cffcc0"
Apr 24 21:28:07.375835 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:07.375792 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kwjx6"
Apr 24 21:28:07.375835 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:07.375818 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5b7z5"
Apr 24 21:28:07.376379 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:28:07.375894 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kwjx6" podUID="7799825c-48bb-465a-adc5-2d2c43a525df"
Apr 24 21:28:07.376379 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:28:07.376024 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5b7z5" podUID="9f38edd3-fb52-42bc-b164-d84e78cffcc0"
Apr 24 21:28:09.375996 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:09.375938 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kwjx6"
Apr 24 21:28:09.376448 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:09.375938 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5b7z5"
Apr 24 21:28:09.376448 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:28:09.376060 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kwjx6" podUID="7799825c-48bb-465a-adc5-2d2c43a525df"
Apr 24 21:28:09.376448 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:28:09.376201 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5b7z5" podUID="9f38edd3-fb52-42bc-b164-d84e78cffcc0"
Apr 24 21:28:11.375599 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:11.375567 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kwjx6"
Apr 24 21:28:11.376065 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:11.375571 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5b7z5"
Apr 24 21:28:11.376065 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:28:11.375666 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kwjx6" podUID="7799825c-48bb-465a-adc5-2d2c43a525df"
Apr 24 21:28:11.376065 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:28:11.375758 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5b7z5" podUID="9f38edd3-fb52-42bc-b164-d84e78cffcc0"
Apr 24 21:28:13.375872 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:13.375832 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5b7z5"
Apr 24 21:28:13.375872 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:13.375862 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kwjx6"
Apr 24 21:28:13.376449 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:28:13.375992 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5b7z5" podUID="9f38edd3-fb52-42bc-b164-d84e78cffcc0"
Apr 24 21:28:13.376449 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:28:13.376065 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kwjx6" podUID="7799825c-48bb-465a-adc5-2d2c43a525df"
Apr 24 21:28:15.376185 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:15.376148 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kwjx6"
Apr 24 21:28:15.376680 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:28:15.376243 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kwjx6" podUID="7799825c-48bb-465a-adc5-2d2c43a525df"
Apr 24 21:28:15.376680 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:15.376271 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5b7z5"
Apr 24 21:28:15.376680 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:28:15.376360 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5b7z5" podUID="9f38edd3-fb52-42bc-b164-d84e78cffcc0"
Apr 24 21:28:17.141914 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.141883 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-55.ec2.internal" event="NodeReady"
Apr 24 21:28:17.142407 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.142045 2566 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 24 21:28:17.188500 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.188470 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-847fd97c69-9sv27"]
Apr 24 21:28:17.193138 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.193120 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-847fd97c69-9sv27"
Apr 24 21:28:17.195039 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.195015 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 24 21:28:17.195435 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.195417 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 24 21:28:17.195528 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.195440 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-ttx96\""
Apr 24 21:28:17.198365 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.198341 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 24 21:28:17.202129 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.202112 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 24 21:28:17.209261 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.209233 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-847fd97c69-9sv27"]
Apr 24 21:28:17.225280 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.225253 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-hwj8d"]
Apr 24 21:28:17.228154 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.228134 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hwj8d"
Apr 24 21:28:17.230117 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.230095 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 24 21:28:17.230221 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.230095 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 24 21:28:17.230221 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.230183 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-m29p7\""
Apr 24 21:28:17.233380 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.233360 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/64b8fe7c-275c-42d9-88e4-e27695b15732-image-registry-private-configuration\") pod \"image-registry-847fd97c69-9sv27\" (UID: \"64b8fe7c-275c-42d9-88e4-e27695b15732\") " pod="openshift-image-registry/image-registry-847fd97c69-9sv27"
Apr 24 21:28:17.233469 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.233387 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/64b8fe7c-275c-42d9-88e4-e27695b15732-ca-trust-extracted\") pod \"image-registry-847fd97c69-9sv27\" (UID: \"64b8fe7c-275c-42d9-88e4-e27695b15732\") " pod="openshift-image-registry/image-registry-847fd97c69-9sv27"
Apr 24 21:28:17.233469 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.233415 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/64b8fe7c-275c-42d9-88e4-e27695b15732-installation-pull-secrets\") pod \"image-registry-847fd97c69-9sv27\" (UID: \"64b8fe7c-275c-42d9-88e4-e27695b15732\") " pod="openshift-image-registry/image-registry-847fd97c69-9sv27"
Apr 24 21:28:17.233469 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.233447 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmlft\" (UniqueName: \"kubernetes.io/projected/64b8fe7c-275c-42d9-88e4-e27695b15732-kube-api-access-dmlft\") pod \"image-registry-847fd97c69-9sv27\" (UID: \"64b8fe7c-275c-42d9-88e4-e27695b15732\") " pod="openshift-image-registry/image-registry-847fd97c69-9sv27"
Apr 24 21:28:17.233590 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.233469 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzzqm\" (UniqueName: \"kubernetes.io/projected/5b9e2681-cf55-4344-bdcc-7a3176e775c3-kube-api-access-rzzqm\") pod \"dns-default-hwj8d\" (UID: \"5b9e2681-cf55-4344-bdcc-7a3176e775c3\") " pod="openshift-dns/dns-default-hwj8d"
Apr 24 21:28:17.233590 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.233512 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/64b8fe7c-275c-42d9-88e4-e27695b15732-registry-tls\") pod \"image-registry-847fd97c69-9sv27\" (UID: \"64b8fe7c-275c-42d9-88e4-e27695b15732\") " pod="openshift-image-registry/image-registry-847fd97c69-9sv27"
Apr 24 21:28:17.233590 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.233557 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/64b8fe7c-275c-42d9-88e4-e27695b15732-trusted-ca\") pod \"image-registry-847fd97c69-9sv27\" (UID: \"64b8fe7c-275c-42d9-88e4-e27695b15732\") " pod="openshift-image-registry/image-registry-847fd97c69-9sv27"
Apr 24 21:28:17.233590 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.233579 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b9e2681-cf55-4344-bdcc-7a3176e775c3-config-volume\") pod \"dns-default-hwj8d\" (UID: \"5b9e2681-cf55-4344-bdcc-7a3176e775c3\") " pod="openshift-dns/dns-default-hwj8d"
Apr 24 21:28:17.233725 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.233605 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/64b8fe7c-275c-42d9-88e4-e27695b15732-bound-sa-token\") pod \"image-registry-847fd97c69-9sv27\" (UID: \"64b8fe7c-275c-42d9-88e4-e27695b15732\") " pod="openshift-image-registry/image-registry-847fd97c69-9sv27"
Apr 24 21:28:17.233725 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.233622 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/64b8fe7c-275c-42d9-88e4-e27695b15732-registry-certificates\") pod \"image-registry-847fd97c69-9sv27\" (UID: \"64b8fe7c-275c-42d9-88e4-e27695b15732\") " pod="openshift-image-registry/image-registry-847fd97c69-9sv27"
Apr 24 21:28:17.233725 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.233640 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5b9e2681-cf55-4344-bdcc-7a3176e775c3-tmp-dir\") pod \"dns-default-hwj8d\" (UID: \"5b9e2681-cf55-4344-bdcc-7a3176e775c3\") " pod="openshift-dns/dns-default-hwj8d"
Apr 24 21:28:17.233725 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.233660 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b9e2681-cf55-4344-bdcc-7a3176e775c3-metrics-tls\") pod \"dns-default-hwj8d\" (UID: \"5b9e2681-cf55-4344-bdcc-7a3176e775c3\") " pod="openshift-dns/dns-default-hwj8d"
Apr 24 21:28:17.239204 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.239183 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hwj8d"]
Apr 24 21:28:17.322539 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.322502 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-fgkjf"]
Apr 24 21:28:17.325816 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.325794 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-6276r"]
Apr 24 21:28:17.325979 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.325945 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fgkjf"
Apr 24 21:28:17.327828 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.327807 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-j9cg5\""
Apr 24 21:28:17.327828 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.327813 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 24 21:28:17.327976 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.327844 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 24 21:28:17.327976 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.327847 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 24 21:28:17.328674 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.328657 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6276r"
Apr 24 21:28:17.330531 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.330512 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-hhbdt\""
Apr 24 21:28:17.330687 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.330672 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 24 21:28:17.330779 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.330765 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 24 21:28:17.330830 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.330779 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 24 21:28:17.330906 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.330883 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 24 21:28:17.333949 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.333931 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rzzqm\" (UniqueName: \"kubernetes.io/projected/5b9e2681-cf55-4344-bdcc-7a3176e775c3-kube-api-access-rzzqm\") pod \"dns-default-hwj8d\" (UID: \"5b9e2681-cf55-4344-bdcc-7a3176e775c3\") " pod="openshift-dns/dns-default-hwj8d"
Apr 24 21:28:17.334046 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.333982 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a273a8f0-e8c5-4a44-a5c8-ae8cd3910e9c-data-volume\") pod \"insights-runtime-extractor-6276r\" (UID: \"a273a8f0-e8c5-4a44-a5c8-ae8cd3910e9c\") " pod="openshift-insights/insights-runtime-extractor-6276r"
Apr 24 21:28:17.334046 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.334032 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a273a8f0-e8c5-4a44-a5c8-ae8cd3910e9c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6276r\" (UID: \"a273a8f0-e8c5-4a44-a5c8-ae8cd3910e9c\") " pod="openshift-insights/insights-runtime-extractor-6276r"
Apr 24 21:28:17.334116 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.334069 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89t94\" (UniqueName: \"kubernetes.io/projected/117a76ab-9714-4cfb-a82c-ed5796386584-kube-api-access-89t94\") pod \"ingress-canary-fgkjf\" (UID: \"117a76ab-9714-4cfb-a82c-ed5796386584\") " pod="openshift-ingress-canary/ingress-canary-fgkjf"
Apr 24 21:28:17.334156 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.334112 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/64b8fe7c-275c-42d9-88e4-e27695b15732-registry-tls\") pod \"image-registry-847fd97c69-9sv27\" (UID: \"64b8fe7c-275c-42d9-88e4-e27695b15732\") " pod="openshift-image-registry/image-registry-847fd97c69-9sv27"
Apr 24 21:28:17.334243 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.334227 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/64b8fe7c-275c-42d9-88e4-e27695b15732-trusted-ca\") pod \"image-registry-847fd97c69-9sv27\" (UID: \"64b8fe7c-275c-42d9-88e4-e27695b15732\") " pod="openshift-image-registry/image-registry-847fd97c69-9sv27"
Apr 24 21:28:17.334301 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.334265 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b9e2681-cf55-4344-bdcc-7a3176e775c3-config-volume\") pod \"dns-default-hwj8d\" (UID: \"5b9e2681-cf55-4344-bdcc-7a3176e775c3\") " pod="openshift-dns/dns-default-hwj8d"
Apr 24 21:28:17.334406 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.334297 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/117a76ab-9714-4cfb-a82c-ed5796386584-cert\") pod \"ingress-canary-fgkjf\" (UID: \"117a76ab-9714-4cfb-a82c-ed5796386584\") " pod="openshift-ingress-canary/ingress-canary-fgkjf"
Apr 24 21:28:17.334406 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.334329 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/64b8fe7c-275c-42d9-88e4-e27695b15732-bound-sa-token\") pod \"image-registry-847fd97c69-9sv27\" (UID: \"64b8fe7c-275c-42d9-88e4-e27695b15732\") " pod="openshift-image-registry/image-registry-847fd97c69-9sv27"
Apr 24 21:28:17.334780 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.334751 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/64b8fe7c-275c-42d9-88e4-e27695b15732-registry-certificates\") pod \"image-registry-847fd97c69-9sv27\" (UID: \"64b8fe7c-275c-42d9-88e4-e27695b15732\") " pod="openshift-image-registry/image-registry-847fd97c69-9sv27"
Apr 24 21:28:17.334870 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.334810 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv96x\" (UniqueName: \"kubernetes.io/projected/a273a8f0-e8c5-4a44-a5c8-ae8cd3910e9c-kube-api-access-xv96x\") pod \"insights-runtime-extractor-6276r\" (UID: \"a273a8f0-e8c5-4a44-a5c8-ae8cd3910e9c\") " pod="openshift-insights/insights-runtime-extractor-6276r"
Apr 24 21:28:17.334870 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.334856 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5b9e2681-cf55-4344-bdcc-7a3176e775c3-tmp-dir\") pod \"dns-default-hwj8d\" (UID: \"5b9e2681-cf55-4344-bdcc-7a3176e775c3\") " pod="openshift-dns/dns-default-hwj8d"
Apr 24 21:28:17.334994 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.334895 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b9e2681-cf55-4344-bdcc-7a3176e775c3-metrics-tls\") pod \"dns-default-hwj8d\" (UID: \"5b9e2681-cf55-4344-bdcc-7a3176e775c3\") " pod="openshift-dns/dns-default-hwj8d"
Apr 24 21:28:17.334994 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.334937 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a273a8f0-e8c5-4a44-a5c8-ae8cd3910e9c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6276r\" (UID: \"a273a8f0-e8c5-4a44-a5c8-ae8cd3910e9c\") " pod="openshift-insights/insights-runtime-extractor-6276r"
Apr 24 21:28:17.335095 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.334995 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/64b8fe7c-275c-42d9-88e4-e27695b15732-image-registry-private-configuration\") pod \"image-registry-847fd97c69-9sv27\" (UID: \"64b8fe7c-275c-42d9-88e4-e27695b15732\") " pod="openshift-image-registry/image-registry-847fd97c69-9sv27"
Apr 24 21:28:17.335188 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.335163 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b9e2681-cf55-4344-bdcc-7a3176e775c3-config-volume\") pod \"dns-default-hwj8d\" (UID: \"5b9e2681-cf55-4344-bdcc-7a3176e775c3\") " pod="openshift-dns/dns-default-hwj8d"
Apr 24 21:28:17.335236 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.335034 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/64b8fe7c-275c-42d9-88e4-e27695b15732-ca-trust-extracted\") pod \"image-registry-847fd97c69-9sv27\" (UID: \"64b8fe7c-275c-42d9-88e4-e27695b15732\") " pod="openshift-image-registry/image-registry-847fd97c69-9sv27"
Apr 24 21:28:17.335287 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.335239 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a273a8f0-e8c5-4a44-a5c8-ae8cd3910e9c-crio-socket\") pod \"insights-runtime-extractor-6276r\" (UID: \"a273a8f0-e8c5-4a44-a5c8-ae8cd3910e9c\") " pod="openshift-insights/insights-runtime-extractor-6276r"
Apr 24 21:28:17.335335 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.335284 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5b9e2681-cf55-4344-bdcc-7a3176e775c3-tmp-dir\") pod \"dns-default-hwj8d\" (UID: \"5b9e2681-cf55-4344-bdcc-7a3176e775c3\") " pod="openshift-dns/dns-default-hwj8d"
Apr 24 21:28:17.335335 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.335287 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/64b8fe7c-275c-42d9-88e4-e27695b15732-installation-pull-secrets\") pod \"image-registry-847fd97c69-9sv27\" (UID: \"64b8fe7c-275c-42d9-88e4-e27695b15732\") " pod="openshift-image-registry/image-registry-847fd97c69-9sv27"
Apr 24 21:28:17.335429 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.335398 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dmlft\" (UniqueName: \"kubernetes.io/projected/64b8fe7c-275c-42d9-88e4-e27695b15732-kube-api-access-dmlft\") pod \"image-registry-847fd97c69-9sv27\" (UID: \"64b8fe7c-275c-42d9-88e4-e27695b15732\") " pod="openshift-image-registry/image-registry-847fd97c69-9sv27"
Apr 24 21:28:17.335658 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.335636 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/64b8fe7c-275c-42d9-88e4-e27695b15732-trusted-ca\") pod \"image-registry-847fd97c69-9sv27\" (UID: \"64b8fe7c-275c-42d9-88e4-e27695b15732\") " pod="openshift-image-registry/image-registry-847fd97c69-9sv27"
Apr 24 21:28:17.335709 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.335638 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/64b8fe7c-275c-42d9-88e4-e27695b15732-ca-trust-extracted\") pod \"image-registry-847fd97c69-9sv27\" (UID: \"64b8fe7c-275c-42d9-88e4-e27695b15732\") " pod="openshift-image-registry/image-registry-847fd97c69-9sv27"
Apr 24 21:28:17.335709 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.335638 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/64b8fe7c-275c-42d9-88e4-e27695b15732-registry-certificates\") pod \"image-registry-847fd97c69-9sv27\" (UID: \"64b8fe7c-275c-42d9-88e4-e27695b15732\") " pod="openshift-image-registry/image-registry-847fd97c69-9sv27"
Apr 24 21:28:17.341182 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.340262 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b9e2681-cf55-4344-bdcc-7a3176e775c3-metrics-tls\") pod \"dns-default-hwj8d\" (UID: \"5b9e2681-cf55-4344-bdcc-7a3176e775c3\") " pod="openshift-dns/dns-default-hwj8d"
Apr 24 21:28:17.341182 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.341140 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6276r"]
Apr 24 21:28:17.341367 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.341210 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fgkjf"]
Apr 24 21:28:17.342653 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.342627 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/64b8fe7c-275c-42d9-88e4-e27695b15732-image-registry-private-configuration\") pod \"image-registry-847fd97c69-9sv27\" (UID: \"64b8fe7c-275c-42d9-88e4-e27695b15732\") " pod="openshift-image-registry/image-registry-847fd97c69-9sv27"
Apr 24 21:28:17.343709 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.343689 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/64b8fe7c-275c-42d9-88e4-e27695b15732-installation-pull-secrets\") pod \"image-registry-847fd97c69-9sv27\" (UID: \"64b8fe7c-275c-42d9-88e4-e27695b15732\") " pod="openshift-image-registry/image-registry-847fd97c69-9sv27"
Apr 24 21:28:17.343835 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.343819 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/64b8fe7c-275c-42d9-88e4-e27695b15732-registry-tls\") pod \"image-registry-847fd97c69-9sv27\" (UID: \"64b8fe7c-275c-42d9-88e4-e27695b15732\") " pod="openshift-image-registry/image-registry-847fd97c69-9sv27"
Apr 24 21:28:17.351101 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.351079 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzzqm\" (UniqueName: \"kubernetes.io/projected/5b9e2681-cf55-4344-bdcc-7a3176e775c3-kube-api-access-rzzqm\") pod \"dns-default-hwj8d\" (UID: \"5b9e2681-cf55-4344-bdcc-7a3176e775c3\") " pod="openshift-dns/dns-default-hwj8d"
Apr 24 21:28:17.354296 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.354270 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/64b8fe7c-275c-42d9-88e4-e27695b15732-bound-sa-token\") pod \"image-registry-847fd97c69-9sv27\" (UID: \"64b8fe7c-275c-42d9-88e4-e27695b15732\") " pod="openshift-image-registry/image-registry-847fd97c69-9sv27"
Apr 24 21:28:17.354415 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.354323 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmlft\" (UniqueName: \"kubernetes.io/projected/64b8fe7c-275c-42d9-88e4-e27695b15732-kube-api-access-dmlft\") pod \"image-registry-847fd97c69-9sv27\" (UID: \"64b8fe7c-275c-42d9-88e4-e27695b15732\") " pod="openshift-image-registry/image-registry-847fd97c69-9sv27"
Apr 24 21:28:17.376015 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.375985 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kwjx6"
Apr 24 21:28:17.376200 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.376180 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5b7z5"
Apr 24 21:28:17.377820 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.377799 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 21:28:17.377922 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.377867 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-xwzc6\""
Apr 24 21:28:17.378000 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.377799 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 21:28:17.378180 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.378159 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 21:28:17.378288 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.378211 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tshq6\""
Apr 24 21:28:17.436633 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.436598 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/117a76ab-9714-4cfb-a82c-ed5796386584-cert\") pod \"ingress-canary-fgkjf\" (UID: \"117a76ab-9714-4cfb-a82c-ed5796386584\") " pod="openshift-ingress-canary/ingress-canary-fgkjf"
Apr 24 21:28:17.436819 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.436645 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xv96x\" (UniqueName: \"kubernetes.io/projected/a273a8f0-e8c5-4a44-a5c8-ae8cd3910e9c-kube-api-access-xv96x\") pod \"insights-runtime-extractor-6276r\" (UID: \"a273a8f0-e8c5-4a44-a5c8-ae8cd3910e9c\") " pod="openshift-insights/insights-runtime-extractor-6276r"
Apr 24
21:28:17.436819 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.436680 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a273a8f0-e8c5-4a44-a5c8-ae8cd3910e9c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6276r\" (UID: \"a273a8f0-e8c5-4a44-a5c8-ae8cd3910e9c\") " pod="openshift-insights/insights-runtime-extractor-6276r" Apr 24 21:28:17.436819 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.436723 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a273a8f0-e8c5-4a44-a5c8-ae8cd3910e9c-crio-socket\") pod \"insights-runtime-extractor-6276r\" (UID: \"a273a8f0-e8c5-4a44-a5c8-ae8cd3910e9c\") " pod="openshift-insights/insights-runtime-extractor-6276r" Apr 24 21:28:17.436819 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.436750 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a273a8f0-e8c5-4a44-a5c8-ae8cd3910e9c-data-volume\") pod \"insights-runtime-extractor-6276r\" (UID: \"a273a8f0-e8c5-4a44-a5c8-ae8cd3910e9c\") " pod="openshift-insights/insights-runtime-extractor-6276r" Apr 24 21:28:17.436819 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.436773 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a273a8f0-e8c5-4a44-a5c8-ae8cd3910e9c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6276r\" (UID: \"a273a8f0-e8c5-4a44-a5c8-ae8cd3910e9c\") " pod="openshift-insights/insights-runtime-extractor-6276r" Apr 24 21:28:17.436819 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.436806 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-89t94\" (UniqueName: 
\"kubernetes.io/projected/117a76ab-9714-4cfb-a82c-ed5796386584-kube-api-access-89t94\") pod \"ingress-canary-fgkjf\" (UID: \"117a76ab-9714-4cfb-a82c-ed5796386584\") " pod="openshift-ingress-canary/ingress-canary-fgkjf" Apr 24 21:28:17.437131 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.437114 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a273a8f0-e8c5-4a44-a5c8-ae8cd3910e9c-crio-socket\") pod \"insights-runtime-extractor-6276r\" (UID: \"a273a8f0-e8c5-4a44-a5c8-ae8cd3910e9c\") " pod="openshift-insights/insights-runtime-extractor-6276r" Apr 24 21:28:17.437502 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.437475 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a273a8f0-e8c5-4a44-a5c8-ae8cd3910e9c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6276r\" (UID: \"a273a8f0-e8c5-4a44-a5c8-ae8cd3910e9c\") " pod="openshift-insights/insights-runtime-extractor-6276r" Apr 24 21:28:17.437590 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.437522 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a273a8f0-e8c5-4a44-a5c8-ae8cd3910e9c-data-volume\") pod \"insights-runtime-extractor-6276r\" (UID: \"a273a8f0-e8c5-4a44-a5c8-ae8cd3910e9c\") " pod="openshift-insights/insights-runtime-extractor-6276r" Apr 24 21:28:17.439074 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.439052 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a273a8f0-e8c5-4a44-a5c8-ae8cd3910e9c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6276r\" (UID: \"a273a8f0-e8c5-4a44-a5c8-ae8cd3910e9c\") " pod="openshift-insights/insights-runtime-extractor-6276r" Apr 24 21:28:17.439213 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.439197 2566 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/117a76ab-9714-4cfb-a82c-ed5796386584-cert\") pod \"ingress-canary-fgkjf\" (UID: \"117a76ab-9714-4cfb-a82c-ed5796386584\") " pod="openshift-ingress-canary/ingress-canary-fgkjf" Apr 24 21:28:17.446692 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.446660 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv96x\" (UniqueName: \"kubernetes.io/projected/a273a8f0-e8c5-4a44-a5c8-ae8cd3910e9c-kube-api-access-xv96x\") pod \"insights-runtime-extractor-6276r\" (UID: \"a273a8f0-e8c5-4a44-a5c8-ae8cd3910e9c\") " pod="openshift-insights/insights-runtime-extractor-6276r" Apr 24 21:28:17.446863 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.446847 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-89t94\" (UniqueName: \"kubernetes.io/projected/117a76ab-9714-4cfb-a82c-ed5796386584-kube-api-access-89t94\") pod \"ingress-canary-fgkjf\" (UID: \"117a76ab-9714-4cfb-a82c-ed5796386584\") " pod="openshift-ingress-canary/ingress-canary-fgkjf" Apr 24 21:28:17.502745 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.502709 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-847fd97c69-9sv27" Apr 24 21:28:17.536719 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.536675 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hwj8d" Apr 24 21:28:17.641559 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.641529 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fgkjf" Apr 24 21:28:17.645503 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.645472 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-847fd97c69-9sv27"] Apr 24 21:28:17.649498 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:28:17.649382 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64b8fe7c_275c_42d9_88e4_e27695b15732.slice/crio-4f3a6024e13b9b404e0f454f03f3eb408602a589c1dabf934ac8373b84af384c WatchSource:0}: Error finding container 4f3a6024e13b9b404e0f454f03f3eb408602a589c1dabf934ac8373b84af384c: Status 404 returned error can't find the container with id 4f3a6024e13b9b404e0f454f03f3eb408602a589c1dabf934ac8373b84af384c Apr 24 21:28:17.662474 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.662443 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hwj8d"] Apr 24 21:28:17.664911 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.664700 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6276r" Apr 24 21:28:17.667520 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:28:17.667479 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b9e2681_cf55_4344_bdcc_7a3176e775c3.slice/crio-79be240226264e20286b35f423a25ba3815457c6eb4c9b770b6147c1deaff7d4 WatchSource:0}: Error finding container 79be240226264e20286b35f423a25ba3815457c6eb4c9b770b6147c1deaff7d4: Status 404 returned error can't find the container with id 79be240226264e20286b35f423a25ba3815457c6eb4c9b770b6147c1deaff7d4 Apr 24 21:28:17.776323 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.776291 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fgkjf"] Apr 24 21:28:17.779688 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:28:17.779660 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod117a76ab_9714_4cfb_a82c_ed5796386584.slice/crio-3414cc1a87e7e342c3e676f159d4d53e4fe3babca71ed313c7f2d5106fbe1cd6 WatchSource:0}: Error finding container 3414cc1a87e7e342c3e676f159d4d53e4fe3babca71ed313c7f2d5106fbe1cd6: Status 404 returned error can't find the container with id 3414cc1a87e7e342c3e676f159d4d53e4fe3babca71ed313c7f2d5106fbe1cd6 Apr 24 21:28:17.796527 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:17.796504 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6276r"] Apr 24 21:28:17.800638 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:28:17.800615 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda273a8f0_e8c5_4a44_a5c8_ae8cd3910e9c.slice/crio-b25a94cad1c1f7d7caa6249f19bd3f16b65e08054480b6e3b948dc6dca25cdc8 WatchSource:0}: Error finding container 
b25a94cad1c1f7d7caa6249f19bd3f16b65e08054480b6e3b948dc6dca25cdc8: Status 404 returned error can't find the container with id b25a94cad1c1f7d7caa6249f19bd3f16b65e08054480b6e3b948dc6dca25cdc8 Apr 24 21:28:18.608774 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:18.608690 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hwj8d" event={"ID":"5b9e2681-cf55-4344-bdcc-7a3176e775c3","Type":"ContainerStarted","Data":"79be240226264e20286b35f423a25ba3815457c6eb4c9b770b6147c1deaff7d4"} Apr 24 21:28:18.610481 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:18.610448 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-847fd97c69-9sv27" event={"ID":"64b8fe7c-275c-42d9-88e4-e27695b15732","Type":"ContainerStarted","Data":"56aac6eb1e3518fc90b3524500feb8d5245605a48e688cc08bc9c64360edf90e"} Apr 24 21:28:18.610638 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:18.610489 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-847fd97c69-9sv27" event={"ID":"64b8fe7c-275c-42d9-88e4-e27695b15732","Type":"ContainerStarted","Data":"4f3a6024e13b9b404e0f454f03f3eb408602a589c1dabf934ac8373b84af384c"} Apr 24 21:28:18.610638 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:18.610558 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-847fd97c69-9sv27" Apr 24 21:28:18.612007 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:18.611981 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6276r" event={"ID":"a273a8f0-e8c5-4a44-a5c8-ae8cd3910e9c","Type":"ContainerStarted","Data":"a5e7ecce0608dab60ddad32e9408ba87534a04d09726c97ea024d56cbb2f84ba"} Apr 24 21:28:18.612120 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:18.612016 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6276r" 
event={"ID":"a273a8f0-e8c5-4a44-a5c8-ae8cd3910e9c","Type":"ContainerStarted","Data":"b25a94cad1c1f7d7caa6249f19bd3f16b65e08054480b6e3b948dc6dca25cdc8"} Apr 24 21:28:18.613137 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:18.613107 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fgkjf" event={"ID":"117a76ab-9714-4cfb-a82c-ed5796386584","Type":"ContainerStarted","Data":"3414cc1a87e7e342c3e676f159d4d53e4fe3babca71ed313c7f2d5106fbe1cd6"} Apr 24 21:28:18.629846 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:18.629788 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-847fd97c69-9sv27" podStartSLOduration=2.629767558 podStartE2EDuration="2.629767558s" podCreationTimestamp="2026-04-24 21:28:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:28:18.629337111 +0000 UTC m=+53.814625451" watchObservedRunningTime="2026-04-24 21:28:18.629767558 +0000 UTC m=+53.815055889" Apr 24 21:28:19.620221 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:19.620180 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hwj8d" event={"ID":"5b9e2681-cf55-4344-bdcc-7a3176e775c3","Type":"ContainerStarted","Data":"afcd9e956b6cf703d52d362f692bb4b7b97e2dd4e9f609a6d4b577aeef1b9e65"} Apr 24 21:28:19.623759 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:19.623731 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6276r" event={"ID":"a273a8f0-e8c5-4a44-a5c8-ae8cd3910e9c","Type":"ContainerStarted","Data":"570a4569bffe8ce856b9f3689331fa719cb3dd1fc2a0c6c7fdb8ab6aac31f576"} Apr 24 21:28:19.628577 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:19.628536 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fgkjf" 
event={"ID":"117a76ab-9714-4cfb-a82c-ed5796386584","Type":"ContainerStarted","Data":"571adf90d2be4f08ad93585693a4519d5b2706bfbfdaf8ab5d7f93742cc0ab66"} Apr 24 21:28:19.647239 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:19.647180 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-fgkjf" podStartSLOduration=0.956326733 podStartE2EDuration="2.647163299s" podCreationTimestamp="2026-04-24 21:28:17 +0000 UTC" firstStartedPulling="2026-04-24 21:28:17.781736761 +0000 UTC m=+52.967025078" lastFinishedPulling="2026-04-24 21:28:19.472573327 +0000 UTC m=+54.657861644" observedRunningTime="2026-04-24 21:28:19.646984164 +0000 UTC m=+54.832272502" watchObservedRunningTime="2026-04-24 21:28:19.647163299 +0000 UTC m=+54.832451638" Apr 24 21:28:20.633043 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:20.632997 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hwj8d" event={"ID":"5b9e2681-cf55-4344-bdcc-7a3176e775c3","Type":"ContainerStarted","Data":"9b806b099e9e7c579d83bf0ac08a057d16b812da126f88bf8d608db664ca8553"} Apr 24 21:28:20.650125 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:20.649865 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-hwj8d" podStartSLOduration=1.851273998 podStartE2EDuration="3.649850984s" podCreationTimestamp="2026-04-24 21:28:17 +0000 UTC" firstStartedPulling="2026-04-24 21:28:17.669854126 +0000 UTC m=+52.855142449" lastFinishedPulling="2026-04-24 21:28:19.468431107 +0000 UTC m=+54.653719435" observedRunningTime="2026-04-24 21:28:20.64963209 +0000 UTC m=+55.834920430" watchObservedRunningTime="2026-04-24 21:28:20.649850984 +0000 UTC m=+55.835139332" Apr 24 21:28:21.637499 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:21.637466 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6276r" 
event={"ID":"a273a8f0-e8c5-4a44-a5c8-ae8cd3910e9c","Type":"ContainerStarted","Data":"605334a35fb546cf720b4e0edf540fac85d9e843aeff13750219e7fe7d32aef8"} Apr 24 21:28:21.637915 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:21.637706 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-hwj8d" Apr 24 21:28:21.655541 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:21.655480 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-6276r" podStartSLOduration=1.775250121 podStartE2EDuration="4.655459886s" podCreationTimestamp="2026-04-24 21:28:17 +0000 UTC" firstStartedPulling="2026-04-24 21:28:17.875535588 +0000 UTC m=+53.060823911" lastFinishedPulling="2026-04-24 21:28:20.755745345 +0000 UTC m=+55.941033676" observedRunningTime="2026-04-24 21:28:21.655011011 +0000 UTC m=+56.840299349" watchObservedRunningTime="2026-04-24 21:28:21.655459886 +0000 UTC m=+56.840748233" Apr 24 21:28:23.558336 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:23.558308 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7kvjv" Apr 24 21:28:25.348401 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.348367 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-2gdkt"] Apr 24 21:28:25.352580 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.352563 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2gdkt" Apr 24 21:28:25.355319 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.355299 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 24 21:28:25.355498 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.355481 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 24 21:28:25.355587 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.355545 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-4tzgt\"" Apr 24 21:28:25.355634 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.355589 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 21:28:25.359016 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.356301 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 21:28:25.362244 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.362226 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 21:28:25.369174 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.369151 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-2gdkt"] Apr 24 21:28:25.395628 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.395599 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b960a6d8-6c0d-4f69-b153-4c3176aa6145-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-2gdkt\" (UID: 
\"b960a6d8-6c0d-4f69-b153-4c3176aa6145\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2gdkt" Apr 24 21:28:25.395780 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.395637 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b960a6d8-6c0d-4f69-b153-4c3176aa6145-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-2gdkt\" (UID: \"b960a6d8-6c0d-4f69-b153-4c3176aa6145\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2gdkt" Apr 24 21:28:25.395780 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.395702 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b960a6d8-6c0d-4f69-b153-4c3176aa6145-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-2gdkt\" (UID: \"b960a6d8-6c0d-4f69-b153-4c3176aa6145\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2gdkt" Apr 24 21:28:25.395780 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.395725 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbqxf\" (UniqueName: \"kubernetes.io/projected/b960a6d8-6c0d-4f69-b153-4c3176aa6145-kube-api-access-lbqxf\") pod \"openshift-state-metrics-9d44df66c-2gdkt\" (UID: \"b960a6d8-6c0d-4f69-b153-4c3176aa6145\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2gdkt" Apr 24 21:28:25.408653 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.408624 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-r2tp8"] Apr 24 21:28:25.411570 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.411550 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-r2tp8" Apr 24 21:28:25.413600 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.413575 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 21:28:25.413757 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.413738 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 21:28:25.413835 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.413738 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-5pldf\"" Apr 24 21:28:25.414095 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.414077 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 21:28:25.496693 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.496656 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/79217fe6-125e-4107-b954-edf725cbf5a4-node-exporter-tls\") pod \"node-exporter-r2tp8\" (UID: \"79217fe6-125e-4107-b954-edf725cbf5a4\") " pod="openshift-monitoring/node-exporter-r2tp8" Apr 24 21:28:25.496693 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.496695 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/79217fe6-125e-4107-b954-edf725cbf5a4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-r2tp8\" (UID: \"79217fe6-125e-4107-b954-edf725cbf5a4\") " pod="openshift-monitoring/node-exporter-r2tp8" Apr 24 21:28:25.496895 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.496712 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/79217fe6-125e-4107-b954-edf725cbf5a4-root\") pod \"node-exporter-r2tp8\" (UID: \"79217fe6-125e-4107-b954-edf725cbf5a4\") " pod="openshift-monitoring/node-exporter-r2tp8" Apr 24 21:28:25.496895 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.496761 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/79217fe6-125e-4107-b954-edf725cbf5a4-metrics-client-ca\") pod \"node-exporter-r2tp8\" (UID: \"79217fe6-125e-4107-b954-edf725cbf5a4\") " pod="openshift-monitoring/node-exporter-r2tp8" Apr 24 21:28:25.496895 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.496836 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/79217fe6-125e-4107-b954-edf725cbf5a4-node-exporter-accelerators-collector-config\") pod \"node-exporter-r2tp8\" (UID: \"79217fe6-125e-4107-b954-edf725cbf5a4\") " pod="openshift-monitoring/node-exporter-r2tp8" Apr 24 21:28:25.496895 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.496870 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z68qt\" (UniqueName: \"kubernetes.io/projected/79217fe6-125e-4107-b954-edf725cbf5a4-kube-api-access-z68qt\") pod \"node-exporter-r2tp8\" (UID: \"79217fe6-125e-4107-b954-edf725cbf5a4\") " pod="openshift-monitoring/node-exporter-r2tp8" Apr 24 21:28:25.497043 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.496896 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b960a6d8-6c0d-4f69-b153-4c3176aa6145-openshift-state-metrics-kube-rbac-proxy-config\") pod 
\"openshift-state-metrics-9d44df66c-2gdkt\" (UID: \"b960a6d8-6c0d-4f69-b153-4c3176aa6145\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2gdkt" Apr 24 21:28:25.497043 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.496913 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lbqxf\" (UniqueName: \"kubernetes.io/projected/b960a6d8-6c0d-4f69-b153-4c3176aa6145-kube-api-access-lbqxf\") pod \"openshift-state-metrics-9d44df66c-2gdkt\" (UID: \"b960a6d8-6c0d-4f69-b153-4c3176aa6145\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2gdkt" Apr 24 21:28:25.497043 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.496942 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/79217fe6-125e-4107-b954-edf725cbf5a4-node-exporter-textfile\") pod \"node-exporter-r2tp8\" (UID: \"79217fe6-125e-4107-b954-edf725cbf5a4\") " pod="openshift-monitoring/node-exporter-r2tp8" Apr 24 21:28:25.497129 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.497081 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/79217fe6-125e-4107-b954-edf725cbf5a4-sys\") pod \"node-exporter-r2tp8\" (UID: \"79217fe6-125e-4107-b954-edf725cbf5a4\") " pod="openshift-monitoring/node-exporter-r2tp8" Apr 24 21:28:25.497129 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.497116 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/79217fe6-125e-4107-b954-edf725cbf5a4-node-exporter-wtmp\") pod \"node-exporter-r2tp8\" (UID: \"79217fe6-125e-4107-b954-edf725cbf5a4\") " pod="openshift-monitoring/node-exporter-r2tp8" Apr 24 21:28:25.497193 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.497148 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b960a6d8-6c0d-4f69-b153-4c3176aa6145-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-2gdkt\" (UID: \"b960a6d8-6c0d-4f69-b153-4c3176aa6145\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2gdkt" Apr 24 21:28:25.497232 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.497189 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b960a6d8-6c0d-4f69-b153-4c3176aa6145-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-2gdkt\" (UID: \"b960a6d8-6c0d-4f69-b153-4c3176aa6145\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2gdkt" Apr 24 21:28:25.497317 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:28:25.497300 2566 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 24 21:28:25.497394 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:28:25.497383 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b960a6d8-6c0d-4f69-b153-4c3176aa6145-openshift-state-metrics-tls podName:b960a6d8-6c0d-4f69-b153-4c3176aa6145 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:25.997364998 +0000 UTC m=+61.182653322 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/b960a6d8-6c0d-4f69-b153-4c3176aa6145-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-2gdkt" (UID: "b960a6d8-6c0d-4f69-b153-4c3176aa6145") : secret "openshift-state-metrics-tls" not found Apr 24 21:28:25.497737 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.497716 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b960a6d8-6c0d-4f69-b153-4c3176aa6145-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-2gdkt\" (UID: \"b960a6d8-6c0d-4f69-b153-4c3176aa6145\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2gdkt" Apr 24 21:28:25.499284 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.499255 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b960a6d8-6c0d-4f69-b153-4c3176aa6145-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-2gdkt\" (UID: \"b960a6d8-6c0d-4f69-b153-4c3176aa6145\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2gdkt" Apr 24 21:28:25.507228 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.507207 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbqxf\" (UniqueName: \"kubernetes.io/projected/b960a6d8-6c0d-4f69-b153-4c3176aa6145-kube-api-access-lbqxf\") pod \"openshift-state-metrics-9d44df66c-2gdkt\" (UID: \"b960a6d8-6c0d-4f69-b153-4c3176aa6145\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2gdkt" Apr 24 21:28:25.598458 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.598374 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/79217fe6-125e-4107-b954-edf725cbf5a4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-r2tp8\" (UID: \"79217fe6-125e-4107-b954-edf725cbf5a4\") " pod="openshift-monitoring/node-exporter-r2tp8" Apr 24 21:28:25.598458 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.598408 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/79217fe6-125e-4107-b954-edf725cbf5a4-root\") pod \"node-exporter-r2tp8\" (UID: \"79217fe6-125e-4107-b954-edf725cbf5a4\") " pod="openshift-monitoring/node-exporter-r2tp8" Apr 24 21:28:25.598458 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.598428 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/79217fe6-125e-4107-b954-edf725cbf5a4-metrics-client-ca\") pod \"node-exporter-r2tp8\" (UID: \"79217fe6-125e-4107-b954-edf725cbf5a4\") " pod="openshift-monitoring/node-exporter-r2tp8" Apr 24 21:28:25.598717 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.598476 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/79217fe6-125e-4107-b954-edf725cbf5a4-root\") pod \"node-exporter-r2tp8\" (UID: \"79217fe6-125e-4107-b954-edf725cbf5a4\") " pod="openshift-monitoring/node-exporter-r2tp8" Apr 24 21:28:25.598717 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.598553 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/79217fe6-125e-4107-b954-edf725cbf5a4-node-exporter-accelerators-collector-config\") pod \"node-exporter-r2tp8\" (UID: \"79217fe6-125e-4107-b954-edf725cbf5a4\") " pod="openshift-monitoring/node-exporter-r2tp8" Apr 24 21:28:25.598717 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.598599 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-z68qt\" (UniqueName: \"kubernetes.io/projected/79217fe6-125e-4107-b954-edf725cbf5a4-kube-api-access-z68qt\") pod \"node-exporter-r2tp8\" (UID: \"79217fe6-125e-4107-b954-edf725cbf5a4\") " pod="openshift-monitoring/node-exporter-r2tp8" Apr 24 21:28:25.598717 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.598643 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/79217fe6-125e-4107-b954-edf725cbf5a4-node-exporter-textfile\") pod \"node-exporter-r2tp8\" (UID: \"79217fe6-125e-4107-b954-edf725cbf5a4\") " pod="openshift-monitoring/node-exporter-r2tp8" Apr 24 21:28:25.598717 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.598689 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/79217fe6-125e-4107-b954-edf725cbf5a4-sys\") pod \"node-exporter-r2tp8\" (UID: \"79217fe6-125e-4107-b954-edf725cbf5a4\") " pod="openshift-monitoring/node-exporter-r2tp8" Apr 24 21:28:25.598930 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.598736 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/79217fe6-125e-4107-b954-edf725cbf5a4-node-exporter-wtmp\") pod \"node-exporter-r2tp8\" (UID: \"79217fe6-125e-4107-b954-edf725cbf5a4\") " pod="openshift-monitoring/node-exporter-r2tp8" Apr 24 21:28:25.598930 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.598821 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/79217fe6-125e-4107-b954-edf725cbf5a4-sys\") pod \"node-exporter-r2tp8\" (UID: \"79217fe6-125e-4107-b954-edf725cbf5a4\") " pod="openshift-monitoring/node-exporter-r2tp8" Apr 24 21:28:25.598930 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.598894 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/79217fe6-125e-4107-b954-edf725cbf5a4-node-exporter-wtmp\") pod \"node-exporter-r2tp8\" (UID: \"79217fe6-125e-4107-b954-edf725cbf5a4\") " pod="openshift-monitoring/node-exporter-r2tp8" Apr 24 21:28:25.599061 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.598927 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/79217fe6-125e-4107-b954-edf725cbf5a4-node-exporter-tls\") pod \"node-exporter-r2tp8\" (UID: \"79217fe6-125e-4107-b954-edf725cbf5a4\") " pod="openshift-monitoring/node-exporter-r2tp8" Apr 24 21:28:25.599061 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.599004 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/79217fe6-125e-4107-b954-edf725cbf5a4-node-exporter-textfile\") pod \"node-exporter-r2tp8\" (UID: \"79217fe6-125e-4107-b954-edf725cbf5a4\") " pod="openshift-monitoring/node-exporter-r2tp8" Apr 24 21:28:25.599192 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.599172 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/79217fe6-125e-4107-b954-edf725cbf5a4-node-exporter-accelerators-collector-config\") pod \"node-exporter-r2tp8\" (UID: \"79217fe6-125e-4107-b954-edf725cbf5a4\") " pod="openshift-monitoring/node-exporter-r2tp8" Apr 24 21:28:25.599227 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.599172 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/79217fe6-125e-4107-b954-edf725cbf5a4-metrics-client-ca\") pod \"node-exporter-r2tp8\" (UID: \"79217fe6-125e-4107-b954-edf725cbf5a4\") " pod="openshift-monitoring/node-exporter-r2tp8" Apr 24 21:28:25.600752 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.600731 2566 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/79217fe6-125e-4107-b954-edf725cbf5a4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-r2tp8\" (UID: \"79217fe6-125e-4107-b954-edf725cbf5a4\") " pod="openshift-monitoring/node-exporter-r2tp8" Apr 24 21:28:25.601062 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.601046 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/79217fe6-125e-4107-b954-edf725cbf5a4-node-exporter-tls\") pod \"node-exporter-r2tp8\" (UID: \"79217fe6-125e-4107-b954-edf725cbf5a4\") " pod="openshift-monitoring/node-exporter-r2tp8" Apr 24 21:28:25.610598 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.610575 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z68qt\" (UniqueName: \"kubernetes.io/projected/79217fe6-125e-4107-b954-edf725cbf5a4-kube-api-access-z68qt\") pod \"node-exporter-r2tp8\" (UID: \"79217fe6-125e-4107-b954-edf725cbf5a4\") " pod="openshift-monitoring/node-exporter-r2tp8" Apr 24 21:28:25.720452 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:25.720411 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-r2tp8" Apr 24 21:28:25.728182 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:28:25.728143 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79217fe6_125e_4107_b954_edf725cbf5a4.slice/crio-2010a411ab6d16211f636428516eac2adfa1973b923d257fd07aba03f836fbd1 WatchSource:0}: Error finding container 2010a411ab6d16211f636428516eac2adfa1973b923d257fd07aba03f836fbd1: Status 404 returned error can't find the container with id 2010a411ab6d16211f636428516eac2adfa1973b923d257fd07aba03f836fbd1 Apr 24 21:28:26.001912 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:26.001868 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b960a6d8-6c0d-4f69-b153-4c3176aa6145-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-2gdkt\" (UID: \"b960a6d8-6c0d-4f69-b153-4c3176aa6145\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2gdkt" Apr 24 21:28:26.004231 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:26.004206 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b960a6d8-6c0d-4f69-b153-4c3176aa6145-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-2gdkt\" (UID: \"b960a6d8-6c0d-4f69-b153-4c3176aa6145\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2gdkt" Apr 24 21:28:26.262799 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:26.262718 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2gdkt" Apr 24 21:28:26.455736 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:26.455713 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-2gdkt"] Apr 24 21:28:26.458797 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:28:26.458777 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb960a6d8_6c0d_4f69_b153_4c3176aa6145.slice/crio-ceb34a878ff9717575a0722cc724a0ea089f6fb7df5a5f2ff1bac57a0134cd25 WatchSource:0}: Error finding container ceb34a878ff9717575a0722cc724a0ea089f6fb7df5a5f2ff1bac57a0134cd25: Status 404 returned error can't find the container with id ceb34a878ff9717575a0722cc724a0ea089f6fb7df5a5f2ff1bac57a0134cd25 Apr 24 21:28:26.654790 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:26.654757 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2gdkt" event={"ID":"b960a6d8-6c0d-4f69-b153-4c3176aa6145","Type":"ContainerStarted","Data":"8457e55c5b31efb600731175b6b61990ba5fad40d84168491c332d0356eba19b"} Apr 24 21:28:26.655002 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:26.654799 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2gdkt" event={"ID":"b960a6d8-6c0d-4f69-b153-4c3176aa6145","Type":"ContainerStarted","Data":"e20689f1ceef5cfb41ec60ee27a63ac9e3af5830211e62ff71436ae2ed6f5faf"} Apr 24 21:28:26.655002 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:26.654863 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2gdkt" event={"ID":"b960a6d8-6c0d-4f69-b153-4c3176aa6145","Type":"ContainerStarted","Data":"ceb34a878ff9717575a0722cc724a0ea089f6fb7df5a5f2ff1bac57a0134cd25"} Apr 24 21:28:26.656142 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:26.656113 
2566 generic.go:358] "Generic (PLEG): container finished" podID="79217fe6-125e-4107-b954-edf725cbf5a4" containerID="0a2602da3ae36b65cbcbeb987233908c0ea631203cd0be6adefbeac7402da6b8" exitCode=0 Apr 24 21:28:26.656249 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:26.656142 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-r2tp8" event={"ID":"79217fe6-125e-4107-b954-edf725cbf5a4","Type":"ContainerDied","Data":"0a2602da3ae36b65cbcbeb987233908c0ea631203cd0be6adefbeac7402da6b8"} Apr 24 21:28:26.656249 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:26.656177 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-r2tp8" event={"ID":"79217fe6-125e-4107-b954-edf725cbf5a4","Type":"ContainerStarted","Data":"2010a411ab6d16211f636428516eac2adfa1973b923d257fd07aba03f836fbd1"} Apr 24 21:28:27.660517 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:27.660471 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2gdkt" event={"ID":"b960a6d8-6c0d-4f69-b153-4c3176aa6145","Type":"ContainerStarted","Data":"acba328fe230629490f446b5d5a9a18d045ab4e9b6a9b0fad2719662a4a09666"} Apr 24 21:28:27.662257 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:27.662234 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-r2tp8" event={"ID":"79217fe6-125e-4107-b954-edf725cbf5a4","Type":"ContainerStarted","Data":"707d8d42ba44d70b923e8160c7daf1d78aeebf1d1112b8798a29d1b2c40df772"} Apr 24 21:28:27.662353 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:27.662262 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-r2tp8" event={"ID":"79217fe6-125e-4107-b954-edf725cbf5a4","Type":"ContainerStarted","Data":"d1e679008ce12abdfc63d1a745ce164b62fedcf9ae97e27f25faf163767bed76"} Apr 24 21:28:27.677058 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:27.677000 2566 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2gdkt" podStartSLOduration=1.753245997 podStartE2EDuration="2.676984429s" podCreationTimestamp="2026-04-24 21:28:25 +0000 UTC" firstStartedPulling="2026-04-24 21:28:26.630932992 +0000 UTC m=+61.816221310" lastFinishedPulling="2026-04-24 21:28:27.554671425 +0000 UTC m=+62.739959742" observedRunningTime="2026-04-24 21:28:27.676436235 +0000 UTC m=+62.861724609" watchObservedRunningTime="2026-04-24 21:28:27.676984429 +0000 UTC m=+62.862272769" Apr 24 21:28:27.694557 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:27.694514 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-r2tp8" podStartSLOduration=2.039409266 podStartE2EDuration="2.694497688s" podCreationTimestamp="2026-04-24 21:28:25 +0000 UTC" firstStartedPulling="2026-04-24 21:28:25.729674987 +0000 UTC m=+60.914963307" lastFinishedPulling="2026-04-24 21:28:26.384763398 +0000 UTC m=+61.570051729" observedRunningTime="2026-04-24 21:28:27.693980001 +0000 UTC m=+62.879268332" watchObservedRunningTime="2026-04-24 21:28:27.694497688 +0000 UTC m=+62.879786027" Apr 24 21:28:28.304495 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:28.304459 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5c88dc86bd-p22fx"] Apr 24 21:28:28.309203 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:28.309182 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-5c88dc86bd-p22fx" Apr 24 21:28:28.312422 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:28.312402 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 24 21:28:28.312560 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:28.312541 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-2jbpa13gckimr\"" Apr 24 21:28:28.312651 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:28.312630 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 24 21:28:28.313063 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:28.313046 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 24 21:28:28.313216 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:28.313201 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-ztz2j\"" Apr 24 21:28:28.313262 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:28.313235 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 24 21:28:28.313311 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:28.313268 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 24 21:28:28.320020 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:28.320001 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5c88dc86bd-p22fx"] Apr 24 21:28:28.421309 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:28.421270 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/6eb8166d-889c-49d2-b8fa-9193325b23a7-secret-thanos-querier-tls\") pod \"thanos-querier-5c88dc86bd-p22fx\" (UID: \"6eb8166d-889c-49d2-b8fa-9193325b23a7\") " pod="openshift-monitoring/thanos-querier-5c88dc86bd-p22fx" Apr 24 21:28:28.421309 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:28.421314 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6eb8166d-889c-49d2-b8fa-9193325b23a7-secret-grpc-tls\") pod \"thanos-querier-5c88dc86bd-p22fx\" (UID: \"6eb8166d-889c-49d2-b8fa-9193325b23a7\") " pod="openshift-monitoring/thanos-querier-5c88dc86bd-p22fx" Apr 24 21:28:28.421527 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:28.421335 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6eb8166d-889c-49d2-b8fa-9193325b23a7-metrics-client-ca\") pod \"thanos-querier-5c88dc86bd-p22fx\" (UID: \"6eb8166d-889c-49d2-b8fa-9193325b23a7\") " pod="openshift-monitoring/thanos-querier-5c88dc86bd-p22fx" Apr 24 21:28:28.421527 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:28.421356 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/6eb8166d-889c-49d2-b8fa-9193325b23a7-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5c88dc86bd-p22fx\" (UID: \"6eb8166d-889c-49d2-b8fa-9193325b23a7\") " pod="openshift-monitoring/thanos-querier-5c88dc86bd-p22fx" Apr 24 21:28:28.421527 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:28.421435 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/6eb8166d-889c-49d2-b8fa-9193325b23a7-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5c88dc86bd-p22fx\" (UID: \"6eb8166d-889c-49d2-b8fa-9193325b23a7\") " pod="openshift-monitoring/thanos-querier-5c88dc86bd-p22fx" Apr 24 21:28:28.421527 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:28.421473 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6eb8166d-889c-49d2-b8fa-9193325b23a7-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5c88dc86bd-p22fx\" (UID: \"6eb8166d-889c-49d2-b8fa-9193325b23a7\") " pod="openshift-monitoring/thanos-querier-5c88dc86bd-p22fx" Apr 24 21:28:28.421527 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:28.421493 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkmzl\" (UniqueName: \"kubernetes.io/projected/6eb8166d-889c-49d2-b8fa-9193325b23a7-kube-api-access-pkmzl\") pod \"thanos-querier-5c88dc86bd-p22fx\" (UID: \"6eb8166d-889c-49d2-b8fa-9193325b23a7\") " pod="openshift-monitoring/thanos-querier-5c88dc86bd-p22fx" Apr 24 21:28:28.421527 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:28.421511 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/6eb8166d-889c-49d2-b8fa-9193325b23a7-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5c88dc86bd-p22fx\" (UID: \"6eb8166d-889c-49d2-b8fa-9193325b23a7\") " pod="openshift-monitoring/thanos-querier-5c88dc86bd-p22fx" Apr 24 21:28:28.522601 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:28.522534 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6eb8166d-889c-49d2-b8fa-9193325b23a7-metrics-client-ca\") pod 
\"thanos-querier-5c88dc86bd-p22fx\" (UID: \"6eb8166d-889c-49d2-b8fa-9193325b23a7\") " pod="openshift-monitoring/thanos-querier-5c88dc86bd-p22fx" Apr 24 21:28:28.522601 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:28.522611 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/6eb8166d-889c-49d2-b8fa-9193325b23a7-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5c88dc86bd-p22fx\" (UID: \"6eb8166d-889c-49d2-b8fa-9193325b23a7\") " pod="openshift-monitoring/thanos-querier-5c88dc86bd-p22fx" Apr 24 21:28:28.522852 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:28.522658 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6eb8166d-889c-49d2-b8fa-9193325b23a7-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5c88dc86bd-p22fx\" (UID: \"6eb8166d-889c-49d2-b8fa-9193325b23a7\") " pod="openshift-monitoring/thanos-querier-5c88dc86bd-p22fx" Apr 24 21:28:28.522852 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:28.522704 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6eb8166d-889c-49d2-b8fa-9193325b23a7-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5c88dc86bd-p22fx\" (UID: \"6eb8166d-889c-49d2-b8fa-9193325b23a7\") " pod="openshift-monitoring/thanos-querier-5c88dc86bd-p22fx" Apr 24 21:28:28.522852 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:28.522736 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pkmzl\" (UniqueName: \"kubernetes.io/projected/6eb8166d-889c-49d2-b8fa-9193325b23a7-kube-api-access-pkmzl\") pod \"thanos-querier-5c88dc86bd-p22fx\" (UID: \"6eb8166d-889c-49d2-b8fa-9193325b23a7\") " 
pod="openshift-monitoring/thanos-querier-5c88dc86bd-p22fx" Apr 24 21:28:28.522852 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:28.522760 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/6eb8166d-889c-49d2-b8fa-9193325b23a7-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5c88dc86bd-p22fx\" (UID: \"6eb8166d-889c-49d2-b8fa-9193325b23a7\") " pod="openshift-monitoring/thanos-querier-5c88dc86bd-p22fx" Apr 24 21:28:28.522852 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:28.522818 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/6eb8166d-889c-49d2-b8fa-9193325b23a7-secret-thanos-querier-tls\") pod \"thanos-querier-5c88dc86bd-p22fx\" (UID: \"6eb8166d-889c-49d2-b8fa-9193325b23a7\") " pod="openshift-monitoring/thanos-querier-5c88dc86bd-p22fx" Apr 24 21:28:28.523089 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:28.522874 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6eb8166d-889c-49d2-b8fa-9193325b23a7-secret-grpc-tls\") pod \"thanos-querier-5c88dc86bd-p22fx\" (UID: \"6eb8166d-889c-49d2-b8fa-9193325b23a7\") " pod="openshift-monitoring/thanos-querier-5c88dc86bd-p22fx" Apr 24 21:28:28.523857 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:28.523826 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6eb8166d-889c-49d2-b8fa-9193325b23a7-metrics-client-ca\") pod \"thanos-querier-5c88dc86bd-p22fx\" (UID: \"6eb8166d-889c-49d2-b8fa-9193325b23a7\") " pod="openshift-monitoring/thanos-querier-5c88dc86bd-p22fx" Apr 24 21:28:28.525426 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:28.525400 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/6eb8166d-889c-49d2-b8fa-9193325b23a7-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5c88dc86bd-p22fx\" (UID: \"6eb8166d-889c-49d2-b8fa-9193325b23a7\") " pod="openshift-monitoring/thanos-querier-5c88dc86bd-p22fx" Apr 24 21:28:28.525630 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:28.525610 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6eb8166d-889c-49d2-b8fa-9193325b23a7-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5c88dc86bd-p22fx\" (UID: \"6eb8166d-889c-49d2-b8fa-9193325b23a7\") " pod="openshift-monitoring/thanos-querier-5c88dc86bd-p22fx" Apr 24 21:28:28.526003 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:28.525983 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6eb8166d-889c-49d2-b8fa-9193325b23a7-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5c88dc86bd-p22fx\" (UID: \"6eb8166d-889c-49d2-b8fa-9193325b23a7\") " pod="openshift-monitoring/thanos-querier-5c88dc86bd-p22fx" Apr 24 21:28:28.526128 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:28.526112 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/6eb8166d-889c-49d2-b8fa-9193325b23a7-secret-thanos-querier-tls\") pod \"thanos-querier-5c88dc86bd-p22fx\" (UID: \"6eb8166d-889c-49d2-b8fa-9193325b23a7\") " pod="openshift-monitoring/thanos-querier-5c88dc86bd-p22fx" Apr 24 21:28:28.526261 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:28.526244 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6eb8166d-889c-49d2-b8fa-9193325b23a7-secret-grpc-tls\") pod \"thanos-querier-5c88dc86bd-p22fx\" (UID: 
\"6eb8166d-889c-49d2-b8fa-9193325b23a7\") " pod="openshift-monitoring/thanos-querier-5c88dc86bd-p22fx" Apr 24 21:28:28.526449 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:28.526430 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/6eb8166d-889c-49d2-b8fa-9193325b23a7-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5c88dc86bd-p22fx\" (UID: \"6eb8166d-889c-49d2-b8fa-9193325b23a7\") " pod="openshift-monitoring/thanos-querier-5c88dc86bd-p22fx" Apr 24 21:28:28.531836 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:28.531814 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkmzl\" (UniqueName: \"kubernetes.io/projected/6eb8166d-889c-49d2-b8fa-9193325b23a7-kube-api-access-pkmzl\") pod \"thanos-querier-5c88dc86bd-p22fx\" (UID: \"6eb8166d-889c-49d2-b8fa-9193325b23a7\") " pod="openshift-monitoring/thanos-querier-5c88dc86bd-p22fx" Apr 24 21:28:28.617882 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:28.617796 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-5c88dc86bd-p22fx" Apr 24 21:28:28.741973 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:28.739608 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5c88dc86bd-p22fx"] Apr 24 21:28:28.745331 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:28:28.745300 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6eb8166d_889c_49d2_b8fa_9193325b23a7.slice/crio-0d5af9827d4c261f4a45122a009dee2f92a9cfbbaf9cdf292e99febfc6bbf3c0 WatchSource:0}: Error finding container 0d5af9827d4c261f4a45122a009dee2f92a9cfbbaf9cdf292e99febfc6bbf3c0: Status 404 returned error can't find the container with id 0d5af9827d4c261f4a45122a009dee2f92a9cfbbaf9cdf292e99febfc6bbf3c0 Apr 24 21:28:29.671293 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:29.671239 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c88dc86bd-p22fx" event={"ID":"6eb8166d-889c-49d2-b8fa-9193325b23a7","Type":"ContainerStarted","Data":"0d5af9827d4c261f4a45122a009dee2f92a9cfbbaf9cdf292e99febfc6bbf3c0"} Apr 24 21:28:29.950590 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:29.950517 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-5d4ff5bf67-dwkdb"] Apr 24 21:28:29.954235 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:29.954207 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-5d4ff5bf67-dwkdb"
Apr 24 21:28:29.956328 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:29.956205 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-sl8t7\""
Apr 24 21:28:29.956630 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:29.956605 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 24 21:28:29.956728 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:29.956693 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-9tdlaf8rmkb67\""
Apr 24 21:28:29.956785 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:29.956737 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 24 21:28:29.956884 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:29.956867 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 24 21:28:29.956971 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:29.956885 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 24 21:28:29.964546 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:29.964521 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5d4ff5bf67-dwkdb"]
Apr 24 21:28:30.035581 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.035550 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/fd000c3f-ff04-44bb-ab20-f064e13433e8-secret-metrics-server-tls\") pod \"metrics-server-5d4ff5bf67-dwkdb\" (UID: \"fd000c3f-ff04-44bb-ab20-f064e13433e8\") " pod="openshift-monitoring/metrics-server-5d4ff5bf67-dwkdb"
Apr 24 21:28:30.035756 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.035601 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd000c3f-ff04-44bb-ab20-f064e13433e8-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5d4ff5bf67-dwkdb\" (UID: \"fd000c3f-ff04-44bb-ab20-f064e13433e8\") " pod="openshift-monitoring/metrics-server-5d4ff5bf67-dwkdb"
Apr 24 21:28:30.035756 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.035631 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/fd000c3f-ff04-44bb-ab20-f064e13433e8-metrics-server-audit-profiles\") pod \"metrics-server-5d4ff5bf67-dwkdb\" (UID: \"fd000c3f-ff04-44bb-ab20-f064e13433e8\") " pod="openshift-monitoring/metrics-server-5d4ff5bf67-dwkdb"
Apr 24 21:28:30.035756 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.035692 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/fd000c3f-ff04-44bb-ab20-f064e13433e8-secret-metrics-server-client-certs\") pod \"metrics-server-5d4ff5bf67-dwkdb\" (UID: \"fd000c3f-ff04-44bb-ab20-f064e13433e8\") " pod="openshift-monitoring/metrics-server-5d4ff5bf67-dwkdb"
Apr 24 21:28:30.035756 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.035721 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn7jt\" (UniqueName: \"kubernetes.io/projected/fd000c3f-ff04-44bb-ab20-f064e13433e8-kube-api-access-nn7jt\") pod \"metrics-server-5d4ff5bf67-dwkdb\" (UID: \"fd000c3f-ff04-44bb-ab20-f064e13433e8\") " pod="openshift-monitoring/metrics-server-5d4ff5bf67-dwkdb"
Apr 24 21:28:30.035916 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.035819 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/fd000c3f-ff04-44bb-ab20-f064e13433e8-audit-log\") pod \"metrics-server-5d4ff5bf67-dwkdb\" (UID: \"fd000c3f-ff04-44bb-ab20-f064e13433e8\") " pod="openshift-monitoring/metrics-server-5d4ff5bf67-dwkdb"
Apr 24 21:28:30.035916 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.035855 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd000c3f-ff04-44bb-ab20-f064e13433e8-client-ca-bundle\") pod \"metrics-server-5d4ff5bf67-dwkdb\" (UID: \"fd000c3f-ff04-44bb-ab20-f064e13433e8\") " pod="openshift-monitoring/metrics-server-5d4ff5bf67-dwkdb"
Apr 24 21:28:30.136697 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.136658 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/fd000c3f-ff04-44bb-ab20-f064e13433e8-secret-metrics-server-client-certs\") pod \"metrics-server-5d4ff5bf67-dwkdb\" (UID: \"fd000c3f-ff04-44bb-ab20-f064e13433e8\") " pod="openshift-monitoring/metrics-server-5d4ff5bf67-dwkdb"
Apr 24 21:28:30.136697 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.136700 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nn7jt\" (UniqueName: \"kubernetes.io/projected/fd000c3f-ff04-44bb-ab20-f064e13433e8-kube-api-access-nn7jt\") pod \"metrics-server-5d4ff5bf67-dwkdb\" (UID: \"fd000c3f-ff04-44bb-ab20-f064e13433e8\") " pod="openshift-monitoring/metrics-server-5d4ff5bf67-dwkdb"
Apr 24 21:28:30.136935 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.136750 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/fd000c3f-ff04-44bb-ab20-f064e13433e8-audit-log\") pod \"metrics-server-5d4ff5bf67-dwkdb\" (UID: \"fd000c3f-ff04-44bb-ab20-f064e13433e8\") " pod="openshift-monitoring/metrics-server-5d4ff5bf67-dwkdb"
Apr 24 21:28:30.137027 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.136922 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd000c3f-ff04-44bb-ab20-f064e13433e8-client-ca-bundle\") pod \"metrics-server-5d4ff5bf67-dwkdb\" (UID: \"fd000c3f-ff04-44bb-ab20-f064e13433e8\") " pod="openshift-monitoring/metrics-server-5d4ff5bf67-dwkdb"
Apr 24 21:28:30.137087 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.137029 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/fd000c3f-ff04-44bb-ab20-f064e13433e8-secret-metrics-server-tls\") pod \"metrics-server-5d4ff5bf67-dwkdb\" (UID: \"fd000c3f-ff04-44bb-ab20-f064e13433e8\") " pod="openshift-monitoring/metrics-server-5d4ff5bf67-dwkdb"
Apr 24 21:28:30.137087 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.137078 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd000c3f-ff04-44bb-ab20-f064e13433e8-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5d4ff5bf67-dwkdb\" (UID: \"fd000c3f-ff04-44bb-ab20-f064e13433e8\") " pod="openshift-monitoring/metrics-server-5d4ff5bf67-dwkdb"
Apr 24 21:28:30.137185 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.137112 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/fd000c3f-ff04-44bb-ab20-f064e13433e8-metrics-server-audit-profiles\") pod \"metrics-server-5d4ff5bf67-dwkdb\" (UID: \"fd000c3f-ff04-44bb-ab20-f064e13433e8\") " pod="openshift-monitoring/metrics-server-5d4ff5bf67-dwkdb"
Apr 24 21:28:30.137234 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.137207 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/fd000c3f-ff04-44bb-ab20-f064e13433e8-audit-log\") pod \"metrics-server-5d4ff5bf67-dwkdb\" (UID: \"fd000c3f-ff04-44bb-ab20-f064e13433e8\") " pod="openshift-monitoring/metrics-server-5d4ff5bf67-dwkdb"
Apr 24 21:28:30.137845 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.137815 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd000c3f-ff04-44bb-ab20-f064e13433e8-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5d4ff5bf67-dwkdb\" (UID: \"fd000c3f-ff04-44bb-ab20-f064e13433e8\") " pod="openshift-monitoring/metrics-server-5d4ff5bf67-dwkdb"
Apr 24 21:28:30.138193 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.138119 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/fd000c3f-ff04-44bb-ab20-f064e13433e8-metrics-server-audit-profiles\") pod \"metrics-server-5d4ff5bf67-dwkdb\" (UID: \"fd000c3f-ff04-44bb-ab20-f064e13433e8\") " pod="openshift-monitoring/metrics-server-5d4ff5bf67-dwkdb"
Apr 24 21:28:30.139819 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.139794 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/fd000c3f-ff04-44bb-ab20-f064e13433e8-secret-metrics-server-tls\") pod \"metrics-server-5d4ff5bf67-dwkdb\" (UID: \"fd000c3f-ff04-44bb-ab20-f064e13433e8\") " pod="openshift-monitoring/metrics-server-5d4ff5bf67-dwkdb"
Apr 24 21:28:30.139920 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.139889 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/fd000c3f-ff04-44bb-ab20-f064e13433e8-secret-metrics-server-client-certs\") pod \"metrics-server-5d4ff5bf67-dwkdb\" (UID: \"fd000c3f-ff04-44bb-ab20-f064e13433e8\") " pod="openshift-monitoring/metrics-server-5d4ff5bf67-dwkdb"
Apr 24 21:28:30.140001 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.139930 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd000c3f-ff04-44bb-ab20-f064e13433e8-client-ca-bundle\") pod \"metrics-server-5d4ff5bf67-dwkdb\" (UID: \"fd000c3f-ff04-44bb-ab20-f064e13433e8\") " pod="openshift-monitoring/metrics-server-5d4ff5bf67-dwkdb"
Apr 24 21:28:30.148370 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.148345 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn7jt\" (UniqueName: \"kubernetes.io/projected/fd000c3f-ff04-44bb-ab20-f064e13433e8-kube-api-access-nn7jt\") pod \"metrics-server-5d4ff5bf67-dwkdb\" (UID: \"fd000c3f-ff04-44bb-ab20-f064e13433e8\") " pod="openshift-monitoring/metrics-server-5d4ff5bf67-dwkdb"
Apr 24 21:28:30.267348 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.267263 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-5d4ff5bf67-dwkdb"
Apr 24 21:28:30.551545 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.551300 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5d4ff5bf67-dwkdb"]
Apr 24 21:28:30.568042 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:28:30.568007 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd000c3f_ff04_44bb_ab20_f064e13433e8.slice/crio-6c737dfbc67feb85e76d314aaaf6f8ae9b0f15abfa449b8bf60461ab4fe765af WatchSource:0}: Error finding container 6c737dfbc67feb85e76d314aaaf6f8ae9b0f15abfa449b8bf60461ab4fe765af: Status 404 returned error can't find the container with id 6c737dfbc67feb85e76d314aaaf6f8ae9b0f15abfa449b8bf60461ab4fe765af
Apr 24 21:28:30.642379 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.642351 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-5597f7ddb8-xbn5m"]
Apr 24 21:28:30.647162 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.647143 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5597f7ddb8-xbn5m"
Apr 24 21:28:30.649027 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.649006 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 24 21:28:30.649220 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.649131 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 24 21:28:30.649220 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.649132 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 24 21:28:30.649220 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.649175 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-5czb8\""
Apr 24 21:28:30.649220 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.649162 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 24 21:28:30.649585 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.649538 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 24 21:28:30.654506 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.654486 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 24 21:28:30.658633 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.658615 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5597f7ddb8-xbn5m"]
Apr 24 21:28:30.675803 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.675775 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5d4ff5bf67-dwkdb" event={"ID":"fd000c3f-ff04-44bb-ab20-f064e13433e8","Type":"ContainerStarted","Data":"6c737dfbc67feb85e76d314aaaf6f8ae9b0f15abfa449b8bf60461ab4fe765af"}
Apr 24 21:28:30.677835 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.677809 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c88dc86bd-p22fx" event={"ID":"6eb8166d-889c-49d2-b8fa-9193325b23a7","Type":"ContainerStarted","Data":"2f27ed7b9d4502f28450182dd9e1762618678d099314722e4e5a27553dff7de7"}
Apr 24 21:28:30.677835 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.677837 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c88dc86bd-p22fx" event={"ID":"6eb8166d-889c-49d2-b8fa-9193325b23a7","Type":"ContainerStarted","Data":"018cd542603f151f8b8d20bb7c185b81ca56299e55b658a7d325dff2abb7f375"}
Apr 24 21:28:30.677835 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.677846 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c88dc86bd-p22fx" event={"ID":"6eb8166d-889c-49d2-b8fa-9193325b23a7","Type":"ContainerStarted","Data":"744c20a79aef355538cac2c683091fc5c842ff285f21d5a488554eb64824cfaf"}
Apr 24 21:28:30.745830 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.745779 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dc45bc44-5f31-4395-9780-dbb505c72767-metrics-client-ca\") pod \"telemeter-client-5597f7ddb8-xbn5m\" (UID: \"dc45bc44-5f31-4395-9780-dbb505c72767\") " pod="openshift-monitoring/telemeter-client-5597f7ddb8-xbn5m"
Apr 24 21:28:30.746279 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.746258 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc45bc44-5f31-4395-9780-dbb505c72767-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5597f7ddb8-xbn5m\" (UID: \"dc45bc44-5f31-4395-9780-dbb505c72767\") " pod="openshift-monitoring/telemeter-client-5597f7ddb8-xbn5m"
Apr 24 21:28:30.746365 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.746316 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/dc45bc44-5f31-4395-9780-dbb505c72767-federate-client-tls\") pod \"telemeter-client-5597f7ddb8-xbn5m\" (UID: \"dc45bc44-5f31-4395-9780-dbb505c72767\") " pod="openshift-monitoring/telemeter-client-5597f7ddb8-xbn5m"
Apr 24 21:28:30.746411 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.746373 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/dc45bc44-5f31-4395-9780-dbb505c72767-secret-telemeter-client\") pod \"telemeter-client-5597f7ddb8-xbn5m\" (UID: \"dc45bc44-5f31-4395-9780-dbb505c72767\") " pod="openshift-monitoring/telemeter-client-5597f7ddb8-xbn5m"
Apr 24 21:28:30.746411 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.746405 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/dc45bc44-5f31-4395-9780-dbb505c72767-telemeter-client-tls\") pod \"telemeter-client-5597f7ddb8-xbn5m\" (UID: \"dc45bc44-5f31-4395-9780-dbb505c72767\") " pod="openshift-monitoring/telemeter-client-5597f7ddb8-xbn5m"
Apr 24 21:28:30.746485 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.746429 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dc45bc44-5f31-4395-9780-dbb505c72767-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5597f7ddb8-xbn5m\" (UID: \"dc45bc44-5f31-4395-9780-dbb505c72767\") " pod="openshift-monitoring/telemeter-client-5597f7ddb8-xbn5m"
Apr 24 21:28:30.746485 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.746462 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc45bc44-5f31-4395-9780-dbb505c72767-serving-certs-ca-bundle\") pod \"telemeter-client-5597f7ddb8-xbn5m\" (UID: \"dc45bc44-5f31-4395-9780-dbb505c72767\") " pod="openshift-monitoring/telemeter-client-5597f7ddb8-xbn5m"
Apr 24 21:28:30.746554 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.746512 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j4bt\" (UniqueName: \"kubernetes.io/projected/dc45bc44-5f31-4395-9780-dbb505c72767-kube-api-access-8j4bt\") pod \"telemeter-client-5597f7ddb8-xbn5m\" (UID: \"dc45bc44-5f31-4395-9780-dbb505c72767\") " pod="openshift-monitoring/telemeter-client-5597f7ddb8-xbn5m"
Apr 24 21:28:30.847088 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.846990 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/dc45bc44-5f31-4395-9780-dbb505c72767-secret-telemeter-client\") pod \"telemeter-client-5597f7ddb8-xbn5m\" (UID: \"dc45bc44-5f31-4395-9780-dbb505c72767\") " pod="openshift-monitoring/telemeter-client-5597f7ddb8-xbn5m"
Apr 24 21:28:30.847088 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.847042 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/dc45bc44-5f31-4395-9780-dbb505c72767-telemeter-client-tls\") pod \"telemeter-client-5597f7ddb8-xbn5m\" (UID: \"dc45bc44-5f31-4395-9780-dbb505c72767\") " pod="openshift-monitoring/telemeter-client-5597f7ddb8-xbn5m"
Apr 24 21:28:30.847088 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.847082 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dc45bc44-5f31-4395-9780-dbb505c72767-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5597f7ddb8-xbn5m\" (UID: \"dc45bc44-5f31-4395-9780-dbb505c72767\") " pod="openshift-monitoring/telemeter-client-5597f7ddb8-xbn5m"
Apr 24 21:28:30.847409 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.847126 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc45bc44-5f31-4395-9780-dbb505c72767-serving-certs-ca-bundle\") pod \"telemeter-client-5597f7ddb8-xbn5m\" (UID: \"dc45bc44-5f31-4395-9780-dbb505c72767\") " pod="openshift-monitoring/telemeter-client-5597f7ddb8-xbn5m"
Apr 24 21:28:30.847409 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.847165 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8j4bt\" (UniqueName: \"kubernetes.io/projected/dc45bc44-5f31-4395-9780-dbb505c72767-kube-api-access-8j4bt\") pod \"telemeter-client-5597f7ddb8-xbn5m\" (UID: \"dc45bc44-5f31-4395-9780-dbb505c72767\") " pod="openshift-monitoring/telemeter-client-5597f7ddb8-xbn5m"
Apr 24 21:28:30.847409 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.847237 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dc45bc44-5f31-4395-9780-dbb505c72767-metrics-client-ca\") pod \"telemeter-client-5597f7ddb8-xbn5m\" (UID: \"dc45bc44-5f31-4395-9780-dbb505c72767\") " pod="openshift-monitoring/telemeter-client-5597f7ddb8-xbn5m"
Apr 24 21:28:30.847409 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.847272 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc45bc44-5f31-4395-9780-dbb505c72767-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5597f7ddb8-xbn5m\" (UID: \"dc45bc44-5f31-4395-9780-dbb505c72767\") " pod="openshift-monitoring/telemeter-client-5597f7ddb8-xbn5m"
Apr 24 21:28:30.847409 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.847310 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/dc45bc44-5f31-4395-9780-dbb505c72767-federate-client-tls\") pod \"telemeter-client-5597f7ddb8-xbn5m\" (UID: \"dc45bc44-5f31-4395-9780-dbb505c72767\") " pod="openshift-monitoring/telemeter-client-5597f7ddb8-xbn5m"
Apr 24 21:28:30.847974 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.847934 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc45bc44-5f31-4395-9780-dbb505c72767-serving-certs-ca-bundle\") pod \"telemeter-client-5597f7ddb8-xbn5m\" (UID: \"dc45bc44-5f31-4395-9780-dbb505c72767\") " pod="openshift-monitoring/telemeter-client-5597f7ddb8-xbn5m"
Apr 24 21:28:30.848075 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.847980 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dc45bc44-5f31-4395-9780-dbb505c72767-metrics-client-ca\") pod \"telemeter-client-5597f7ddb8-xbn5m\" (UID: \"dc45bc44-5f31-4395-9780-dbb505c72767\") " pod="openshift-monitoring/telemeter-client-5597f7ddb8-xbn5m"
Apr 24 21:28:30.848514 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.848489 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc45bc44-5f31-4395-9780-dbb505c72767-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5597f7ddb8-xbn5m\" (UID: \"dc45bc44-5f31-4395-9780-dbb505c72767\") " pod="openshift-monitoring/telemeter-client-5597f7ddb8-xbn5m"
Apr 24 21:28:30.849804 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.849778 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dc45bc44-5f31-4395-9780-dbb505c72767-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5597f7ddb8-xbn5m\" (UID: \"dc45bc44-5f31-4395-9780-dbb505c72767\") " pod="openshift-monitoring/telemeter-client-5597f7ddb8-xbn5m"
Apr 24 21:28:30.849892 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.849853 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/dc45bc44-5f31-4395-9780-dbb505c72767-telemeter-client-tls\") pod \"telemeter-client-5597f7ddb8-xbn5m\" (UID: \"dc45bc44-5f31-4395-9780-dbb505c72767\") " pod="openshift-monitoring/telemeter-client-5597f7ddb8-xbn5m"
Apr 24 21:28:30.850000 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.849981 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/dc45bc44-5f31-4395-9780-dbb505c72767-federate-client-tls\") pod \"telemeter-client-5597f7ddb8-xbn5m\" (UID: \"dc45bc44-5f31-4395-9780-dbb505c72767\") " pod="openshift-monitoring/telemeter-client-5597f7ddb8-xbn5m"
Apr 24 21:28:30.850121 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.850102 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/dc45bc44-5f31-4395-9780-dbb505c72767-secret-telemeter-client\") pod \"telemeter-client-5597f7ddb8-xbn5m\" (UID: \"dc45bc44-5f31-4395-9780-dbb505c72767\") " pod="openshift-monitoring/telemeter-client-5597f7ddb8-xbn5m"
Apr 24 21:28:30.855778 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.855760 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j4bt\" (UniqueName: \"kubernetes.io/projected/dc45bc44-5f31-4395-9780-dbb505c72767-kube-api-access-8j4bt\") pod \"telemeter-client-5597f7ddb8-xbn5m\" (UID: \"dc45bc44-5f31-4395-9780-dbb505c72767\") " pod="openshift-monitoring/telemeter-client-5597f7ddb8-xbn5m"
Apr 24 21:28:30.957272 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:30.957234 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5597f7ddb8-xbn5m"
Apr 24 21:28:31.101297 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:31.101178 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5597f7ddb8-xbn5m"]
Apr 24 21:28:31.106469 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:28:31.106436 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc45bc44_5f31_4395_9780_dbb505c72767.slice/crio-09c9e30daddff3d3451aa7cebffb9c5e91821d2216e4b81ac6f80073c3618dad WatchSource:0}: Error finding container 09c9e30daddff3d3451aa7cebffb9c5e91821d2216e4b81ac6f80073c3618dad: Status 404 returned error can't find the container with id 09c9e30daddff3d3451aa7cebffb9c5e91821d2216e4b81ac6f80073c3618dad
Apr 24 21:28:31.151004 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:31.150944 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f38edd3-fb52-42bc-b164-d84e78cffcc0-metrics-certs\") pod \"network-metrics-daemon-5b7z5\" (UID: \"9f38edd3-fb52-42bc-b164-d84e78cffcc0\") " pod="openshift-multus/network-metrics-daemon-5b7z5"
Apr 24 21:28:31.153255 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:31.153019 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 21:28:31.164367 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:31.164335 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f38edd3-fb52-42bc-b164-d84e78cffcc0-metrics-certs\") pod \"network-metrics-daemon-5b7z5\" (UID: \"9f38edd3-fb52-42bc-b164-d84e78cffcc0\") " pod="openshift-multus/network-metrics-daemon-5b7z5"
Apr 24 21:28:31.191466 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:31.191434 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tshq6\""
Apr 24 21:28:31.200034 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:31.200008 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5b7z5"
Apr 24 21:28:31.252435 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:31.252394 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xjkbt\" (UniqueName: \"kubernetes.io/projected/7799825c-48bb-465a-adc5-2d2c43a525df-kube-api-access-xjkbt\") pod \"network-check-target-kwjx6\" (UID: \"7799825c-48bb-465a-adc5-2d2c43a525df\") " pod="openshift-network-diagnostics/network-check-target-kwjx6"
Apr 24 21:28:31.254426 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:31.254398 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 21:28:31.264705 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:31.264680 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 21:28:31.276032 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:31.276006 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjkbt\" (UniqueName: \"kubernetes.io/projected/7799825c-48bb-465a-adc5-2d2c43a525df-kube-api-access-xjkbt\") pod \"network-check-target-kwjx6\" (UID: \"7799825c-48bb-465a-adc5-2d2c43a525df\") " pod="openshift-network-diagnostics/network-check-target-kwjx6"
Apr 24 21:28:31.470455 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:31.470423 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5b7z5"]
Apr 24 21:28:31.487118 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:31.487092 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-xwzc6\""
Apr 24 21:28:31.495583 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:31.495538 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kwjx6"
Apr 24 21:28:31.643664 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:31.643581 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-hwj8d"
Apr 24 21:28:31.685084 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:31.685052 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c88dc86bd-p22fx" event={"ID":"6eb8166d-889c-49d2-b8fa-9193325b23a7","Type":"ContainerStarted","Data":"f3d9c50ab49b1216ded5f01586ec2eaaba75c0afc35b3f6ba8e0992227d40198"}
Apr 24 21:28:31.686257 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:31.686229 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5597f7ddb8-xbn5m" event={"ID":"dc45bc44-5f31-4395-9780-dbb505c72767","Type":"ContainerStarted","Data":"09c9e30daddff3d3451aa7cebffb9c5e91821d2216e4b81ac6f80073c3618dad"}
Apr 24 21:28:31.801428 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:28:31.801397 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f38edd3_fb52_42bc_b164_d84e78cffcc0.slice/crio-17749e5c227bc47668ff49a2572de2399436bf952bc1cdf51cc390d8245206c8 WatchSource:0}: Error finding container 17749e5c227bc47668ff49a2572de2399436bf952bc1cdf51cc390d8245206c8: Status 404 returned error can't find the container with id 17749e5c227bc47668ff49a2572de2399436bf952bc1cdf51cc390d8245206c8
Apr 24 21:28:31.929764 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:31.929738 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-kwjx6"]
Apr 24 21:28:31.932945 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:28:31.932914 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7799825c_48bb_465a_adc5_2d2c43a525df.slice/crio-78dcf466ba3299f247f2898b0133ec3421085b57c5652a902a6aa207f38fd2bb WatchSource:0}: Error finding container 78dcf466ba3299f247f2898b0133ec3421085b57c5652a902a6aa207f38fd2bb: Status 404 returned error can't find the container with id 78dcf466ba3299f247f2898b0133ec3421085b57c5652a902a6aa207f38fd2bb
Apr 24 21:28:32.691576 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:32.691514 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5d4ff5bf67-dwkdb" event={"ID":"fd000c3f-ff04-44bb-ab20-f064e13433e8","Type":"ContainerStarted","Data":"d73f3e0e9746e44b8bda3ebea7a5bfd9c3a25b211f641579cc16fa3bbad7a640"}
Apr 24 21:28:32.693004 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:32.692943 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5b7z5" event={"ID":"9f38edd3-fb52-42bc-b164-d84e78cffcc0","Type":"ContainerStarted","Data":"17749e5c227bc47668ff49a2572de2399436bf952bc1cdf51cc390d8245206c8"}
Apr 24 21:28:32.696483 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:32.696386 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c88dc86bd-p22fx" event={"ID":"6eb8166d-889c-49d2-b8fa-9193325b23a7","Type":"ContainerStarted","Data":"9cabadbc0f5519a5987f398826cc7a2327c21785459b53dfe95357187cc9fc06"}
Apr 24 21:28:32.696483 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:32.696415 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c88dc86bd-p22fx" event={"ID":"6eb8166d-889c-49d2-b8fa-9193325b23a7","Type":"ContainerStarted","Data":"6ac7abe8ed856ec68cc5b7ee5a019c689da3b090a48f2efef946c7eab6c8339f"}
Apr 24 21:28:32.696651 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:32.696603 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-5c88dc86bd-p22fx"
Apr 24 21:28:32.697710 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:32.697689 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-kwjx6" event={"ID":"7799825c-48bb-465a-adc5-2d2c43a525df","Type":"ContainerStarted","Data":"78dcf466ba3299f247f2898b0133ec3421085b57c5652a902a6aa207f38fd2bb"}
Apr 24 21:28:32.712883 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:32.712827 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-5d4ff5bf67-dwkdb" podStartSLOduration=2.432841469 podStartE2EDuration="3.712813229s" podCreationTimestamp="2026-04-24 21:28:29 +0000 UTC" firstStartedPulling="2026-04-24 21:28:30.570254274 +0000 UTC m=+65.755542593" lastFinishedPulling="2026-04-24 21:28:31.850226009 +0000 UTC m=+67.035514353" observedRunningTime="2026-04-24 21:28:32.711983468 +0000 UTC m=+67.897271808" watchObservedRunningTime="2026-04-24 21:28:32.712813229 +0000 UTC m=+67.898101568"
Apr 24 21:28:32.735227 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:32.735129 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5c88dc86bd-p22fx" podStartSLOduration=2.094771283 podStartE2EDuration="4.73511168s" podCreationTimestamp="2026-04-24 21:28:28 +0000 UTC" firstStartedPulling="2026-04-24 21:28:28.747579599 +0000 UTC m=+63.932867916" lastFinishedPulling="2026-04-24 21:28:31.387919996 +0000 UTC m=+66.573208313" observedRunningTime="2026-04-24 21:28:32.733753953 +0000 UTC m=+67.919042304" watchObservedRunningTime="2026-04-24 21:28:32.73511168 +0000 UTC m=+67.920400020"
Apr 24 21:28:33.703165 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:33.703135 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5597f7ddb8-xbn5m" event={"ID":"dc45bc44-5f31-4395-9780-dbb505c72767","Type":"ContainerStarted","Data":"159e6fc26d9d78349704fdb5679c3fc464f0d161b12ae4613b46adfeb17afa25"}
Apr 24 21:28:33.705411 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:33.705266 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5b7z5" event={"ID":"9f38edd3-fb52-42bc-b164-d84e78cffcc0","Type":"ContainerStarted","Data":"d64c4255acbc9a7023c92cb7dbe368fb79d6d5e8f46716896033292d701f7197"}
Apr 24 21:28:34.711732 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:34.711688 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5b7z5" event={"ID":"9f38edd3-fb52-42bc-b164-d84e78cffcc0","Type":"ContainerStarted","Data":"7b657b60047fbd9897eb219601f4b364962e0c0a06b9d451178fa76655205c0a"}
Apr 24 21:28:34.727613 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:34.727407 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-5b7z5" podStartSLOduration=68.095227002 podStartE2EDuration="1m9.727389655s" podCreationTimestamp="2026-04-24 21:27:25 +0000 UTC" firstStartedPulling="2026-04-24 21:28:31.803981562 +0000 UTC m=+66.989269902" lastFinishedPulling="2026-04-24 21:28:33.436144225 +0000 UTC m=+68.621432555" observedRunningTime="2026-04-24 21:28:34.726687384 +0000 UTC m=+69.911975748" watchObservedRunningTime="2026-04-24 21:28:34.727389655 +0000 UTC m=+69.912677995"
Apr 24 21:28:35.425574 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:35.425544 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5d55bd964c-9dkxz"]
Apr 24 21:28:35.450343 ip-10-0-131-55 kubenswrapper[2566]:
I0424 21:28:35.450307 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d55bd964c-9dkxz"] Apr 24 21:28:35.450492 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:35.450446 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d55bd964c-9dkxz" Apr 24 21:28:35.452370 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:35.452345 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 24 21:28:35.452515 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:35.452403 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 24 21:28:35.452515 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:35.452422 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 24 21:28:35.452515 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:35.452436 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-nqhcp\"" Apr 24 21:28:35.452515 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:35.452497 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 24 21:28:35.452515 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:35.452436 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 24 21:28:35.452821 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:35.452671 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 24 21:28:35.453099 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:35.453081 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 24 
21:28:35.457130 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:35.457110 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 24 21:28:35.489872 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:35.489843 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eef9f8be-8fc5-4474-8fb2-b13404c290cc-oauth-serving-cert\") pod \"console-5d55bd964c-9dkxz\" (UID: \"eef9f8be-8fc5-4474-8fb2-b13404c290cc\") " pod="openshift-console/console-5d55bd964c-9dkxz" Apr 24 21:28:35.490017 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:35.489876 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eef9f8be-8fc5-4474-8fb2-b13404c290cc-trusted-ca-bundle\") pod \"console-5d55bd964c-9dkxz\" (UID: \"eef9f8be-8fc5-4474-8fb2-b13404c290cc\") " pod="openshift-console/console-5d55bd964c-9dkxz" Apr 24 21:28:35.490017 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:35.489896 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eef9f8be-8fc5-4474-8fb2-b13404c290cc-service-ca\") pod \"console-5d55bd964c-9dkxz\" (UID: \"eef9f8be-8fc5-4474-8fb2-b13404c290cc\") " pod="openshift-console/console-5d55bd964c-9dkxz" Apr 24 21:28:35.490017 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:35.489986 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eef9f8be-8fc5-4474-8fb2-b13404c290cc-console-serving-cert\") pod \"console-5d55bd964c-9dkxz\" (UID: \"eef9f8be-8fc5-4474-8fb2-b13404c290cc\") " pod="openshift-console/console-5d55bd964c-9dkxz" Apr 24 21:28:35.490130 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:35.490027 
2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eef9f8be-8fc5-4474-8fb2-b13404c290cc-console-config\") pod \"console-5d55bd964c-9dkxz\" (UID: \"eef9f8be-8fc5-4474-8fb2-b13404c290cc\") " pod="openshift-console/console-5d55bd964c-9dkxz" Apr 24 21:28:35.490130 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:35.490046 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w4gl\" (UniqueName: \"kubernetes.io/projected/eef9f8be-8fc5-4474-8fb2-b13404c290cc-kube-api-access-2w4gl\") pod \"console-5d55bd964c-9dkxz\" (UID: \"eef9f8be-8fc5-4474-8fb2-b13404c290cc\") " pod="openshift-console/console-5d55bd964c-9dkxz" Apr 24 21:28:35.490130 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:35.490101 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eef9f8be-8fc5-4474-8fb2-b13404c290cc-console-oauth-config\") pod \"console-5d55bd964c-9dkxz\" (UID: \"eef9f8be-8fc5-4474-8fb2-b13404c290cc\") " pod="openshift-console/console-5d55bd964c-9dkxz" Apr 24 21:28:35.590598 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:35.590559 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eef9f8be-8fc5-4474-8fb2-b13404c290cc-console-config\") pod \"console-5d55bd964c-9dkxz\" (UID: \"eef9f8be-8fc5-4474-8fb2-b13404c290cc\") " pod="openshift-console/console-5d55bd964c-9dkxz" Apr 24 21:28:35.590598 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:35.590602 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2w4gl\" (UniqueName: \"kubernetes.io/projected/eef9f8be-8fc5-4474-8fb2-b13404c290cc-kube-api-access-2w4gl\") pod \"console-5d55bd964c-9dkxz\" (UID: \"eef9f8be-8fc5-4474-8fb2-b13404c290cc\") " 
pod="openshift-console/console-5d55bd964c-9dkxz" Apr 24 21:28:35.590754 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:35.590625 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eef9f8be-8fc5-4474-8fb2-b13404c290cc-console-oauth-config\") pod \"console-5d55bd964c-9dkxz\" (UID: \"eef9f8be-8fc5-4474-8fb2-b13404c290cc\") " pod="openshift-console/console-5d55bd964c-9dkxz" Apr 24 21:28:35.590754 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:35.590668 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eef9f8be-8fc5-4474-8fb2-b13404c290cc-oauth-serving-cert\") pod \"console-5d55bd964c-9dkxz\" (UID: \"eef9f8be-8fc5-4474-8fb2-b13404c290cc\") " pod="openshift-console/console-5d55bd964c-9dkxz" Apr 24 21:28:35.590754 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:35.590689 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eef9f8be-8fc5-4474-8fb2-b13404c290cc-trusted-ca-bundle\") pod \"console-5d55bd964c-9dkxz\" (UID: \"eef9f8be-8fc5-4474-8fb2-b13404c290cc\") " pod="openshift-console/console-5d55bd964c-9dkxz" Apr 24 21:28:35.590754 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:35.590710 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eef9f8be-8fc5-4474-8fb2-b13404c290cc-service-ca\") pod \"console-5d55bd964c-9dkxz\" (UID: \"eef9f8be-8fc5-4474-8fb2-b13404c290cc\") " pod="openshift-console/console-5d55bd964c-9dkxz" Apr 24 21:28:35.590944 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:35.590759 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eef9f8be-8fc5-4474-8fb2-b13404c290cc-console-serving-cert\") pod 
\"console-5d55bd964c-9dkxz\" (UID: \"eef9f8be-8fc5-4474-8fb2-b13404c290cc\") " pod="openshift-console/console-5d55bd964c-9dkxz" Apr 24 21:28:35.591510 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:35.591488 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eef9f8be-8fc5-4474-8fb2-b13404c290cc-oauth-serving-cert\") pod \"console-5d55bd964c-9dkxz\" (UID: \"eef9f8be-8fc5-4474-8fb2-b13404c290cc\") " pod="openshift-console/console-5d55bd964c-9dkxz" Apr 24 21:28:35.591617 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:35.591503 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eef9f8be-8fc5-4474-8fb2-b13404c290cc-service-ca\") pod \"console-5d55bd964c-9dkxz\" (UID: \"eef9f8be-8fc5-4474-8fb2-b13404c290cc\") " pod="openshift-console/console-5d55bd964c-9dkxz" Apr 24 21:28:35.591617 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:35.591605 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eef9f8be-8fc5-4474-8fb2-b13404c290cc-console-config\") pod \"console-5d55bd964c-9dkxz\" (UID: \"eef9f8be-8fc5-4474-8fb2-b13404c290cc\") " pod="openshift-console/console-5d55bd964c-9dkxz" Apr 24 21:28:35.591727 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:35.591613 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eef9f8be-8fc5-4474-8fb2-b13404c290cc-trusted-ca-bundle\") pod \"console-5d55bd964c-9dkxz\" (UID: \"eef9f8be-8fc5-4474-8fb2-b13404c290cc\") " pod="openshift-console/console-5d55bd964c-9dkxz" Apr 24 21:28:35.593209 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:35.593183 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/eef9f8be-8fc5-4474-8fb2-b13404c290cc-console-oauth-config\") pod \"console-5d55bd964c-9dkxz\" (UID: \"eef9f8be-8fc5-4474-8fb2-b13404c290cc\") " pod="openshift-console/console-5d55bd964c-9dkxz" Apr 24 21:28:35.593286 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:35.593250 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eef9f8be-8fc5-4474-8fb2-b13404c290cc-console-serving-cert\") pod \"console-5d55bd964c-9dkxz\" (UID: \"eef9f8be-8fc5-4474-8fb2-b13404c290cc\") " pod="openshift-console/console-5d55bd964c-9dkxz" Apr 24 21:28:35.600248 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:35.600225 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w4gl\" (UniqueName: \"kubernetes.io/projected/eef9f8be-8fc5-4474-8fb2-b13404c290cc-kube-api-access-2w4gl\") pod \"console-5d55bd964c-9dkxz\" (UID: \"eef9f8be-8fc5-4474-8fb2-b13404c290cc\") " pod="openshift-console/console-5d55bd964c-9dkxz" Apr 24 21:28:35.759683 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:35.759657 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5d55bd964c-9dkxz" Apr 24 21:28:35.881607 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:35.881564 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d55bd964c-9dkxz"] Apr 24 21:28:35.884128 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:28:35.884008 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeef9f8be_8fc5_4474_8fb2_b13404c290cc.slice/crio-393aba2a19fc1d55b0a4926084948d2f999dada56396b5eaaad126bf145da2c5 WatchSource:0}: Error finding container 393aba2a19fc1d55b0a4926084948d2f999dada56396b5eaaad126bf145da2c5: Status 404 returned error can't find the container with id 393aba2a19fc1d55b0a4926084948d2f999dada56396b5eaaad126bf145da2c5 Apr 24 21:28:36.719115 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:36.719071 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-kwjx6" event={"ID":"7799825c-48bb-465a-adc5-2d2c43a525df","Type":"ContainerStarted","Data":"4bd36a81ae889bcf4825b2ac76864776886b87cad1c53871f1e1ec3abaf47b09"} Apr 24 21:28:36.719316 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:36.719181 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-kwjx6" Apr 24 21:28:36.721246 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:36.721213 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5597f7ddb8-xbn5m" event={"ID":"dc45bc44-5f31-4395-9780-dbb505c72767","Type":"ContainerStarted","Data":"260dd638b204a912f4f68ef6774979a59d1efbf3d5186aab0cfb4554298afe80"} Apr 24 21:28:36.721381 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:36.721251 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5597f7ddb8-xbn5m" 
event={"ID":"dc45bc44-5f31-4395-9780-dbb505c72767","Type":"ContainerStarted","Data":"dd34acfabb8172f969f608e647431a8a68e750638f5fc63426a4e91734ed3854"} Apr 24 21:28:36.722482 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:36.722447 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d55bd964c-9dkxz" event={"ID":"eef9f8be-8fc5-4474-8fb2-b13404c290cc","Type":"ContainerStarted","Data":"393aba2a19fc1d55b0a4926084948d2f999dada56396b5eaaad126bf145da2c5"} Apr 24 21:28:36.752498 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:36.752436 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-kwjx6" podStartSLOduration=68.091414329 podStartE2EDuration="1m11.752420366s" podCreationTimestamp="2026-04-24 21:27:25 +0000 UTC" firstStartedPulling="2026-04-24 21:28:31.935038549 +0000 UTC m=+67.120326866" lastFinishedPulling="2026-04-24 21:28:35.596044573 +0000 UTC m=+70.781332903" observedRunningTime="2026-04-24 21:28:36.733796485 +0000 UTC m=+71.919084825" watchObservedRunningTime="2026-04-24 21:28:36.752420366 +0000 UTC m=+71.937708707" Apr 24 21:28:36.752746 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:36.752621 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-5597f7ddb8-xbn5m" podStartSLOduration=2.263828314 podStartE2EDuration="6.752611695s" podCreationTimestamp="2026-04-24 21:28:30 +0000 UTC" firstStartedPulling="2026-04-24 21:28:31.108986724 +0000 UTC m=+66.294275062" lastFinishedPulling="2026-04-24 21:28:35.597770108 +0000 UTC m=+70.783058443" observedRunningTime="2026-04-24 21:28:36.752172705 +0000 UTC m=+71.937461056" watchObservedRunningTime="2026-04-24 21:28:36.752611695 +0000 UTC m=+71.937900036" Apr 24 21:28:38.712324 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:38.712296 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-monitoring/thanos-querier-5c88dc86bd-p22fx" Apr 24 21:28:38.729194 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:38.729165 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d55bd964c-9dkxz" event={"ID":"eef9f8be-8fc5-4474-8fb2-b13404c290cc","Type":"ContainerStarted","Data":"b0c2a44355b0a23697556c14f1e1538298a26c11bc474ceb3f22c8e5a23fbedb"} Apr 24 21:28:38.755265 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:38.755157 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5d55bd964c-9dkxz" podStartSLOduration=1.1575467 podStartE2EDuration="3.755136682s" podCreationTimestamp="2026-04-24 21:28:35 +0000 UTC" firstStartedPulling="2026-04-24 21:28:35.886025213 +0000 UTC m=+71.071313530" lastFinishedPulling="2026-04-24 21:28:38.483615195 +0000 UTC m=+73.668903512" observedRunningTime="2026-04-24 21:28:38.754242678 +0000 UTC m=+73.939531017" watchObservedRunningTime="2026-04-24 21:28:38.755136682 +0000 UTC m=+73.940425023" Apr 24 21:28:38.970772 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:38.970732 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5d55bd964c-9dkxz"] Apr 24 21:28:39.004682 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:39.004647 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5c4c7d4fb6-464cl"] Apr 24 21:28:39.008032 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:39.007991 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c4c7d4fb6-464cl" Apr 24 21:28:39.025460 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:39.025432 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c4c7d4fb6-464cl"] Apr 24 21:28:39.117118 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:39.117079 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2903f619-7a28-4b5f-9e0e-80b978cb38db-console-serving-cert\") pod \"console-5c4c7d4fb6-464cl\" (UID: \"2903f619-7a28-4b5f-9e0e-80b978cb38db\") " pod="openshift-console/console-5c4c7d4fb6-464cl" Apr 24 21:28:39.117311 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:39.117132 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2903f619-7a28-4b5f-9e0e-80b978cb38db-service-ca\") pod \"console-5c4c7d4fb6-464cl\" (UID: \"2903f619-7a28-4b5f-9e0e-80b978cb38db\") " pod="openshift-console/console-5c4c7d4fb6-464cl" Apr 24 21:28:39.117311 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:39.117155 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjsdm\" (UniqueName: \"kubernetes.io/projected/2903f619-7a28-4b5f-9e0e-80b978cb38db-kube-api-access-xjsdm\") pod \"console-5c4c7d4fb6-464cl\" (UID: \"2903f619-7a28-4b5f-9e0e-80b978cb38db\") " pod="openshift-console/console-5c4c7d4fb6-464cl" Apr 24 21:28:39.117311 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:39.117174 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2903f619-7a28-4b5f-9e0e-80b978cb38db-console-oauth-config\") pod \"console-5c4c7d4fb6-464cl\" (UID: \"2903f619-7a28-4b5f-9e0e-80b978cb38db\") " pod="openshift-console/console-5c4c7d4fb6-464cl" Apr 24 
21:28:39.117311 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:39.117193 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2903f619-7a28-4b5f-9e0e-80b978cb38db-trusted-ca-bundle\") pod \"console-5c4c7d4fb6-464cl\" (UID: \"2903f619-7a28-4b5f-9e0e-80b978cb38db\") " pod="openshift-console/console-5c4c7d4fb6-464cl" Apr 24 21:28:39.117311 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:39.117217 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2903f619-7a28-4b5f-9e0e-80b978cb38db-oauth-serving-cert\") pod \"console-5c4c7d4fb6-464cl\" (UID: \"2903f619-7a28-4b5f-9e0e-80b978cb38db\") " pod="openshift-console/console-5c4c7d4fb6-464cl" Apr 24 21:28:39.117311 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:39.117255 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2903f619-7a28-4b5f-9e0e-80b978cb38db-console-config\") pod \"console-5c4c7d4fb6-464cl\" (UID: \"2903f619-7a28-4b5f-9e0e-80b978cb38db\") " pod="openshift-console/console-5c4c7d4fb6-464cl" Apr 24 21:28:39.217721 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:39.217676 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2903f619-7a28-4b5f-9e0e-80b978cb38db-trusted-ca-bundle\") pod \"console-5c4c7d4fb6-464cl\" (UID: \"2903f619-7a28-4b5f-9e0e-80b978cb38db\") " pod="openshift-console/console-5c4c7d4fb6-464cl" Apr 24 21:28:39.217920 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:39.217730 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2903f619-7a28-4b5f-9e0e-80b978cb38db-oauth-serving-cert\") pod 
\"console-5c4c7d4fb6-464cl\" (UID: \"2903f619-7a28-4b5f-9e0e-80b978cb38db\") " pod="openshift-console/console-5c4c7d4fb6-464cl" Apr 24 21:28:39.217920 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:39.217753 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2903f619-7a28-4b5f-9e0e-80b978cb38db-console-config\") pod \"console-5c4c7d4fb6-464cl\" (UID: \"2903f619-7a28-4b5f-9e0e-80b978cb38db\") " pod="openshift-console/console-5c4c7d4fb6-464cl" Apr 24 21:28:39.217920 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:39.217789 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2903f619-7a28-4b5f-9e0e-80b978cb38db-console-serving-cert\") pod \"console-5c4c7d4fb6-464cl\" (UID: \"2903f619-7a28-4b5f-9e0e-80b978cb38db\") " pod="openshift-console/console-5c4c7d4fb6-464cl" Apr 24 21:28:39.217920 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:39.217820 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2903f619-7a28-4b5f-9e0e-80b978cb38db-service-ca\") pod \"console-5c4c7d4fb6-464cl\" (UID: \"2903f619-7a28-4b5f-9e0e-80b978cb38db\") " pod="openshift-console/console-5c4c7d4fb6-464cl" Apr 24 21:28:39.217920 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:39.217837 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xjsdm\" (UniqueName: \"kubernetes.io/projected/2903f619-7a28-4b5f-9e0e-80b978cb38db-kube-api-access-xjsdm\") pod \"console-5c4c7d4fb6-464cl\" (UID: \"2903f619-7a28-4b5f-9e0e-80b978cb38db\") " pod="openshift-console/console-5c4c7d4fb6-464cl" Apr 24 21:28:39.217920 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:39.217863 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/2903f619-7a28-4b5f-9e0e-80b978cb38db-console-oauth-config\") pod \"console-5c4c7d4fb6-464cl\" (UID: \"2903f619-7a28-4b5f-9e0e-80b978cb38db\") " pod="openshift-console/console-5c4c7d4fb6-464cl" Apr 24 21:28:39.218591 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:39.218564 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2903f619-7a28-4b5f-9e0e-80b978cb38db-service-ca\") pod \"console-5c4c7d4fb6-464cl\" (UID: \"2903f619-7a28-4b5f-9e0e-80b978cb38db\") " pod="openshift-console/console-5c4c7d4fb6-464cl" Apr 24 21:28:39.218591 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:39.218583 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2903f619-7a28-4b5f-9e0e-80b978cb38db-trusted-ca-bundle\") pod \"console-5c4c7d4fb6-464cl\" (UID: \"2903f619-7a28-4b5f-9e0e-80b978cb38db\") " pod="openshift-console/console-5c4c7d4fb6-464cl" Apr 24 21:28:39.218908 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:39.218882 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2903f619-7a28-4b5f-9e0e-80b978cb38db-oauth-serving-cert\") pod \"console-5c4c7d4fb6-464cl\" (UID: \"2903f619-7a28-4b5f-9e0e-80b978cb38db\") " pod="openshift-console/console-5c4c7d4fb6-464cl" Apr 24 21:28:39.219114 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:39.219089 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2903f619-7a28-4b5f-9e0e-80b978cb38db-console-config\") pod \"console-5c4c7d4fb6-464cl\" (UID: \"2903f619-7a28-4b5f-9e0e-80b978cb38db\") " pod="openshift-console/console-5c4c7d4fb6-464cl" Apr 24 21:28:39.220451 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:39.220417 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/2903f619-7a28-4b5f-9e0e-80b978cb38db-console-serving-cert\") pod \"console-5c4c7d4fb6-464cl\" (UID: \"2903f619-7a28-4b5f-9e0e-80b978cb38db\") " pod="openshift-console/console-5c4c7d4fb6-464cl" Apr 24 21:28:39.220515 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:39.220476 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2903f619-7a28-4b5f-9e0e-80b978cb38db-console-oauth-config\") pod \"console-5c4c7d4fb6-464cl\" (UID: \"2903f619-7a28-4b5f-9e0e-80b978cb38db\") " pod="openshift-console/console-5c4c7d4fb6-464cl" Apr 24 21:28:39.227896 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:39.227876 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjsdm\" (UniqueName: \"kubernetes.io/projected/2903f619-7a28-4b5f-9e0e-80b978cb38db-kube-api-access-xjsdm\") pod \"console-5c4c7d4fb6-464cl\" (UID: \"2903f619-7a28-4b5f-9e0e-80b978cb38db\") " pod="openshift-console/console-5c4c7d4fb6-464cl" Apr 24 21:28:39.316688 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:39.316607 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c4c7d4fb6-464cl" Apr 24 21:28:39.435329 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:39.435285 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c4c7d4fb6-464cl"] Apr 24 21:28:39.438338 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:28:39.438309 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2903f619_7a28_4b5f_9e0e_80b978cb38db.slice/crio-89eacf4b599a45ff2535668bf49d498b19c6451c7ce21ca97a34c132c8e6a55a WatchSource:0}: Error finding container 89eacf4b599a45ff2535668bf49d498b19c6451c7ce21ca97a34c132c8e6a55a: Status 404 returned error can't find the container with id 89eacf4b599a45ff2535668bf49d498b19c6451c7ce21ca97a34c132c8e6a55a Apr 24 21:28:39.632400 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:39.632319 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-847fd97c69-9sv27" Apr 24 21:28:39.733136 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:39.733099 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c4c7d4fb6-464cl" event={"ID":"2903f619-7a28-4b5f-9e0e-80b978cb38db","Type":"ContainerStarted","Data":"1f7dd791ec7f381a03b981b1ec54cc2e9367c345fe23e512443c338d86be0e2a"} Apr 24 21:28:39.733136 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:39.733135 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c4c7d4fb6-464cl" event={"ID":"2903f619-7a28-4b5f-9e0e-80b978cb38db","Type":"ContainerStarted","Data":"89eacf4b599a45ff2535668bf49d498b19c6451c7ce21ca97a34c132c8e6a55a"} Apr 24 21:28:39.761745 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:39.761701 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5c4c7d4fb6-464cl" podStartSLOduration=1.7616857339999998 podStartE2EDuration="1.761685734s" 
podCreationTimestamp="2026-04-24 21:28:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:28:39.760334282 +0000 UTC m=+74.945622621" watchObservedRunningTime="2026-04-24 21:28:39.761685734 +0000 UTC m=+74.946974073" Apr 24 21:28:45.760778 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:45.760702 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5d55bd964c-9dkxz" Apr 24 21:28:49.317309 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:49.317253 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5c4c7d4fb6-464cl" Apr 24 21:28:49.317309 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:49.317319 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5c4c7d4fb6-464cl" Apr 24 21:28:49.322217 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:49.322193 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5c4c7d4fb6-464cl" Apr 24 21:28:49.769224 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:49.769196 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5c4c7d4fb6-464cl" Apr 24 21:28:50.268046 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:50.268003 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-5d4ff5bf67-dwkdb" Apr 24 21:28:50.268218 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:28:50.268060 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-5d4ff5bf67-dwkdb" Apr 24 21:29:05.754457 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:05.754394 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5d55bd964c-9dkxz" 
podUID="eef9f8be-8fc5-4474-8fb2-b13404c290cc" containerName="console" containerID="cri-o://b0c2a44355b0a23697556c14f1e1538298a26c11bc474ceb3f22c8e5a23fbedb" gracePeriod=15 Apr 24 21:29:05.999485 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:05.999464 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d55bd964c-9dkxz_eef9f8be-8fc5-4474-8fb2-b13404c290cc/console/0.log" Apr 24 21:29:05.999592 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:05.999534 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d55bd964c-9dkxz" Apr 24 21:29:06.144721 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:06.144632 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eef9f8be-8fc5-4474-8fb2-b13404c290cc-console-serving-cert\") pod \"eef9f8be-8fc5-4474-8fb2-b13404c290cc\" (UID: \"eef9f8be-8fc5-4474-8fb2-b13404c290cc\") " Apr 24 21:29:06.144721 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:06.144709 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eef9f8be-8fc5-4474-8fb2-b13404c290cc-oauth-serving-cert\") pod \"eef9f8be-8fc5-4474-8fb2-b13404c290cc\" (UID: \"eef9f8be-8fc5-4474-8fb2-b13404c290cc\") " Apr 24 21:29:06.144976 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:06.144729 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w4gl\" (UniqueName: \"kubernetes.io/projected/eef9f8be-8fc5-4474-8fb2-b13404c290cc-kube-api-access-2w4gl\") pod \"eef9f8be-8fc5-4474-8fb2-b13404c290cc\" (UID: \"eef9f8be-8fc5-4474-8fb2-b13404c290cc\") " Apr 24 21:29:06.144976 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:06.144824 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/eef9f8be-8fc5-4474-8fb2-b13404c290cc-trusted-ca-bundle\") pod \"eef9f8be-8fc5-4474-8fb2-b13404c290cc\" (UID: \"eef9f8be-8fc5-4474-8fb2-b13404c290cc\") " Apr 24 21:29:06.144976 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:06.144881 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eef9f8be-8fc5-4474-8fb2-b13404c290cc-console-oauth-config\") pod \"eef9f8be-8fc5-4474-8fb2-b13404c290cc\" (UID: \"eef9f8be-8fc5-4474-8fb2-b13404c290cc\") " Apr 24 21:29:06.144976 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:06.144917 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eef9f8be-8fc5-4474-8fb2-b13404c290cc-console-config\") pod \"eef9f8be-8fc5-4474-8fb2-b13404c290cc\" (UID: \"eef9f8be-8fc5-4474-8fb2-b13404c290cc\") " Apr 24 21:29:06.144976 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:06.144951 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eef9f8be-8fc5-4474-8fb2-b13404c290cc-service-ca\") pod \"eef9f8be-8fc5-4474-8fb2-b13404c290cc\" (UID: \"eef9f8be-8fc5-4474-8fb2-b13404c290cc\") " Apr 24 21:29:06.145245 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:06.145210 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eef9f8be-8fc5-4474-8fb2-b13404c290cc-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "eef9f8be-8fc5-4474-8fb2-b13404c290cc" (UID: "eef9f8be-8fc5-4474-8fb2-b13404c290cc"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:29:06.145360 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:06.145327 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eef9f8be-8fc5-4474-8fb2-b13404c290cc-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "eef9f8be-8fc5-4474-8fb2-b13404c290cc" (UID: "eef9f8be-8fc5-4474-8fb2-b13404c290cc"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:29:06.145547 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:06.145378 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eef9f8be-8fc5-4474-8fb2-b13404c290cc-console-config" (OuterVolumeSpecName: "console-config") pod "eef9f8be-8fc5-4474-8fb2-b13404c290cc" (UID: "eef9f8be-8fc5-4474-8fb2-b13404c290cc"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:29:06.145547 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:06.145455 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eef9f8be-8fc5-4474-8fb2-b13404c290cc-service-ca" (OuterVolumeSpecName: "service-ca") pod "eef9f8be-8fc5-4474-8fb2-b13404c290cc" (UID: "eef9f8be-8fc5-4474-8fb2-b13404c290cc"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:29:06.146994 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:06.146951 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eef9f8be-8fc5-4474-8fb2-b13404c290cc-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "eef9f8be-8fc5-4474-8fb2-b13404c290cc" (UID: "eef9f8be-8fc5-4474-8fb2-b13404c290cc"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:29:06.147078 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:06.147038 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eef9f8be-8fc5-4474-8fb2-b13404c290cc-kube-api-access-2w4gl" (OuterVolumeSpecName: "kube-api-access-2w4gl") pod "eef9f8be-8fc5-4474-8fb2-b13404c290cc" (UID: "eef9f8be-8fc5-4474-8fb2-b13404c290cc"). InnerVolumeSpecName "kube-api-access-2w4gl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:29:06.147127 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:06.147067 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eef9f8be-8fc5-4474-8fb2-b13404c290cc-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "eef9f8be-8fc5-4474-8fb2-b13404c290cc" (UID: "eef9f8be-8fc5-4474-8fb2-b13404c290cc"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:29:06.245860 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:06.245835 2566 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eef9f8be-8fc5-4474-8fb2-b13404c290cc-oauth-serving-cert\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\"" Apr 24 21:29:06.245860 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:06.245857 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2w4gl\" (UniqueName: \"kubernetes.io/projected/eef9f8be-8fc5-4474-8fb2-b13404c290cc-kube-api-access-2w4gl\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\"" Apr 24 21:29:06.245860 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:06.245866 2566 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eef9f8be-8fc5-4474-8fb2-b13404c290cc-trusted-ca-bundle\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\"" Apr 24 21:29:06.246072 
ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:06.245876 2566 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eef9f8be-8fc5-4474-8fb2-b13404c290cc-console-oauth-config\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\"" Apr 24 21:29:06.246072 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:06.245885 2566 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eef9f8be-8fc5-4474-8fb2-b13404c290cc-console-config\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\"" Apr 24 21:29:06.246072 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:06.245894 2566 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eef9f8be-8fc5-4474-8fb2-b13404c290cc-service-ca\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\"" Apr 24 21:29:06.246072 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:06.245902 2566 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eef9f8be-8fc5-4474-8fb2-b13404c290cc-console-serving-cert\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\"" Apr 24 21:29:06.810765 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:06.810736 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d55bd964c-9dkxz_eef9f8be-8fc5-4474-8fb2-b13404c290cc/console/0.log" Apr 24 21:29:06.811231 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:06.810777 2566 generic.go:358] "Generic (PLEG): container finished" podID="eef9f8be-8fc5-4474-8fb2-b13404c290cc" containerID="b0c2a44355b0a23697556c14f1e1538298a26c11bc474ceb3f22c8e5a23fbedb" exitCode=2 Apr 24 21:29:06.811231 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:06.810842 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d55bd964c-9dkxz" 
event={"ID":"eef9f8be-8fc5-4474-8fb2-b13404c290cc","Type":"ContainerDied","Data":"b0c2a44355b0a23697556c14f1e1538298a26c11bc474ceb3f22c8e5a23fbedb"} Apr 24 21:29:06.811231 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:06.810844 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d55bd964c-9dkxz" Apr 24 21:29:06.811231 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:06.810865 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d55bd964c-9dkxz" event={"ID":"eef9f8be-8fc5-4474-8fb2-b13404c290cc","Type":"ContainerDied","Data":"393aba2a19fc1d55b0a4926084948d2f999dada56396b5eaaad126bf145da2c5"} Apr 24 21:29:06.811231 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:06.810880 2566 scope.go:117] "RemoveContainer" containerID="b0c2a44355b0a23697556c14f1e1538298a26c11bc474ceb3f22c8e5a23fbedb" Apr 24 21:29:06.819311 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:06.819292 2566 scope.go:117] "RemoveContainer" containerID="b0c2a44355b0a23697556c14f1e1538298a26c11bc474ceb3f22c8e5a23fbedb" Apr 24 21:29:06.819602 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:29:06.819578 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0c2a44355b0a23697556c14f1e1538298a26c11bc474ceb3f22c8e5a23fbedb\": container with ID starting with b0c2a44355b0a23697556c14f1e1538298a26c11bc474ceb3f22c8e5a23fbedb not found: ID does not exist" containerID="b0c2a44355b0a23697556c14f1e1538298a26c11bc474ceb3f22c8e5a23fbedb" Apr 24 21:29:06.819680 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:06.819606 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0c2a44355b0a23697556c14f1e1538298a26c11bc474ceb3f22c8e5a23fbedb"} err="failed to get container status \"b0c2a44355b0a23697556c14f1e1538298a26c11bc474ceb3f22c8e5a23fbedb\": rpc error: code = NotFound desc = could not find container 
\"b0c2a44355b0a23697556c14f1e1538298a26c11bc474ceb3f22c8e5a23fbedb\": container with ID starting with b0c2a44355b0a23697556c14f1e1538298a26c11bc474ceb3f22c8e5a23fbedb not found: ID does not exist" Apr 24 21:29:06.849825 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:06.849796 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5d55bd964c-9dkxz"] Apr 24 21:29:06.860731 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:06.860713 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5d55bd964c-9dkxz"] Apr 24 21:29:07.380545 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:07.380510 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eef9f8be-8fc5-4474-8fb2-b13404c290cc" path="/var/lib/kubelet/pods/eef9f8be-8fc5-4474-8fb2-b13404c290cc/volumes" Apr 24 21:29:07.728240 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:07.728211 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-kwjx6" Apr 24 21:29:10.272607 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:10.272568 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-5d4ff5bf67-dwkdb" Apr 24 21:29:10.276391 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:10.276365 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-5d4ff5bf67-dwkdb" Apr 24 21:29:46.557577 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:46.557543 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-dd9f5dbfb-btccq"] Apr 24 21:29:46.558163 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:46.557916 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eef9f8be-8fc5-4474-8fb2-b13404c290cc" containerName="console" Apr 24 21:29:46.558163 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:46.557933 2566 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="eef9f8be-8fc5-4474-8fb2-b13404c290cc" containerName="console" Apr 24 21:29:46.558163 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:46.558041 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="eef9f8be-8fc5-4474-8fb2-b13404c290cc" containerName="console" Apr 24 21:29:46.560869 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:46.560849 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-dd9f5dbfb-btccq" Apr 24 21:29:46.573139 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:46.573116 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-dd9f5dbfb-btccq"] Apr 24 21:29:46.620777 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:46.620752 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/215db3af-4b00-43ad-887d-b028bf9befeb-console-config\") pod \"console-dd9f5dbfb-btccq\" (UID: \"215db3af-4b00-43ad-887d-b028bf9befeb\") " pod="openshift-console/console-dd9f5dbfb-btccq" Apr 24 21:29:46.620879 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:46.620779 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/215db3af-4b00-43ad-887d-b028bf9befeb-console-oauth-config\") pod \"console-dd9f5dbfb-btccq\" (UID: \"215db3af-4b00-43ad-887d-b028bf9befeb\") " pod="openshift-console/console-dd9f5dbfb-btccq" Apr 24 21:29:46.620879 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:46.620804 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9l77\" (UniqueName: \"kubernetes.io/projected/215db3af-4b00-43ad-887d-b028bf9befeb-kube-api-access-w9l77\") pod \"console-dd9f5dbfb-btccq\" (UID: \"215db3af-4b00-43ad-887d-b028bf9befeb\") " pod="openshift-console/console-dd9f5dbfb-btccq" Apr 
24 21:29:46.620985 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:46.620881 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/215db3af-4b00-43ad-887d-b028bf9befeb-service-ca\") pod \"console-dd9f5dbfb-btccq\" (UID: \"215db3af-4b00-43ad-887d-b028bf9befeb\") " pod="openshift-console/console-dd9f5dbfb-btccq" Apr 24 21:29:46.620985 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:46.620913 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/215db3af-4b00-43ad-887d-b028bf9befeb-oauth-serving-cert\") pod \"console-dd9f5dbfb-btccq\" (UID: \"215db3af-4b00-43ad-887d-b028bf9befeb\") " pod="openshift-console/console-dd9f5dbfb-btccq" Apr 24 21:29:46.620985 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:46.620936 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/215db3af-4b00-43ad-887d-b028bf9befeb-trusted-ca-bundle\") pod \"console-dd9f5dbfb-btccq\" (UID: \"215db3af-4b00-43ad-887d-b028bf9befeb\") " pod="openshift-console/console-dd9f5dbfb-btccq" Apr 24 21:29:46.620985 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:46.620972 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/215db3af-4b00-43ad-887d-b028bf9befeb-console-serving-cert\") pod \"console-dd9f5dbfb-btccq\" (UID: \"215db3af-4b00-43ad-887d-b028bf9befeb\") " pod="openshift-console/console-dd9f5dbfb-btccq" Apr 24 21:29:46.722098 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:46.722056 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/215db3af-4b00-43ad-887d-b028bf9befeb-trusted-ca-bundle\") pod 
\"console-dd9f5dbfb-btccq\" (UID: \"215db3af-4b00-43ad-887d-b028bf9befeb\") " pod="openshift-console/console-dd9f5dbfb-btccq" Apr 24 21:29:46.722098 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:46.722100 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/215db3af-4b00-43ad-887d-b028bf9befeb-console-serving-cert\") pod \"console-dd9f5dbfb-btccq\" (UID: \"215db3af-4b00-43ad-887d-b028bf9befeb\") " pod="openshift-console/console-dd9f5dbfb-btccq" Apr 24 21:29:46.722378 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:46.722131 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/215db3af-4b00-43ad-887d-b028bf9befeb-console-config\") pod \"console-dd9f5dbfb-btccq\" (UID: \"215db3af-4b00-43ad-887d-b028bf9befeb\") " pod="openshift-console/console-dd9f5dbfb-btccq" Apr 24 21:29:46.722378 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:46.722179 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/215db3af-4b00-43ad-887d-b028bf9befeb-console-oauth-config\") pod \"console-dd9f5dbfb-btccq\" (UID: \"215db3af-4b00-43ad-887d-b028bf9befeb\") " pod="openshift-console/console-dd9f5dbfb-btccq" Apr 24 21:29:46.722378 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:46.722225 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w9l77\" (UniqueName: \"kubernetes.io/projected/215db3af-4b00-43ad-887d-b028bf9befeb-kube-api-access-w9l77\") pod \"console-dd9f5dbfb-btccq\" (UID: \"215db3af-4b00-43ad-887d-b028bf9befeb\") " pod="openshift-console/console-dd9f5dbfb-btccq" Apr 24 21:29:46.722378 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:46.722274 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/215db3af-4b00-43ad-887d-b028bf9befeb-service-ca\") pod \"console-dd9f5dbfb-btccq\" (UID: \"215db3af-4b00-43ad-887d-b028bf9befeb\") " pod="openshift-console/console-dd9f5dbfb-btccq" Apr 24 21:29:46.722378 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:46.722304 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/215db3af-4b00-43ad-887d-b028bf9befeb-oauth-serving-cert\") pod \"console-dd9f5dbfb-btccq\" (UID: \"215db3af-4b00-43ad-887d-b028bf9befeb\") " pod="openshift-console/console-dd9f5dbfb-btccq" Apr 24 21:29:46.722927 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:46.722908 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/215db3af-4b00-43ad-887d-b028bf9befeb-oauth-serving-cert\") pod \"console-dd9f5dbfb-btccq\" (UID: \"215db3af-4b00-43ad-887d-b028bf9befeb\") " pod="openshift-console/console-dd9f5dbfb-btccq" Apr 24 21:29:46.722927 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:46.722921 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/215db3af-4b00-43ad-887d-b028bf9befeb-console-config\") pod \"console-dd9f5dbfb-btccq\" (UID: \"215db3af-4b00-43ad-887d-b028bf9befeb\") " pod="openshift-console/console-dd9f5dbfb-btccq" Apr 24 21:29:46.723100 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:46.723041 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/215db3af-4b00-43ad-887d-b028bf9befeb-trusted-ca-bundle\") pod \"console-dd9f5dbfb-btccq\" (UID: \"215db3af-4b00-43ad-887d-b028bf9befeb\") " pod="openshift-console/console-dd9f5dbfb-btccq" Apr 24 21:29:46.723100 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:46.723071 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/215db3af-4b00-43ad-887d-b028bf9befeb-service-ca\") pod \"console-dd9f5dbfb-btccq\" (UID: \"215db3af-4b00-43ad-887d-b028bf9befeb\") " pod="openshift-console/console-dd9f5dbfb-btccq" Apr 24 21:29:46.724600 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:46.724574 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/215db3af-4b00-43ad-887d-b028bf9befeb-console-oauth-config\") pod \"console-dd9f5dbfb-btccq\" (UID: \"215db3af-4b00-43ad-887d-b028bf9befeb\") " pod="openshift-console/console-dd9f5dbfb-btccq" Apr 24 21:29:46.724787 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:46.724765 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/215db3af-4b00-43ad-887d-b028bf9befeb-console-serving-cert\") pod \"console-dd9f5dbfb-btccq\" (UID: \"215db3af-4b00-43ad-887d-b028bf9befeb\") " pod="openshift-console/console-dd9f5dbfb-btccq" Apr 24 21:29:46.732102 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:46.732075 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9l77\" (UniqueName: \"kubernetes.io/projected/215db3af-4b00-43ad-887d-b028bf9befeb-kube-api-access-w9l77\") pod \"console-dd9f5dbfb-btccq\" (UID: \"215db3af-4b00-43ad-887d-b028bf9befeb\") " pod="openshift-console/console-dd9f5dbfb-btccq" Apr 24 21:29:46.868884 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:46.868797 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-dd9f5dbfb-btccq" Apr 24 21:29:47.020184 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:47.020149 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-dd9f5dbfb-btccq"] Apr 24 21:29:47.022582 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:29:47.022558 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod215db3af_4b00_43ad_887d_b028bf9befeb.slice/crio-2342ffa6c4c4178b69c60743f741423e950511757e48bb7d4dabb01768081392 WatchSource:0}: Error finding container 2342ffa6c4c4178b69c60743f741423e950511757e48bb7d4dabb01768081392: Status 404 returned error can't find the container with id 2342ffa6c4c4178b69c60743f741423e950511757e48bb7d4dabb01768081392 Apr 24 21:29:47.946443 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:47.946404 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dd9f5dbfb-btccq" event={"ID":"215db3af-4b00-43ad-887d-b028bf9befeb","Type":"ContainerStarted","Data":"067fdce6ef51a099a64fbc3242e492281a0fda24cdaa796f4630d25052d5b29b"} Apr 24 21:29:47.946443 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:47.946445 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dd9f5dbfb-btccq" event={"ID":"215db3af-4b00-43ad-887d-b028bf9befeb","Type":"ContainerStarted","Data":"2342ffa6c4c4178b69c60743f741423e950511757e48bb7d4dabb01768081392"} Apr 24 21:29:47.968110 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:47.968056 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-dd9f5dbfb-btccq" podStartSLOduration=1.9680394890000001 podStartE2EDuration="1.968039489s" podCreationTimestamp="2026-04-24 21:29:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:29:47.967211853 +0000 UTC m=+143.152500262" 
watchObservedRunningTime="2026-04-24 21:29:47.968039489 +0000 UTC m=+143.153327829" Apr 24 21:29:56.869232 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:56.869179 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-dd9f5dbfb-btccq" Apr 24 21:29:56.869232 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:56.869247 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-dd9f5dbfb-btccq" Apr 24 21:29:56.874094 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:56.874069 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-dd9f5dbfb-btccq" Apr 24 21:29:56.974408 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:56.974384 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-dd9f5dbfb-btccq" Apr 24 21:29:57.051972 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:29:57.051925 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c4c7d4fb6-464cl"] Apr 24 21:30:22.072563 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:22.072483 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5c4c7d4fb6-464cl" podUID="2903f619-7a28-4b5f-9e0e-80b978cb38db" containerName="console" containerID="cri-o://1f7dd791ec7f381a03b981b1ec54cc2e9367c345fe23e512443c338d86be0e2a" gracePeriod=15 Apr 24 21:30:22.297328 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:22.297307 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c4c7d4fb6-464cl_2903f619-7a28-4b5f-9e0e-80b978cb38db/console/0.log" Apr 24 21:30:22.297439 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:22.297366 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c4c7d4fb6-464cl" Apr 24 21:30:22.383496 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:22.383433 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2903f619-7a28-4b5f-9e0e-80b978cb38db-console-oauth-config\") pod \"2903f619-7a28-4b5f-9e0e-80b978cb38db\" (UID: \"2903f619-7a28-4b5f-9e0e-80b978cb38db\") " Apr 24 21:30:22.383496 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:22.383471 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjsdm\" (UniqueName: \"kubernetes.io/projected/2903f619-7a28-4b5f-9e0e-80b978cb38db-kube-api-access-xjsdm\") pod \"2903f619-7a28-4b5f-9e0e-80b978cb38db\" (UID: \"2903f619-7a28-4b5f-9e0e-80b978cb38db\") " Apr 24 21:30:22.383650 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:22.383500 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2903f619-7a28-4b5f-9e0e-80b978cb38db-console-serving-cert\") pod \"2903f619-7a28-4b5f-9e0e-80b978cb38db\" (UID: \"2903f619-7a28-4b5f-9e0e-80b978cb38db\") " Apr 24 21:30:22.383650 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:22.383535 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2903f619-7a28-4b5f-9e0e-80b978cb38db-console-config\") pod \"2903f619-7a28-4b5f-9e0e-80b978cb38db\" (UID: \"2903f619-7a28-4b5f-9e0e-80b978cb38db\") " Apr 24 21:30:22.383650 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:22.383558 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2903f619-7a28-4b5f-9e0e-80b978cb38db-service-ca\") pod \"2903f619-7a28-4b5f-9e0e-80b978cb38db\" (UID: \"2903f619-7a28-4b5f-9e0e-80b978cb38db\") " Apr 24 21:30:22.383650 ip-10-0-131-55 
kubenswrapper[2566]: I0424 21:30:22.383578 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2903f619-7a28-4b5f-9e0e-80b978cb38db-oauth-serving-cert\") pod \"2903f619-7a28-4b5f-9e0e-80b978cb38db\" (UID: \"2903f619-7a28-4b5f-9e0e-80b978cb38db\") " Apr 24 21:30:22.383650 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:22.383617 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2903f619-7a28-4b5f-9e0e-80b978cb38db-trusted-ca-bundle\") pod \"2903f619-7a28-4b5f-9e0e-80b978cb38db\" (UID: \"2903f619-7a28-4b5f-9e0e-80b978cb38db\") " Apr 24 21:30:22.384132 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:22.384071 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2903f619-7a28-4b5f-9e0e-80b978cb38db-service-ca" (OuterVolumeSpecName: "service-ca") pod "2903f619-7a28-4b5f-9e0e-80b978cb38db" (UID: "2903f619-7a28-4b5f-9e0e-80b978cb38db"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:30:22.384132 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:22.384088 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2903f619-7a28-4b5f-9e0e-80b978cb38db-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "2903f619-7a28-4b5f-9e0e-80b978cb38db" (UID: "2903f619-7a28-4b5f-9e0e-80b978cb38db"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:30:22.384132 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:22.384077 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2903f619-7a28-4b5f-9e0e-80b978cb38db-console-config" (OuterVolumeSpecName: "console-config") pod "2903f619-7a28-4b5f-9e0e-80b978cb38db" (UID: "2903f619-7a28-4b5f-9e0e-80b978cb38db"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:30:22.384338 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:22.384145 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2903f619-7a28-4b5f-9e0e-80b978cb38db-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "2903f619-7a28-4b5f-9e0e-80b978cb38db" (UID: "2903f619-7a28-4b5f-9e0e-80b978cb38db"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:30:22.385555 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:22.385522 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2903f619-7a28-4b5f-9e0e-80b978cb38db-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "2903f619-7a28-4b5f-9e0e-80b978cb38db" (UID: "2903f619-7a28-4b5f-9e0e-80b978cb38db"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:22.385634 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:22.385586 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2903f619-7a28-4b5f-9e0e-80b978cb38db-kube-api-access-xjsdm" (OuterVolumeSpecName: "kube-api-access-xjsdm") pod "2903f619-7a28-4b5f-9e0e-80b978cb38db" (UID: "2903f619-7a28-4b5f-9e0e-80b978cb38db"). InnerVolumeSpecName "kube-api-access-xjsdm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:30:22.385634 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:22.385593 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2903f619-7a28-4b5f-9e0e-80b978cb38db-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "2903f619-7a28-4b5f-9e0e-80b978cb38db" (UID: "2903f619-7a28-4b5f-9e0e-80b978cb38db"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:22.484231 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:22.484199 2566 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2903f619-7a28-4b5f-9e0e-80b978cb38db-console-oauth-config\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\"" Apr 24 21:30:22.484231 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:22.484229 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xjsdm\" (UniqueName: \"kubernetes.io/projected/2903f619-7a28-4b5f-9e0e-80b978cb38db-kube-api-access-xjsdm\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\"" Apr 24 21:30:22.484231 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:22.484239 2566 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2903f619-7a28-4b5f-9e0e-80b978cb38db-console-serving-cert\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\"" Apr 24 21:30:22.484377 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:22.484250 2566 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2903f619-7a28-4b5f-9e0e-80b978cb38db-console-config\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\"" Apr 24 21:30:22.484377 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:22.484258 2566 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/2903f619-7a28-4b5f-9e0e-80b978cb38db-service-ca\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\"" Apr 24 21:30:22.484377 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:22.484266 2566 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2903f619-7a28-4b5f-9e0e-80b978cb38db-oauth-serving-cert\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\"" Apr 24 21:30:22.484377 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:22.484274 2566 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2903f619-7a28-4b5f-9e0e-80b978cb38db-trusted-ca-bundle\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\"" Apr 24 21:30:23.037681 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:23.037655 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c4c7d4fb6-464cl_2903f619-7a28-4b5f-9e0e-80b978cb38db/console/0.log" Apr 24 21:30:23.037850 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:23.037693 2566 generic.go:358] "Generic (PLEG): container finished" podID="2903f619-7a28-4b5f-9e0e-80b978cb38db" containerID="1f7dd791ec7f381a03b981b1ec54cc2e9367c345fe23e512443c338d86be0e2a" exitCode=2 Apr 24 21:30:23.037850 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:23.037761 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c4c7d4fb6-464cl" Apr 24 21:30:23.037850 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:23.037773 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c4c7d4fb6-464cl" event={"ID":"2903f619-7a28-4b5f-9e0e-80b978cb38db","Type":"ContainerDied","Data":"1f7dd791ec7f381a03b981b1ec54cc2e9367c345fe23e512443c338d86be0e2a"} Apr 24 21:30:23.037850 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:23.037800 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c4c7d4fb6-464cl" event={"ID":"2903f619-7a28-4b5f-9e0e-80b978cb38db","Type":"ContainerDied","Data":"89eacf4b599a45ff2535668bf49d498b19c6451c7ce21ca97a34c132c8e6a55a"} Apr 24 21:30:23.037850 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:23.037814 2566 scope.go:117] "RemoveContainer" containerID="1f7dd791ec7f381a03b981b1ec54cc2e9367c345fe23e512443c338d86be0e2a" Apr 24 21:30:23.045497 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:23.045478 2566 scope.go:117] "RemoveContainer" containerID="1f7dd791ec7f381a03b981b1ec54cc2e9367c345fe23e512443c338d86be0e2a" Apr 24 21:30:23.045733 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:30:23.045714 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f7dd791ec7f381a03b981b1ec54cc2e9367c345fe23e512443c338d86be0e2a\": container with ID starting with 1f7dd791ec7f381a03b981b1ec54cc2e9367c345fe23e512443c338d86be0e2a not found: ID does not exist" containerID="1f7dd791ec7f381a03b981b1ec54cc2e9367c345fe23e512443c338d86be0e2a" Apr 24 21:30:23.045801 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:23.045743 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f7dd791ec7f381a03b981b1ec54cc2e9367c345fe23e512443c338d86be0e2a"} err="failed to get container status \"1f7dd791ec7f381a03b981b1ec54cc2e9367c345fe23e512443c338d86be0e2a\": rpc error: code = 
NotFound desc = could not find container \"1f7dd791ec7f381a03b981b1ec54cc2e9367c345fe23e512443c338d86be0e2a\": container with ID starting with 1f7dd791ec7f381a03b981b1ec54cc2e9367c345fe23e512443c338d86be0e2a not found: ID does not exist" Apr 24 21:30:23.058311 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:23.058292 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c4c7d4fb6-464cl"] Apr 24 21:30:23.062717 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:23.062697 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5c4c7d4fb6-464cl"] Apr 24 21:30:23.380003 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:23.379911 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2903f619-7a28-4b5f-9e0e-80b978cb38db" path="/var/lib/kubelet/pods/2903f619-7a28-4b5f-9e0e-80b978cb38db/volumes" Apr 24 21:30:30.673885 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:30.673850 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-46s9w"] Apr 24 21:30:30.674289 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:30.674172 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2903f619-7a28-4b5f-9e0e-80b978cb38db" containerName="console" Apr 24 21:30:30.674289 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:30.674187 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="2903f619-7a28-4b5f-9e0e-80b978cb38db" containerName="console" Apr 24 21:30:30.674289 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:30.674232 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="2903f619-7a28-4b5f-9e0e-80b978cb38db" containerName="console" Apr 24 21:30:30.678675 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:30.678657 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-46s9w" Apr 24 21:30:30.680641 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:30.680621 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 21:30:30.685056 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:30.685028 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-46s9w"] Apr 24 21:30:30.849861 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:30.849822 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5deb9c1a-4ef3-4e20-a3b3-5b3fb0ee1cfd-kubelet-config\") pod \"global-pull-secret-syncer-46s9w\" (UID: \"5deb9c1a-4ef3-4e20-a3b3-5b3fb0ee1cfd\") " pod="kube-system/global-pull-secret-syncer-46s9w" Apr 24 21:30:30.850065 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:30.849930 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5deb9c1a-4ef3-4e20-a3b3-5b3fb0ee1cfd-dbus\") pod \"global-pull-secret-syncer-46s9w\" (UID: \"5deb9c1a-4ef3-4e20-a3b3-5b3fb0ee1cfd\") " pod="kube-system/global-pull-secret-syncer-46s9w" Apr 24 21:30:30.850065 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:30.849988 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5deb9c1a-4ef3-4e20-a3b3-5b3fb0ee1cfd-original-pull-secret\") pod \"global-pull-secret-syncer-46s9w\" (UID: \"5deb9c1a-4ef3-4e20-a3b3-5b3fb0ee1cfd\") " pod="kube-system/global-pull-secret-syncer-46s9w" Apr 24 21:30:30.951213 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:30.951108 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/5deb9c1a-4ef3-4e20-a3b3-5b3fb0ee1cfd-kubelet-config\") pod \"global-pull-secret-syncer-46s9w\" (UID: \"5deb9c1a-4ef3-4e20-a3b3-5b3fb0ee1cfd\") " pod="kube-system/global-pull-secret-syncer-46s9w" Apr 24 21:30:30.951362 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:30.951223 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5deb9c1a-4ef3-4e20-a3b3-5b3fb0ee1cfd-dbus\") pod \"global-pull-secret-syncer-46s9w\" (UID: \"5deb9c1a-4ef3-4e20-a3b3-5b3fb0ee1cfd\") " pod="kube-system/global-pull-secret-syncer-46s9w" Apr 24 21:30:30.951362 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:30.951240 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5deb9c1a-4ef3-4e20-a3b3-5b3fb0ee1cfd-kubelet-config\") pod \"global-pull-secret-syncer-46s9w\" (UID: \"5deb9c1a-4ef3-4e20-a3b3-5b3fb0ee1cfd\") " pod="kube-system/global-pull-secret-syncer-46s9w" Apr 24 21:30:30.951362 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:30.951253 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5deb9c1a-4ef3-4e20-a3b3-5b3fb0ee1cfd-original-pull-secret\") pod \"global-pull-secret-syncer-46s9w\" (UID: \"5deb9c1a-4ef3-4e20-a3b3-5b3fb0ee1cfd\") " pod="kube-system/global-pull-secret-syncer-46s9w" Apr 24 21:30:30.951461 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:30.951413 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5deb9c1a-4ef3-4e20-a3b3-5b3fb0ee1cfd-dbus\") pod \"global-pull-secret-syncer-46s9w\" (UID: \"5deb9c1a-4ef3-4e20-a3b3-5b3fb0ee1cfd\") " pod="kube-system/global-pull-secret-syncer-46s9w" Apr 24 21:30:30.953560 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:30.953529 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5deb9c1a-4ef3-4e20-a3b3-5b3fb0ee1cfd-original-pull-secret\") pod \"global-pull-secret-syncer-46s9w\" (UID: \"5deb9c1a-4ef3-4e20-a3b3-5b3fb0ee1cfd\") " pod="kube-system/global-pull-secret-syncer-46s9w" Apr 24 21:30:30.988621 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:30.988593 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-46s9w" Apr 24 21:30:31.107781 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:31.107750 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-46s9w"] Apr 24 21:30:31.110294 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:30:31.110266 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5deb9c1a_4ef3_4e20_a3b3_5b3fb0ee1cfd.slice/crio-2202f2dff84de2e3cc78a1d676c37bf80c4a91f102fdf6c8f3f7001cb5de1d3b WatchSource:0}: Error finding container 2202f2dff84de2e3cc78a1d676c37bf80c4a91f102fdf6c8f3f7001cb5de1d3b: Status 404 returned error can't find the container with id 2202f2dff84de2e3cc78a1d676c37bf80c4a91f102fdf6c8f3f7001cb5de1d3b Apr 24 21:30:32.065448 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:32.065414 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-46s9w" event={"ID":"5deb9c1a-4ef3-4e20-a3b3-5b3fb0ee1cfd","Type":"ContainerStarted","Data":"2202f2dff84de2e3cc78a1d676c37bf80c4a91f102fdf6c8f3f7001cb5de1d3b"} Apr 24 21:30:35.074929 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:35.074892 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-46s9w" event={"ID":"5deb9c1a-4ef3-4e20-a3b3-5b3fb0ee1cfd","Type":"ContainerStarted","Data":"fa47db350a4f0259f49247b2d563651e66a46019b401a5abf171ddcdbb4f7ef7"} Apr 24 21:30:35.090965 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:35.090921 2566 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-46s9w" podStartSLOduration=1.693765828 podStartE2EDuration="5.090908494s" podCreationTimestamp="2026-04-24 21:30:30 +0000 UTC" firstStartedPulling="2026-04-24 21:30:31.111898369 +0000 UTC m=+186.297186701" lastFinishedPulling="2026-04-24 21:30:34.509041045 +0000 UTC m=+189.694329367" observedRunningTime="2026-04-24 21:30:35.090694308 +0000 UTC m=+190.275982647" watchObservedRunningTime="2026-04-24 21:30:35.090908494 +0000 UTC m=+190.276196834" Apr 24 21:30:49.344927 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:49.344897 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb7n58"] Apr 24 21:30:49.349144 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:49.349096 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb7n58" Apr 24 21:30:49.351444 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:49.351423 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-vl7jj\"" Apr 24 21:30:49.351669 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:49.351653 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 24 21:30:49.351804 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:49.351794 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 24 21:30:49.355879 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:49.355859 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb7n58"] Apr 24 21:30:49.394546 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:49.394522 2566 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5f5s\" (UniqueName: \"kubernetes.io/projected/93b87aa1-7c67-4ff1-b12b-99259528b6e5-kube-api-access-t5f5s\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb7n58\" (UID: \"93b87aa1-7c67-4ff1-b12b-99259528b6e5\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb7n58" Apr 24 21:30:49.394642 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:49.394561 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/93b87aa1-7c67-4ff1-b12b-99259528b6e5-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb7n58\" (UID: \"93b87aa1-7c67-4ff1-b12b-99259528b6e5\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb7n58" Apr 24 21:30:49.394642 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:49.394591 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/93b87aa1-7c67-4ff1-b12b-99259528b6e5-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb7n58\" (UID: \"93b87aa1-7c67-4ff1-b12b-99259528b6e5\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb7n58" Apr 24 21:30:49.495668 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:49.495643 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5f5s\" (UniqueName: \"kubernetes.io/projected/93b87aa1-7c67-4ff1-b12b-99259528b6e5-kube-api-access-t5f5s\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb7n58\" (UID: \"93b87aa1-7c67-4ff1-b12b-99259528b6e5\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb7n58" Apr 24 21:30:49.495766 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:49.495681 
2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/93b87aa1-7c67-4ff1-b12b-99259528b6e5-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb7n58\" (UID: \"93b87aa1-7c67-4ff1-b12b-99259528b6e5\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb7n58" Apr 24 21:30:49.495766 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:49.495709 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/93b87aa1-7c67-4ff1-b12b-99259528b6e5-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb7n58\" (UID: \"93b87aa1-7c67-4ff1-b12b-99259528b6e5\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb7n58" Apr 24 21:30:49.496077 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:49.496062 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/93b87aa1-7c67-4ff1-b12b-99259528b6e5-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb7n58\" (UID: \"93b87aa1-7c67-4ff1-b12b-99259528b6e5\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb7n58" Apr 24 21:30:49.496118 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:49.496064 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/93b87aa1-7c67-4ff1-b12b-99259528b6e5-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb7n58\" (UID: \"93b87aa1-7c67-4ff1-b12b-99259528b6e5\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb7n58" Apr 24 21:30:49.506630 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:49.506604 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5f5s\" (UniqueName: 
\"kubernetes.io/projected/93b87aa1-7c67-4ff1-b12b-99259528b6e5-kube-api-access-t5f5s\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb7n58\" (UID: \"93b87aa1-7c67-4ff1-b12b-99259528b6e5\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb7n58" Apr 24 21:30:49.659378 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:49.659294 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb7n58" Apr 24 21:30:49.774609 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:49.774580 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb7n58"] Apr 24 21:30:49.777699 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:30:49.777664 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93b87aa1_7c67_4ff1_b12b_99259528b6e5.slice/crio-608e0c656b28726f23ce03ab332bb1422bb209fe0902714ba660005c0a816ef1 WatchSource:0}: Error finding container 608e0c656b28726f23ce03ab332bb1422bb209fe0902714ba660005c0a816ef1: Status 404 returned error can't find the container with id 608e0c656b28726f23ce03ab332bb1422bb209fe0902714ba660005c0a816ef1 Apr 24 21:30:50.117410 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:50.117377 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb7n58" event={"ID":"93b87aa1-7c67-4ff1-b12b-99259528b6e5","Type":"ContainerStarted","Data":"608e0c656b28726f23ce03ab332bb1422bb209fe0902714ba660005c0a816ef1"} Apr 24 21:30:55.132332 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:55.132295 2566 generic.go:358] "Generic (PLEG): container finished" podID="93b87aa1-7c67-4ff1-b12b-99259528b6e5" containerID="44dc7c5c55897d31f009ad8694200d88743e01cf13ae39909c5c40cbe270d1a6" exitCode=0 Apr 24 
21:30:55.132712 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:55.132345 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb7n58" event={"ID":"93b87aa1-7c67-4ff1-b12b-99259528b6e5","Type":"ContainerDied","Data":"44dc7c5c55897d31f009ad8694200d88743e01cf13ae39909c5c40cbe270d1a6"} Apr 24 21:30:58.144079 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:58.144048 2566 generic.go:358] "Generic (PLEG): container finished" podID="93b87aa1-7c67-4ff1-b12b-99259528b6e5" containerID="7688332c48cfda07161a601edc5141b463ff61654106fe385ae17425f19d4e59" exitCode=0 Apr 24 21:30:58.144432 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:30:58.144116 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb7n58" event={"ID":"93b87aa1-7c67-4ff1-b12b-99259528b6e5","Type":"ContainerDied","Data":"7688332c48cfda07161a601edc5141b463ff61654106fe385ae17425f19d4e59"} Apr 24 21:31:04.164132 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:04.164035 2566 generic.go:358] "Generic (PLEG): container finished" podID="93b87aa1-7c67-4ff1-b12b-99259528b6e5" containerID="d1d02c2a66f1d0a3027576cc513c63a308595d119dfe7886ae27b6757ce27f51" exitCode=0 Apr 24 21:31:04.164132 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:04.164090 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb7n58" event={"ID":"93b87aa1-7c67-4ff1-b12b-99259528b6e5","Type":"ContainerDied","Data":"d1d02c2a66f1d0a3027576cc513c63a308595d119dfe7886ae27b6757ce27f51"} Apr 24 21:31:05.281945 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:05.281924 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb7n58" Apr 24 21:31:05.326015 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:05.325994 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5f5s\" (UniqueName: \"kubernetes.io/projected/93b87aa1-7c67-4ff1-b12b-99259528b6e5-kube-api-access-t5f5s\") pod \"93b87aa1-7c67-4ff1-b12b-99259528b6e5\" (UID: \"93b87aa1-7c67-4ff1-b12b-99259528b6e5\") " Apr 24 21:31:05.326160 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:05.326061 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/93b87aa1-7c67-4ff1-b12b-99259528b6e5-bundle\") pod \"93b87aa1-7c67-4ff1-b12b-99259528b6e5\" (UID: \"93b87aa1-7c67-4ff1-b12b-99259528b6e5\") " Apr 24 21:31:05.326160 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:05.326082 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/93b87aa1-7c67-4ff1-b12b-99259528b6e5-util\") pod \"93b87aa1-7c67-4ff1-b12b-99259528b6e5\" (UID: \"93b87aa1-7c67-4ff1-b12b-99259528b6e5\") " Apr 24 21:31:05.326596 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:05.326569 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93b87aa1-7c67-4ff1-b12b-99259528b6e5-bundle" (OuterVolumeSpecName: "bundle") pod "93b87aa1-7c67-4ff1-b12b-99259528b6e5" (UID: "93b87aa1-7c67-4ff1-b12b-99259528b6e5"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:31:05.328050 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:05.328020 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93b87aa1-7c67-4ff1-b12b-99259528b6e5-kube-api-access-t5f5s" (OuterVolumeSpecName: "kube-api-access-t5f5s") pod "93b87aa1-7c67-4ff1-b12b-99259528b6e5" (UID: "93b87aa1-7c67-4ff1-b12b-99259528b6e5"). InnerVolumeSpecName "kube-api-access-t5f5s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:31:05.330358 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:05.330337 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93b87aa1-7c67-4ff1-b12b-99259528b6e5-util" (OuterVolumeSpecName: "util") pod "93b87aa1-7c67-4ff1-b12b-99259528b6e5" (UID: "93b87aa1-7c67-4ff1-b12b-99259528b6e5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:31:05.427310 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:05.427262 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t5f5s\" (UniqueName: \"kubernetes.io/projected/93b87aa1-7c67-4ff1-b12b-99259528b6e5-kube-api-access-t5f5s\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\"" Apr 24 21:31:05.427310 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:05.427287 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/93b87aa1-7c67-4ff1-b12b-99259528b6e5-bundle\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\"" Apr 24 21:31:05.427310 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:05.427301 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/93b87aa1-7c67-4ff1-b12b-99259528b6e5-util\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\"" Apr 24 21:31:06.170874 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:06.170836 2566 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb7n58" event={"ID":"93b87aa1-7c67-4ff1-b12b-99259528b6e5","Type":"ContainerDied","Data":"608e0c656b28726f23ce03ab332bb1422bb209fe0902714ba660005c0a816ef1"} Apr 24 21:31:06.170874 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:06.170872 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="608e0c656b28726f23ce03ab332bb1422bb209fe0902714ba660005c0a816ef1" Apr 24 21:31:06.171082 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:06.170899 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cb7n58" Apr 24 21:31:11.299165 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:11.299136 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-d8rjs"] Apr 24 21:31:11.299588 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:11.299433 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="93b87aa1-7c67-4ff1-b12b-99259528b6e5" containerName="extract" Apr 24 21:31:11.299588 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:11.299449 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="93b87aa1-7c67-4ff1-b12b-99259528b6e5" containerName="extract" Apr 24 21:31:11.299588 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:11.299462 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="93b87aa1-7c67-4ff1-b12b-99259528b6e5" containerName="pull" Apr 24 21:31:11.299588 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:11.299468 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="93b87aa1-7c67-4ff1-b12b-99259528b6e5" containerName="pull" Apr 24 21:31:11.299588 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:11.299476 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="93b87aa1-7c67-4ff1-b12b-99259528b6e5" containerName="util" Apr 24 21:31:11.299588 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:11.299482 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="93b87aa1-7c67-4ff1-b12b-99259528b6e5" containerName="util" Apr 24 21:31:11.299588 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:11.299533 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="93b87aa1-7c67-4ff1-b12b-99259528b6e5" containerName="extract" Apr 24 21:31:11.317815 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:11.317789 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-d8rjs"] Apr 24 21:31:11.318006 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:11.317918 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-d8rjs" Apr 24 21:31:11.320406 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:11.320378 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-sv7tr\"" Apr 24 21:31:11.320527 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:11.320439 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 24 21:31:11.321468 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:11.321447 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 24 21:31:11.321468 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:11.321460 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 24 21:31:11.372856 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:11.372822 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hctd9\" (UniqueName: 
\"kubernetes.io/projected/8ec20821-de71-4908-94c2-1482d13ca044-kube-api-access-hctd9\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-d8rjs\" (UID: \"8ec20821-de71-4908-94c2-1482d13ca044\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-d8rjs"
Apr 24 21:31:11.373034 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:11.372869 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/8ec20821-de71-4908-94c2-1482d13ca044-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-d8rjs\" (UID: \"8ec20821-de71-4908-94c2-1482d13ca044\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-d8rjs"
Apr 24 21:31:11.473598 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:11.473553 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hctd9\" (UniqueName: \"kubernetes.io/projected/8ec20821-de71-4908-94c2-1482d13ca044-kube-api-access-hctd9\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-d8rjs\" (UID: \"8ec20821-de71-4908-94c2-1482d13ca044\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-d8rjs"
Apr 24 21:31:11.473794 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:11.473645 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/8ec20821-de71-4908-94c2-1482d13ca044-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-d8rjs\" (UID: \"8ec20821-de71-4908-94c2-1482d13ca044\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-d8rjs"
Apr 24 21:31:11.475936 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:11.475915 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/8ec20821-de71-4908-94c2-1482d13ca044-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-d8rjs\" (UID: \"8ec20821-de71-4908-94c2-1482d13ca044\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-d8rjs"
Apr 24 21:31:11.482693 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:11.482668 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hctd9\" (UniqueName: \"kubernetes.io/projected/8ec20821-de71-4908-94c2-1482d13ca044-kube-api-access-hctd9\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-d8rjs\" (UID: \"8ec20821-de71-4908-94c2-1482d13ca044\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-d8rjs"
Apr 24 21:31:11.628514 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:11.628423 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-d8rjs"
Apr 24 21:31:11.762749 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:11.762718 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-d8rjs"]
Apr 24 21:31:11.766664 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:31:11.766638 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ec20821_de71_4908_94c2_1482d13ca044.slice/crio-64c8665f6434afac2b74113c17aea79775c52973e8187e3a635b1c80c3ecec6b WatchSource:0}: Error finding container 64c8665f6434afac2b74113c17aea79775c52973e8187e3a635b1c80c3ecec6b: Status 404 returned error can't find the container with id 64c8665f6434afac2b74113c17aea79775c52973e8187e3a635b1c80c3ecec6b
Apr 24 21:31:12.188492 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:12.188457 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-d8rjs" event={"ID":"8ec20821-de71-4908-94c2-1482d13ca044","Type":"ContainerStarted","Data":"64c8665f6434afac2b74113c17aea79775c52973e8187e3a635b1c80c3ecec6b"}
Apr 24 21:31:16.031112 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:16.031075 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-dmptg"]
Apr 24 21:31:16.053586 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:16.053549 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-dmptg"]
Apr 24 21:31:16.053762 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:16.053691 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-dmptg"
Apr 24 21:31:16.055501 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:16.055476 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 24 21:31:16.055501 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:16.055479 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 24 21:31:16.055666 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:16.055526 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-j6mq8\""
Apr 24 21:31:16.113401 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:16.113362 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnnfx\" (UniqueName: \"kubernetes.io/projected/a3fc9e80-e65f-425c-aae5-b6bbc3f9010b-kube-api-access-tnnfx\") pod \"keda-operator-ffbb595cb-dmptg\" (UID: \"a3fc9e80-e65f-425c-aae5-b6bbc3f9010b\") " pod="openshift-keda/keda-operator-ffbb595cb-dmptg"
Apr 24 21:31:16.113602 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:16.113472 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a3fc9e80-e65f-425c-aae5-b6bbc3f9010b-certificates\") pod \"keda-operator-ffbb595cb-dmptg\" (UID: \"a3fc9e80-e65f-425c-aae5-b6bbc3f9010b\") " pod="openshift-keda/keda-operator-ffbb595cb-dmptg"
Apr 24 21:31:16.113602 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:16.113538 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/a3fc9e80-e65f-425c-aae5-b6bbc3f9010b-cabundle0\") pod \"keda-operator-ffbb595cb-dmptg\" (UID: \"a3fc9e80-e65f-425c-aae5-b6bbc3f9010b\") " pod="openshift-keda/keda-operator-ffbb595cb-dmptg"
Apr 24 21:31:16.200506 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:16.200469 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-d8rjs" event={"ID":"8ec20821-de71-4908-94c2-1482d13ca044","Type":"ContainerStarted","Data":"6d9cfc098af29291c86b8ad838474d8b4ff3a67d18785f33420fe61504686939"}
Apr 24 21:31:16.200691 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:16.200613 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-d8rjs"
Apr 24 21:31:16.214919 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:16.214889 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tnnfx\" (UniqueName: \"kubernetes.io/projected/a3fc9e80-e65f-425c-aae5-b6bbc3f9010b-kube-api-access-tnnfx\") pod \"keda-operator-ffbb595cb-dmptg\" (UID: \"a3fc9e80-e65f-425c-aae5-b6bbc3f9010b\") " pod="openshift-keda/keda-operator-ffbb595cb-dmptg"
Apr 24 21:31:16.215054 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:16.214950 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a3fc9e80-e65f-425c-aae5-b6bbc3f9010b-certificates\") pod \"keda-operator-ffbb595cb-dmptg\" (UID: \"a3fc9e80-e65f-425c-aae5-b6bbc3f9010b\") " pod="openshift-keda/keda-operator-ffbb595cb-dmptg"
Apr 24 21:31:16.215054 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:16.214991 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/a3fc9e80-e65f-425c-aae5-b6bbc3f9010b-cabundle0\") pod \"keda-operator-ffbb595cb-dmptg\" (UID: \"a3fc9e80-e65f-425c-aae5-b6bbc3f9010b\") " pod="openshift-keda/keda-operator-ffbb595cb-dmptg"
Apr 24 21:31:16.215054 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:31:16.215048 2566 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found
Apr 24 21:31:16.215210 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:31:16.215067 2566 secret.go:281] references non-existent secret key: ca.crt
Apr 24 21:31:16.215210 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:31:16.215074 2566 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 24 21:31:16.215210 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:31:16.215086 2566 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-dmptg: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 24 21:31:16.215210 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:31:16.215138 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3fc9e80-e65f-425c-aae5-b6bbc3f9010b-certificates podName:a3fc9e80-e65f-425c-aae5-b6bbc3f9010b nodeName:}" failed. No retries permitted until 2026-04-24 21:31:16.71512227 +0000 UTC m=+231.900410590 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/a3fc9e80-e65f-425c-aae5-b6bbc3f9010b-certificates") pod "keda-operator-ffbb595cb-dmptg" (UID: "a3fc9e80-e65f-425c-aae5-b6bbc3f9010b") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 24 21:31:16.215599 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:16.215580 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/a3fc9e80-e65f-425c-aae5-b6bbc3f9010b-cabundle0\") pod \"keda-operator-ffbb595cb-dmptg\" (UID: \"a3fc9e80-e65f-425c-aae5-b6bbc3f9010b\") " pod="openshift-keda/keda-operator-ffbb595cb-dmptg"
Apr 24 21:31:16.222148 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:16.222101 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-d8rjs" podStartSLOduration=1.647708235 podStartE2EDuration="5.222088546s" podCreationTimestamp="2026-04-24 21:31:11 +0000 UTC" firstStartedPulling="2026-04-24 21:31:11.768372087 +0000 UTC m=+226.953660403" lastFinishedPulling="2026-04-24 21:31:15.342752383 +0000 UTC m=+230.528040714" observedRunningTime="2026-04-24 21:31:16.220884321 +0000 UTC m=+231.406172660" watchObservedRunningTime="2026-04-24 21:31:16.222088546 +0000 UTC m=+231.407376882"
Apr 24 21:31:16.226196 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:16.226171 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnnfx\" (UniqueName: \"kubernetes.io/projected/a3fc9e80-e65f-425c-aae5-b6bbc3f9010b-kube-api-access-tnnfx\") pod \"keda-operator-ffbb595cb-dmptg\" (UID: \"a3fc9e80-e65f-425c-aae5-b6bbc3f9010b\") " pod="openshift-keda/keda-operator-ffbb595cb-dmptg"
Apr 24 21:31:16.401295 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:16.401204 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-b4pkx"]
Apr 24 21:31:16.422145 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:16.422116 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-b4pkx"]
Apr 24 21:31:16.422324 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:16.422233 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-b4pkx"
Apr 24 21:31:16.424277 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:16.424252 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 24 21:31:16.517570 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:16.517530 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-496dh\" (UniqueName: \"kubernetes.io/projected/5dfe510d-cf85-4044-a0be-d31e343e5953-kube-api-access-496dh\") pod \"keda-metrics-apiserver-7c9f485588-b4pkx\" (UID: \"5dfe510d-cf85-4044-a0be-d31e343e5953\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-b4pkx"
Apr 24 21:31:16.517745 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:16.517602 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/5dfe510d-cf85-4044-a0be-d31e343e5953-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-b4pkx\" (UID: \"5dfe510d-cf85-4044-a0be-d31e343e5953\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-b4pkx"
Apr 24 21:31:16.517745 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:16.517632 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5dfe510d-cf85-4044-a0be-d31e343e5953-certificates\") pod \"keda-metrics-apiserver-7c9f485588-b4pkx\" (UID: \"5dfe510d-cf85-4044-a0be-d31e343e5953\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-b4pkx"
Apr 24 21:31:16.618208 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:16.618174 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/5dfe510d-cf85-4044-a0be-d31e343e5953-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-b4pkx\" (UID: \"5dfe510d-cf85-4044-a0be-d31e343e5953\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-b4pkx"
Apr 24 21:31:16.618208 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:16.618219 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5dfe510d-cf85-4044-a0be-d31e343e5953-certificates\") pod \"keda-metrics-apiserver-7c9f485588-b4pkx\" (UID: \"5dfe510d-cf85-4044-a0be-d31e343e5953\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-b4pkx"
Apr 24 21:31:16.618461 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:16.618263 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-496dh\" (UniqueName: \"kubernetes.io/projected/5dfe510d-cf85-4044-a0be-d31e343e5953-kube-api-access-496dh\") pod \"keda-metrics-apiserver-7c9f485588-b4pkx\" (UID: \"5dfe510d-cf85-4044-a0be-d31e343e5953\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-b4pkx"
Apr 24 21:31:16.618461 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:31:16.618380 2566 secret.go:281] references non-existent secret key: tls.crt
Apr 24 21:31:16.618461 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:31:16.618400 2566 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 24 21:31:16.618461 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:31:16.618418 2566 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-b4pkx: references non-existent secret key: tls.crt
Apr 24 21:31:16.618661 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:31:16.618477 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5dfe510d-cf85-4044-a0be-d31e343e5953-certificates podName:5dfe510d-cf85-4044-a0be-d31e343e5953 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:17.118458663 +0000 UTC m=+232.303746982 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/5dfe510d-cf85-4044-a0be-d31e343e5953-certificates") pod "keda-metrics-apiserver-7c9f485588-b4pkx" (UID: "5dfe510d-cf85-4044-a0be-d31e343e5953") : references non-existent secret key: tls.crt
Apr 24 21:31:16.618661 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:16.618624 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/5dfe510d-cf85-4044-a0be-d31e343e5953-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-b4pkx\" (UID: \"5dfe510d-cf85-4044-a0be-d31e343e5953\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-b4pkx"
Apr 24 21:31:16.630136 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:16.630112 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-496dh\" (UniqueName: \"kubernetes.io/projected/5dfe510d-cf85-4044-a0be-d31e343e5953-kube-api-access-496dh\") pod \"keda-metrics-apiserver-7c9f485588-b4pkx\" (UID: \"5dfe510d-cf85-4044-a0be-d31e343e5953\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-b4pkx"
Apr 24 21:31:16.718901 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:16.718864 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a3fc9e80-e65f-425c-aae5-b6bbc3f9010b-certificates\") pod \"keda-operator-ffbb595cb-dmptg\" (UID: \"a3fc9e80-e65f-425c-aae5-b6bbc3f9010b\") " pod="openshift-keda/keda-operator-ffbb595cb-dmptg"
Apr 24 21:31:16.719107 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:31:16.719010 2566 secret.go:281] references non-existent secret key: ca.crt
Apr 24 21:31:16.719107 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:31:16.719026 2566 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 24 21:31:16.719107 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:31:16.719035 2566 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-dmptg: references non-existent secret key: ca.crt
Apr 24 21:31:16.719107 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:31:16.719091 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3fc9e80-e65f-425c-aae5-b6bbc3f9010b-certificates podName:a3fc9e80-e65f-425c-aae5-b6bbc3f9010b nodeName:}" failed. No retries permitted until 2026-04-24 21:31:17.719078081 +0000 UTC m=+232.904366401 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/a3fc9e80-e65f-425c-aae5-b6bbc3f9010b-certificates") pod "keda-operator-ffbb595cb-dmptg" (UID: "a3fc9e80-e65f-425c-aae5-b6bbc3f9010b") : references non-existent secret key: ca.crt
Apr 24 21:31:16.786314 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:16.786273 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-vdbpw"]
Apr 24 21:31:16.810987 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:16.810931 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-vdbpw"]
Apr 24 21:31:16.811169 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:16.811095 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-vdbpw"
Apr 24 21:31:16.813092 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:16.813069 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 24 21:31:16.920313 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:16.920258 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6fcd0938-4651-4731-b26c-18ad10b88766-certificates\") pod \"keda-admission-cf49989db-vdbpw\" (UID: \"6fcd0938-4651-4731-b26c-18ad10b88766\") " pod="openshift-keda/keda-admission-cf49989db-vdbpw"
Apr 24 21:31:16.920487 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:16.920329 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mtp4\" (UniqueName: \"kubernetes.io/projected/6fcd0938-4651-4731-b26c-18ad10b88766-kube-api-access-7mtp4\") pod \"keda-admission-cf49989db-vdbpw\" (UID: \"6fcd0938-4651-4731-b26c-18ad10b88766\") " pod="openshift-keda/keda-admission-cf49989db-vdbpw"
Apr 24 21:31:17.020843 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:17.020751 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6fcd0938-4651-4731-b26c-18ad10b88766-certificates\") pod \"keda-admission-cf49989db-vdbpw\" (UID: \"6fcd0938-4651-4731-b26c-18ad10b88766\") " pod="openshift-keda/keda-admission-cf49989db-vdbpw"
Apr 24 21:31:17.020843 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:17.020807 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7mtp4\" (UniqueName: \"kubernetes.io/projected/6fcd0938-4651-4731-b26c-18ad10b88766-kube-api-access-7mtp4\") pod \"keda-admission-cf49989db-vdbpw\" (UID: \"6fcd0938-4651-4731-b26c-18ad10b88766\") " pod="openshift-keda/keda-admission-cf49989db-vdbpw"
Apr 24 21:31:17.023231 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:17.023197 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6fcd0938-4651-4731-b26c-18ad10b88766-certificates\") pod \"keda-admission-cf49989db-vdbpw\" (UID: \"6fcd0938-4651-4731-b26c-18ad10b88766\") " pod="openshift-keda/keda-admission-cf49989db-vdbpw"
Apr 24 21:31:17.031725 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:17.031699 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mtp4\" (UniqueName: \"kubernetes.io/projected/6fcd0938-4651-4731-b26c-18ad10b88766-kube-api-access-7mtp4\") pod \"keda-admission-cf49989db-vdbpw\" (UID: \"6fcd0938-4651-4731-b26c-18ad10b88766\") " pod="openshift-keda/keda-admission-cf49989db-vdbpw"
Apr 24 21:31:17.121069 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:17.121026 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-vdbpw"
Apr 24 21:31:17.121265 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:17.121242 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5dfe510d-cf85-4044-a0be-d31e343e5953-certificates\") pod \"keda-metrics-apiserver-7c9f485588-b4pkx\" (UID: \"5dfe510d-cf85-4044-a0be-d31e343e5953\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-b4pkx"
Apr 24 21:31:17.123686 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:17.123645 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5dfe510d-cf85-4044-a0be-d31e343e5953-certificates\") pod \"keda-metrics-apiserver-7c9f485588-b4pkx\" (UID: \"5dfe510d-cf85-4044-a0be-d31e343e5953\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-b4pkx"
Apr 24 21:31:17.250884 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:17.250847 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-vdbpw"]
Apr 24 21:31:17.254666 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:31:17.254639 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fcd0938_4651_4731_b26c_18ad10b88766.slice/crio-73ee2c339b767e37656fe0ab5473ed9c3234de4619e6efeb129c50b698445b92 WatchSource:0}: Error finding container 73ee2c339b767e37656fe0ab5473ed9c3234de4619e6efeb129c50b698445b92: Status 404 returned error can't find the container with id 73ee2c339b767e37656fe0ab5473ed9c3234de4619e6efeb129c50b698445b92
Apr 24 21:31:17.333369 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:17.333285 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-b4pkx"
Apr 24 21:31:17.455166 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:17.455132 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-b4pkx"]
Apr 24 21:31:17.458529 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:31:17.458497 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dfe510d_cf85_4044_a0be_d31e343e5953.slice/crio-3546264bc83e1e345ac5ef5ad70628360ee758e476b073947c2054dc9d173cfb WatchSource:0}: Error finding container 3546264bc83e1e345ac5ef5ad70628360ee758e476b073947c2054dc9d173cfb: Status 404 returned error can't find the container with id 3546264bc83e1e345ac5ef5ad70628360ee758e476b073947c2054dc9d173cfb
Apr 24 21:31:17.727026 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:17.726988 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a3fc9e80-e65f-425c-aae5-b6bbc3f9010b-certificates\") pod \"keda-operator-ffbb595cb-dmptg\" (UID: \"a3fc9e80-e65f-425c-aae5-b6bbc3f9010b\") " pod="openshift-keda/keda-operator-ffbb595cb-dmptg"
Apr 24 21:31:17.729326 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:17.729310 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a3fc9e80-e65f-425c-aae5-b6bbc3f9010b-certificates\") pod \"keda-operator-ffbb595cb-dmptg\" (UID: \"a3fc9e80-e65f-425c-aae5-b6bbc3f9010b\") " pod="openshift-keda/keda-operator-ffbb595cb-dmptg"
Apr 24 21:31:17.863899 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:17.863855 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-dmptg"
Apr 24 21:31:18.014359 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:18.014280 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-dmptg"]
Apr 24 21:31:18.017891 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:31:18.017856 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3fc9e80_e65f_425c_aae5_b6bbc3f9010b.slice/crio-2bde9876fb55e105dde9c7f90e27254d9e24420049353e6d26edd5a67be9e99c WatchSource:0}: Error finding container 2bde9876fb55e105dde9c7f90e27254d9e24420049353e6d26edd5a67be9e99c: Status 404 returned error can't find the container with id 2bde9876fb55e105dde9c7f90e27254d9e24420049353e6d26edd5a67be9e99c
Apr 24 21:31:18.208698 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:18.208660 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-dmptg" event={"ID":"a3fc9e80-e65f-425c-aae5-b6bbc3f9010b","Type":"ContainerStarted","Data":"2bde9876fb55e105dde9c7f90e27254d9e24420049353e6d26edd5a67be9e99c"}
Apr 24 21:31:18.210584 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:18.210543 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-b4pkx" event={"ID":"5dfe510d-cf85-4044-a0be-d31e343e5953","Type":"ContainerStarted","Data":"3546264bc83e1e345ac5ef5ad70628360ee758e476b073947c2054dc9d173cfb"}
Apr 24 21:31:18.211697 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:18.211666 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-vdbpw" event={"ID":"6fcd0938-4651-4731-b26c-18ad10b88766","Type":"ContainerStarted","Data":"73ee2c339b767e37656fe0ab5473ed9c3234de4619e6efeb129c50b698445b92"}
Apr 24 21:31:19.217781 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:19.217747 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-vdbpw" event={"ID":"6fcd0938-4651-4731-b26c-18ad10b88766","Type":"ContainerStarted","Data":"8f84a1f8e9dd7410910ff33716d223451e60b8f837630339bc6536c512932e87"}
Apr 24 21:31:19.218262 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:19.217945 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-vdbpw"
Apr 24 21:31:19.242447 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:19.242382 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-vdbpw" podStartSLOduration=1.6664684699999999 podStartE2EDuration="3.242366882s" podCreationTimestamp="2026-04-24 21:31:16 +0000 UTC" firstStartedPulling="2026-04-24 21:31:17.255870772 +0000 UTC m=+232.441159089" lastFinishedPulling="2026-04-24 21:31:18.831769164 +0000 UTC m=+234.017057501" observedRunningTime="2026-04-24 21:31:19.239415119 +0000 UTC m=+234.424703461" watchObservedRunningTime="2026-04-24 21:31:19.242366882 +0000 UTC m=+234.427655221"
Apr 24 21:31:21.225690 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:21.225651 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-b4pkx" event={"ID":"5dfe510d-cf85-4044-a0be-d31e343e5953","Type":"ContainerStarted","Data":"f9441edf66d009544b4df39e132ed1d69ef739926e1b59a33430666f9847e31f"}
Apr 24 21:31:21.226221 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:21.225818 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-b4pkx"
Apr 24 21:31:21.243923 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:21.243859 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-b4pkx" podStartSLOduration=2.554371521 podStartE2EDuration="5.243842193s" podCreationTimestamp="2026-04-24 21:31:16 +0000 UTC" firstStartedPulling="2026-04-24 21:31:17.460569909 +0000 UTC m=+232.645858230" lastFinishedPulling="2026-04-24 21:31:20.150040573 +0000 UTC m=+235.335328902" observedRunningTime="2026-04-24 21:31:21.242700334 +0000 UTC m=+236.427988683" watchObservedRunningTime="2026-04-24 21:31:21.243842193 +0000 UTC m=+236.429130533"
Apr 24 21:31:22.230825 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:22.230789 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-dmptg" event={"ID":"a3fc9e80-e65f-425c-aae5-b6bbc3f9010b","Type":"ContainerStarted","Data":"ed1c5a93188e251c54f5c934e467df2cb9ee83c048411fac44bd66e282c47f5a"}
Apr 24 21:31:22.231228 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:22.230870 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-dmptg"
Apr 24 21:31:22.249229 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:22.249179 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-dmptg" podStartSLOduration=2.950218426 podStartE2EDuration="6.249164852s" podCreationTimestamp="2026-04-24 21:31:16 +0000 UTC" firstStartedPulling="2026-04-24 21:31:18.019379561 +0000 UTC m=+233.204667882" lastFinishedPulling="2026-04-24 21:31:21.318325987 +0000 UTC m=+236.503614308" observedRunningTime="2026-04-24 21:31:22.249141664 +0000 UTC m=+237.434429994" watchObservedRunningTime="2026-04-24 21:31:22.249164852 +0000 UTC m=+237.434453192"
Apr 24 21:31:32.235487 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:32.235458 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-b4pkx"
Apr 24 21:31:37.205826 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:37.205794 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-d8rjs"
Apr 24 21:31:40.223988 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:40.223935 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-vdbpw"
Apr 24 21:31:43.236610 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:31:43.236536 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-dmptg"
Apr 24 21:32:24.748990 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:24.748934 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-84b6647887-xjn8m"]
Apr 24 21:32:24.752245 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:24.752222 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84b6647887-xjn8m"
Apr 24 21:32:24.754045 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:24.754022 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\""
Apr 24 21:32:24.754171 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:24.754113 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-wtklc\""
Apr 24 21:32:24.754413 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:24.754396 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 24 21:32:24.754472 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:24.754418 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 24 21:32:24.769331 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:24.769310 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-l2j7h"]
Apr 24 21:32:24.772233 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:24.772219 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-l2j7h"
Apr 24 21:32:24.773854 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:24.773836 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 24 21:32:24.773940 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:24.773862 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-h4czl\""
Apr 24 21:32:24.777534 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:24.777514 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84b6647887-xjn8m"]
Apr 24 21:32:24.786604 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:24.786580 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-l2j7h"]
Apr 24 21:32:24.847476 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:24.847439 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72lt6\" (UniqueName: \"kubernetes.io/projected/59766aaf-be95-4685-a428-e1bcce43de8a-kube-api-access-72lt6\") pod \"kserve-controller-manager-84b6647887-xjn8m\" (UID: \"59766aaf-be95-4685-a428-e1bcce43de8a\") " pod="kserve/kserve-controller-manager-84b6647887-xjn8m"
Apr 24 21:32:24.847643 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:24.847505 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c0e0ec12-0c87-401b-b01c-c370097ce9cd-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-l2j7h\" (UID: \"c0e0ec12-0c87-401b-b01c-c370097ce9cd\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-l2j7h"
Apr 24 21:32:24.847643 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:24.847535 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59766aaf-be95-4685-a428-e1bcce43de8a-cert\") pod \"kserve-controller-manager-84b6647887-xjn8m\" (UID: \"59766aaf-be95-4685-a428-e1bcce43de8a\") " pod="kserve/kserve-controller-manager-84b6647887-xjn8m"
Apr 24 21:32:24.847733 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:24.847638 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxqdt\" (UniqueName: \"kubernetes.io/projected/c0e0ec12-0c87-401b-b01c-c370097ce9cd-kube-api-access-nxqdt\") pod \"llmisvc-controller-manager-68cc5db7c4-l2j7h\" (UID: \"c0e0ec12-0c87-401b-b01c-c370097ce9cd\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-l2j7h"
Apr 24 21:32:24.948528 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:24.948490 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nxqdt\" (UniqueName: \"kubernetes.io/projected/c0e0ec12-0c87-401b-b01c-c370097ce9cd-kube-api-access-nxqdt\") pod \"llmisvc-controller-manager-68cc5db7c4-l2j7h\" (UID: \"c0e0ec12-0c87-401b-b01c-c370097ce9cd\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-l2j7h"
Apr 24 21:32:24.948528 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:24.948534 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-72lt6\" (UniqueName: \"kubernetes.io/projected/59766aaf-be95-4685-a428-e1bcce43de8a-kube-api-access-72lt6\") pod \"kserve-controller-manager-84b6647887-xjn8m\" (UID: \"59766aaf-be95-4685-a428-e1bcce43de8a\") " pod="kserve/kserve-controller-manager-84b6647887-xjn8m"
Apr 24 21:32:24.948738 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:24.948576 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c0e0ec12-0c87-401b-b01c-c370097ce9cd-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-l2j7h\" (UID: \"c0e0ec12-0c87-401b-b01c-c370097ce9cd\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-l2j7h"
Apr 24 21:32:24.948738 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:24.948598 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59766aaf-be95-4685-a428-e1bcce43de8a-cert\") pod \"kserve-controller-manager-84b6647887-xjn8m\" (UID: \"59766aaf-be95-4685-a428-e1bcce43de8a\") " pod="kserve/kserve-controller-manager-84b6647887-xjn8m"
Apr 24 21:32:24.948738 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:32:24.948715 2566 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found
Apr 24 21:32:24.948831 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:32:24.948793 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0e0ec12-0c87-401b-b01c-c370097ce9cd-cert podName:c0e0ec12-0c87-401b-b01c-c370097ce9cd nodeName:}" failed. No retries permitted until 2026-04-24 21:32:25.44877205 +0000 UTC m=+300.634060367 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c0e0ec12-0c87-401b-b01c-c370097ce9cd-cert") pod "llmisvc-controller-manager-68cc5db7c4-l2j7h" (UID: "c0e0ec12-0c87-401b-b01c-c370097ce9cd") : secret "llmisvc-webhook-server-cert" not found
Apr 24 21:32:24.950944 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:24.950922 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59766aaf-be95-4685-a428-e1bcce43de8a-cert\") pod \"kserve-controller-manager-84b6647887-xjn8m\" (UID: \"59766aaf-be95-4685-a428-e1bcce43de8a\") " pod="kserve/kserve-controller-manager-84b6647887-xjn8m"
Apr 24 21:32:24.962030 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:24.962003 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-72lt6\" (UniqueName: \"kubernetes.io/projected/59766aaf-be95-4685-a428-e1bcce43de8a-kube-api-access-72lt6\") pod \"kserve-controller-manager-84b6647887-xjn8m\" (UID: \"59766aaf-be95-4685-a428-e1bcce43de8a\") " pod="kserve/kserve-controller-manager-84b6647887-xjn8m"
Apr 24 21:32:24.962169 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:24.962111 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxqdt\" (UniqueName: \"kubernetes.io/projected/c0e0ec12-0c87-401b-b01c-c370097ce9cd-kube-api-access-nxqdt\") pod \"llmisvc-controller-manager-68cc5db7c4-l2j7h\" (UID: \"c0e0ec12-0c87-401b-b01c-c370097ce9cd\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-l2j7h"
Apr 24 21:32:25.062911 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:25.062804 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84b6647887-xjn8m"
Apr 24 21:32:25.197473 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:25.197450 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84b6647887-xjn8m"]
Apr 24 21:32:25.199837 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:32:25.199811 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59766aaf_be95_4685_a428_e1bcce43de8a.slice/crio-178c0d48b9115d1c9b1b85dcaf0382e91ba8e6c7d4e126d8eecfbbe679967818 WatchSource:0}: Error finding container 178c0d48b9115d1c9b1b85dcaf0382e91ba8e6c7d4e126d8eecfbbe679967818: Status 404 returned error can't find the container with id 178c0d48b9115d1c9b1b85dcaf0382e91ba8e6c7d4e126d8eecfbbe679967818
Apr 24 21:32:25.267888 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:25.267855 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kvjv_e5028c4f-ef6b-4051-a2c3-1def0a14889f/ovn-acl-logging/0.log"
Apr 24 21:32:25.268080 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:25.268053 2566 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kvjv_e5028c4f-ef6b-4051-a2c3-1def0a14889f/ovn-acl-logging/0.log" Apr 24 21:32:25.429748 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:25.429714 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84b6647887-xjn8m" event={"ID":"59766aaf-be95-4685-a428-e1bcce43de8a","Type":"ContainerStarted","Data":"178c0d48b9115d1c9b1b85dcaf0382e91ba8e6c7d4e126d8eecfbbe679967818"} Apr 24 21:32:25.453188 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:25.453155 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c0e0ec12-0c87-401b-b01c-c370097ce9cd-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-l2j7h\" (UID: \"c0e0ec12-0c87-401b-b01c-c370097ce9cd\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-l2j7h" Apr 24 21:32:25.455731 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:25.455710 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c0e0ec12-0c87-401b-b01c-c370097ce9cd-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-l2j7h\" (UID: \"c0e0ec12-0c87-401b-b01c-c370097ce9cd\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-l2j7h" Apr 24 21:32:25.683867 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:25.683790 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-h4czl\"" Apr 24 21:32:25.692458 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:25.692423 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-l2j7h" Apr 24 21:32:25.842475 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:25.842440 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-l2j7h"] Apr 24 21:32:25.845506 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:32:25.845473 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc0e0ec12_0c87_401b_b01c_c370097ce9cd.slice/crio-58599d468bd5ea3f8d6eb602ffc091e2f01ee2fdb6c52bedafca6af1405ebce2 WatchSource:0}: Error finding container 58599d468bd5ea3f8d6eb602ffc091e2f01ee2fdb6c52bedafca6af1405ebce2: Status 404 returned error can't find the container with id 58599d468bd5ea3f8d6eb602ffc091e2f01ee2fdb6c52bedafca6af1405ebce2 Apr 24 21:32:26.437695 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:26.437656 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-l2j7h" event={"ID":"c0e0ec12-0c87-401b-b01c-c370097ce9cd","Type":"ContainerStarted","Data":"58599d468bd5ea3f8d6eb602ffc091e2f01ee2fdb6c52bedafca6af1405ebce2"} Apr 24 21:32:29.448761 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:29.448722 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84b6647887-xjn8m" event={"ID":"59766aaf-be95-4685-a428-e1bcce43de8a","Type":"ContainerStarted","Data":"a2ed37e5f6c5b0ba72a16f5b13ba86c117d262412621c8107d6fe7bf8457e7dd"} Apr 24 21:32:29.449234 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:29.448846 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-84b6647887-xjn8m" Apr 24 21:32:29.450197 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:29.450175 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-l2j7h" 
event={"ID":"c0e0ec12-0c87-401b-b01c-c370097ce9cd","Type":"ContainerStarted","Data":"63c6e84acf411a14f60a7835b1c8a18985ac4c34f34f62ed099f7737a04ce258"} Apr 24 21:32:29.450307 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:29.450298 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-l2j7h" Apr 24 21:32:29.483014 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:29.482969 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-84b6647887-xjn8m" podStartSLOduration=2.475791578 podStartE2EDuration="5.482940061s" podCreationTimestamp="2026-04-24 21:32:24 +0000 UTC" firstStartedPulling="2026-04-24 21:32:25.201031054 +0000 UTC m=+300.386319371" lastFinishedPulling="2026-04-24 21:32:28.208179523 +0000 UTC m=+303.393467854" observedRunningTime="2026-04-24 21:32:29.481078206 +0000 UTC m=+304.666366544" watchObservedRunningTime="2026-04-24 21:32:29.482940061 +0000 UTC m=+304.668228400" Apr 24 21:32:29.498895 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:32:29.498839 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-l2j7h" podStartSLOduration=2.45553699 podStartE2EDuration="5.498826342s" podCreationTimestamp="2026-04-24 21:32:24 +0000 UTC" firstStartedPulling="2026-04-24 21:32:25.847142394 +0000 UTC m=+301.032430734" lastFinishedPulling="2026-04-24 21:32:28.890431769 +0000 UTC m=+304.075720086" observedRunningTime="2026-04-24 21:32:29.498083517 +0000 UTC m=+304.683371855" watchObservedRunningTime="2026-04-24 21:32:29.498826342 +0000 UTC m=+304.684114681" Apr 24 21:33:00.455877 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:00.455848 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-l2j7h" Apr 24 21:33:00.458831 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:00.458812 2566 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-84b6647887-xjn8m" Apr 24 21:33:01.724529 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:01.724489 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-84b6647887-xjn8m"] Apr 24 21:33:01.725003 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:01.724769 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-84b6647887-xjn8m" podUID="59766aaf-be95-4685-a428-e1bcce43de8a" containerName="manager" containerID="cri-o://a2ed37e5f6c5b0ba72a16f5b13ba86c117d262412621c8107d6fe7bf8457e7dd" gracePeriod=10 Apr 24 21:33:01.974847 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:01.974792 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84b6647887-xjn8m" Apr 24 21:33:02.044932 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:02.044905 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72lt6\" (UniqueName: \"kubernetes.io/projected/59766aaf-be95-4685-a428-e1bcce43de8a-kube-api-access-72lt6\") pod \"59766aaf-be95-4685-a428-e1bcce43de8a\" (UID: \"59766aaf-be95-4685-a428-e1bcce43de8a\") " Apr 24 21:33:02.045106 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:02.044986 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59766aaf-be95-4685-a428-e1bcce43de8a-cert\") pod \"59766aaf-be95-4685-a428-e1bcce43de8a\" (UID: \"59766aaf-be95-4685-a428-e1bcce43de8a\") " Apr 24 21:33:02.047131 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:02.047092 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59766aaf-be95-4685-a428-e1bcce43de8a-cert" (OuterVolumeSpecName: "cert") pod "59766aaf-be95-4685-a428-e1bcce43de8a" (UID: "59766aaf-be95-4685-a428-e1bcce43de8a"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:33:02.047131 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:02.047098 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59766aaf-be95-4685-a428-e1bcce43de8a-kube-api-access-72lt6" (OuterVolumeSpecName: "kube-api-access-72lt6") pod "59766aaf-be95-4685-a428-e1bcce43de8a" (UID: "59766aaf-be95-4685-a428-e1bcce43de8a"). InnerVolumeSpecName "kube-api-access-72lt6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:33:02.146206 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:02.146174 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-72lt6\" (UniqueName: \"kubernetes.io/projected/59766aaf-be95-4685-a428-e1bcce43de8a-kube-api-access-72lt6\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\"" Apr 24 21:33:02.146206 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:02.146199 2566 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59766aaf-be95-4685-a428-e1bcce43de8a-cert\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\"" Apr 24 21:33:02.550004 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:02.549948 2566 generic.go:358] "Generic (PLEG): container finished" podID="59766aaf-be95-4685-a428-e1bcce43de8a" containerID="a2ed37e5f6c5b0ba72a16f5b13ba86c117d262412621c8107d6fe7bf8457e7dd" exitCode=0 Apr 24 21:33:02.550174 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:02.550024 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-84b6647887-xjn8m" Apr 24 21:33:02.550174 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:02.550056 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84b6647887-xjn8m" event={"ID":"59766aaf-be95-4685-a428-e1bcce43de8a","Type":"ContainerDied","Data":"a2ed37e5f6c5b0ba72a16f5b13ba86c117d262412621c8107d6fe7bf8457e7dd"} Apr 24 21:33:02.550174 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:02.550088 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84b6647887-xjn8m" event={"ID":"59766aaf-be95-4685-a428-e1bcce43de8a","Type":"ContainerDied","Data":"178c0d48b9115d1c9b1b85dcaf0382e91ba8e6c7d4e126d8eecfbbe679967818"} Apr 24 21:33:02.550174 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:02.550103 2566 scope.go:117] "RemoveContainer" containerID="a2ed37e5f6c5b0ba72a16f5b13ba86c117d262412621c8107d6fe7bf8457e7dd" Apr 24 21:33:02.558530 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:02.558511 2566 scope.go:117] "RemoveContainer" containerID="a2ed37e5f6c5b0ba72a16f5b13ba86c117d262412621c8107d6fe7bf8457e7dd" Apr 24 21:33:02.558767 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:33:02.558751 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2ed37e5f6c5b0ba72a16f5b13ba86c117d262412621c8107d6fe7bf8457e7dd\": container with ID starting with a2ed37e5f6c5b0ba72a16f5b13ba86c117d262412621c8107d6fe7bf8457e7dd not found: ID does not exist" containerID="a2ed37e5f6c5b0ba72a16f5b13ba86c117d262412621c8107d6fe7bf8457e7dd" Apr 24 21:33:02.558811 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:02.558774 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2ed37e5f6c5b0ba72a16f5b13ba86c117d262412621c8107d6fe7bf8457e7dd"} err="failed to get container status \"a2ed37e5f6c5b0ba72a16f5b13ba86c117d262412621c8107d6fe7bf8457e7dd\": rpc 
error: code = NotFound desc = could not find container \"a2ed37e5f6c5b0ba72a16f5b13ba86c117d262412621c8107d6fe7bf8457e7dd\": container with ID starting with a2ed37e5f6c5b0ba72a16f5b13ba86c117d262412621c8107d6fe7bf8457e7dd not found: ID does not exist" Apr 24 21:33:02.571402 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:02.571378 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-84b6647887-xjn8m"] Apr 24 21:33:02.577586 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:02.577565 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-84b6647887-xjn8m"] Apr 24 21:33:03.380481 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:03.380445 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59766aaf-be95-4685-a428-e1bcce43de8a" path="/var/lib/kubelet/pods/59766aaf-be95-4685-a428-e1bcce43de8a/volumes" Apr 24 21:33:36.793477 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:36.793439 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-77928"] Apr 24 21:33:36.794017 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:36.793776 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59766aaf-be95-4685-a428-e1bcce43de8a" containerName="manager" Apr 24 21:33:36.794017 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:36.793788 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="59766aaf-be95-4685-a428-e1bcce43de8a" containerName="manager" Apr 24 21:33:36.794017 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:36.793834 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="59766aaf-be95-4685-a428-e1bcce43de8a" containerName="manager" Apr 24 21:33:36.797967 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:36.797938 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-77928" Apr 24 21:33:36.800719 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:36.800696 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 24 21:33:36.801257 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:36.801235 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-xpcvr\"" Apr 24 21:33:36.802365 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:36.802344 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-849fm"] Apr 24 21:33:36.806314 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:36.806020 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-77928"] Apr 24 21:33:36.806314 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:36.806138 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-849fm" Apr 24 21:33:36.807684 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:36.807663 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 24 21:33:36.808185 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:36.808158 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-dmq9g\"" Apr 24 21:33:36.815559 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:36.815534 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-849fm"] Apr 24 21:33:36.923326 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:36.923286 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj98c\" (UniqueName: \"kubernetes.io/projected/7a306542-2ddf-404d-b60d-11970d6e6b42-kube-api-access-bj98c\") pod 
\"odh-model-controller-696fc77849-849fm\" (UID: \"7a306542-2ddf-404d-b60d-11970d6e6b42\") " pod="kserve/odh-model-controller-696fc77849-849fm" Apr 24 21:33:36.923502 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:36.923344 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1df53da7-4cdf-4e85-8366-118c8499eab3-tls-certs\") pod \"model-serving-api-86f7b4b499-77928\" (UID: \"1df53da7-4cdf-4e85-8366-118c8499eab3\") " pod="kserve/model-serving-api-86f7b4b499-77928" Apr 24 21:33:36.923502 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:36.923380 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktb86\" (UniqueName: \"kubernetes.io/projected/1df53da7-4cdf-4e85-8366-118c8499eab3-kube-api-access-ktb86\") pod \"model-serving-api-86f7b4b499-77928\" (UID: \"1df53da7-4cdf-4e85-8366-118c8499eab3\") " pod="kserve/model-serving-api-86f7b4b499-77928" Apr 24 21:33:36.923502 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:36.923487 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a306542-2ddf-404d-b60d-11970d6e6b42-cert\") pod \"odh-model-controller-696fc77849-849fm\" (UID: \"7a306542-2ddf-404d-b60d-11970d6e6b42\") " pod="kserve/odh-model-controller-696fc77849-849fm" Apr 24 21:33:37.024136 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:37.024097 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a306542-2ddf-404d-b60d-11970d6e6b42-cert\") pod \"odh-model-controller-696fc77849-849fm\" (UID: \"7a306542-2ddf-404d-b60d-11970d6e6b42\") " pod="kserve/odh-model-controller-696fc77849-849fm" Apr 24 21:33:37.024294 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:37.024160 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-bj98c\" (UniqueName: \"kubernetes.io/projected/7a306542-2ddf-404d-b60d-11970d6e6b42-kube-api-access-bj98c\") pod \"odh-model-controller-696fc77849-849fm\" (UID: \"7a306542-2ddf-404d-b60d-11970d6e6b42\") " pod="kserve/odh-model-controller-696fc77849-849fm" Apr 24 21:33:37.024294 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:37.024203 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1df53da7-4cdf-4e85-8366-118c8499eab3-tls-certs\") pod \"model-serving-api-86f7b4b499-77928\" (UID: \"1df53da7-4cdf-4e85-8366-118c8499eab3\") " pod="kserve/model-serving-api-86f7b4b499-77928" Apr 24 21:33:37.024294 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:37.024224 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ktb86\" (UniqueName: \"kubernetes.io/projected/1df53da7-4cdf-4e85-8366-118c8499eab3-kube-api-access-ktb86\") pod \"model-serving-api-86f7b4b499-77928\" (UID: \"1df53da7-4cdf-4e85-8366-118c8499eab3\") " pod="kserve/model-serving-api-86f7b4b499-77928" Apr 24 21:33:37.024294 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:33:37.024240 2566 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 24 21:33:37.024473 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:33:37.024308 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a306542-2ddf-404d-b60d-11970d6e6b42-cert podName:7a306542-2ddf-404d-b60d-11970d6e6b42 nodeName:}" failed. No retries permitted until 2026-04-24 21:33:37.524290335 +0000 UTC m=+372.709578652 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7a306542-2ddf-404d-b60d-11970d6e6b42-cert") pod "odh-model-controller-696fc77849-849fm" (UID: "7a306542-2ddf-404d-b60d-11970d6e6b42") : secret "odh-model-controller-webhook-cert" not found Apr 24 21:33:37.026677 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:37.026656 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1df53da7-4cdf-4e85-8366-118c8499eab3-tls-certs\") pod \"model-serving-api-86f7b4b499-77928\" (UID: \"1df53da7-4cdf-4e85-8366-118c8499eab3\") " pod="kserve/model-serving-api-86f7b4b499-77928" Apr 24 21:33:37.033191 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:37.033168 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktb86\" (UniqueName: \"kubernetes.io/projected/1df53da7-4cdf-4e85-8366-118c8499eab3-kube-api-access-ktb86\") pod \"model-serving-api-86f7b4b499-77928\" (UID: \"1df53da7-4cdf-4e85-8366-118c8499eab3\") " pod="kserve/model-serving-api-86f7b4b499-77928" Apr 24 21:33:37.033565 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:37.033546 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj98c\" (UniqueName: \"kubernetes.io/projected/7a306542-2ddf-404d-b60d-11970d6e6b42-kube-api-access-bj98c\") pod \"odh-model-controller-696fc77849-849fm\" (UID: \"7a306542-2ddf-404d-b60d-11970d6e6b42\") " pod="kserve/odh-model-controller-696fc77849-849fm" Apr 24 21:33:37.111116 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:37.111015 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-77928" Apr 24 21:33:37.231067 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:37.231032 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-77928"] Apr 24 21:33:37.234203 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:33:37.234173 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1df53da7_4cdf_4e85_8366_118c8499eab3.slice/crio-88020c057debbc7420740c67aefc40bf5962cc81e74d24e0864dd5803bd8d0f5 WatchSource:0}: Error finding container 88020c057debbc7420740c67aefc40bf5962cc81e74d24e0864dd5803bd8d0f5: Status 404 returned error can't find the container with id 88020c057debbc7420740c67aefc40bf5962cc81e74d24e0864dd5803bd8d0f5 Apr 24 21:33:37.235922 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:37.235907 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:33:37.528215 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:37.528168 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a306542-2ddf-404d-b60d-11970d6e6b42-cert\") pod \"odh-model-controller-696fc77849-849fm\" (UID: \"7a306542-2ddf-404d-b60d-11970d6e6b42\") " pod="kserve/odh-model-controller-696fc77849-849fm" Apr 24 21:33:37.530527 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:37.530495 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a306542-2ddf-404d-b60d-11970d6e6b42-cert\") pod \"odh-model-controller-696fc77849-849fm\" (UID: \"7a306542-2ddf-404d-b60d-11970d6e6b42\") " pod="kserve/odh-model-controller-696fc77849-849fm" Apr 24 21:33:37.659237 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:37.659200 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-77928" 
event={"ID":"1df53da7-4cdf-4e85-8366-118c8499eab3","Type":"ContainerStarted","Data":"88020c057debbc7420740c67aefc40bf5962cc81e74d24e0864dd5803bd8d0f5"} Apr 24 21:33:37.719565 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:37.719529 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-849fm" Apr 24 21:33:37.852091 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:37.852052 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-849fm"] Apr 24 21:33:37.856089 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:33:37.856052 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a306542_2ddf_404d_b60d_11970d6e6b42.slice/crio-8dd78e03bf36a574ce3241dd783724faa8e7ea21d16540cbce53b3461337aac5 WatchSource:0}: Error finding container 8dd78e03bf36a574ce3241dd783724faa8e7ea21d16540cbce53b3461337aac5: Status 404 returned error can't find the container with id 8dd78e03bf36a574ce3241dd783724faa8e7ea21d16540cbce53b3461337aac5 Apr 24 21:33:38.666806 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:38.666766 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-849fm" event={"ID":"7a306542-2ddf-404d-b60d-11970d6e6b42","Type":"ContainerStarted","Data":"8dd78e03bf36a574ce3241dd783724faa8e7ea21d16540cbce53b3461337aac5"} Apr 24 21:33:39.672563 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:39.672521 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-77928" event={"ID":"1df53da7-4cdf-4e85-8366-118c8499eab3","Type":"ContainerStarted","Data":"51840fdd0a4b4bdfcef35f4b776dc4b9ccac3f8c89a4982a4ffd4cb664d6ad19"} Apr 24 21:33:39.673029 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:39.672586 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-77928" Apr 24 
21:33:40.677341 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:40.677241 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-849fm" event={"ID":"7a306542-2ddf-404d-b60d-11970d6e6b42","Type":"ContainerStarted","Data":"1858226ebc1155705dcca75bb6c99ebd91d04a03a60474ad53258cc6acfe1766"}
Apr 24 21:33:40.677759 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:40.677462 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-849fm"
Apr 24 21:33:40.697733 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:40.697660 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-849fm" podStartSLOduration=2.13358227 podStartE2EDuration="4.697641145s" podCreationTimestamp="2026-04-24 21:33:36 +0000 UTC" firstStartedPulling="2026-04-24 21:33:37.857639771 +0000 UTC m=+373.042928092" lastFinishedPulling="2026-04-24 21:33:40.421698648 +0000 UTC m=+375.606986967" observedRunningTime="2026-04-24 21:33:40.696133991 +0000 UTC m=+375.881422329" watchObservedRunningTime="2026-04-24 21:33:40.697641145 +0000 UTC m=+375.882929485"
Apr 24 21:33:40.698604 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:40.698555 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-77928" podStartSLOduration=3.123732311 podStartE2EDuration="4.698543212s" podCreationTimestamp="2026-04-24 21:33:36 +0000 UTC" firstStartedPulling="2026-04-24 21:33:37.236052258 +0000 UTC m=+372.421340589" lastFinishedPulling="2026-04-24 21:33:38.810863154 +0000 UTC m=+373.996151490" observedRunningTime="2026-04-24 21:33:39.688606864 +0000 UTC m=+374.873895202" watchObservedRunningTime="2026-04-24 21:33:40.698543212 +0000 UTC m=+375.883831552"
Apr 24 21:33:41.634976 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:41.634911 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6c78c8bd4b-nkth6"]
Apr 24 21:33:41.638614 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:41.638590 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c78c8bd4b-nkth6"
Apr 24 21:33:41.642723 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:41.642686 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c78c8bd4b-nkth6"]
Apr 24 21:33:41.766497 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:41.766455 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3d2a3bdb-2eed-461e-9683-5cf79273db75-console-oauth-config\") pod \"console-6c78c8bd4b-nkth6\" (UID: \"3d2a3bdb-2eed-461e-9683-5cf79273db75\") " pod="openshift-console/console-6c78c8bd4b-nkth6"
Apr 24 21:33:41.766911 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:41.766510 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d2a3bdb-2eed-461e-9683-5cf79273db75-trusted-ca-bundle\") pod \"console-6c78c8bd4b-nkth6\" (UID: \"3d2a3bdb-2eed-461e-9683-5cf79273db75\") " pod="openshift-console/console-6c78c8bd4b-nkth6"
Apr 24 21:33:41.766911 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:41.766545 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3d2a3bdb-2eed-461e-9683-5cf79273db75-console-config\") pod \"console-6c78c8bd4b-nkth6\" (UID: \"3d2a3bdb-2eed-461e-9683-5cf79273db75\") " pod="openshift-console/console-6c78c8bd4b-nkth6"
Apr 24 21:33:41.766911 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:41.766661 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxz56\" (UniqueName: \"kubernetes.io/projected/3d2a3bdb-2eed-461e-9683-5cf79273db75-kube-api-access-dxz56\") pod \"console-6c78c8bd4b-nkth6\" (UID: \"3d2a3bdb-2eed-461e-9683-5cf79273db75\") " pod="openshift-console/console-6c78c8bd4b-nkth6"
Apr 24 21:33:41.766911 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:41.766696 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3d2a3bdb-2eed-461e-9683-5cf79273db75-service-ca\") pod \"console-6c78c8bd4b-nkth6\" (UID: \"3d2a3bdb-2eed-461e-9683-5cf79273db75\") " pod="openshift-console/console-6c78c8bd4b-nkth6"
Apr 24 21:33:41.766911 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:41.766780 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d2a3bdb-2eed-461e-9683-5cf79273db75-console-serving-cert\") pod \"console-6c78c8bd4b-nkth6\" (UID: \"3d2a3bdb-2eed-461e-9683-5cf79273db75\") " pod="openshift-console/console-6c78c8bd4b-nkth6"
Apr 24 21:33:41.766911 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:41.766809 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3d2a3bdb-2eed-461e-9683-5cf79273db75-oauth-serving-cert\") pod \"console-6c78c8bd4b-nkth6\" (UID: \"3d2a3bdb-2eed-461e-9683-5cf79273db75\") " pod="openshift-console/console-6c78c8bd4b-nkth6"
Apr 24 21:33:41.867811 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:41.867771 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3d2a3bdb-2eed-461e-9683-5cf79273db75-service-ca\") pod \"console-6c78c8bd4b-nkth6\" (UID: \"3d2a3bdb-2eed-461e-9683-5cf79273db75\") " pod="openshift-console/console-6c78c8bd4b-nkth6"
Apr 24 21:33:41.868055 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:41.867821 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d2a3bdb-2eed-461e-9683-5cf79273db75-console-serving-cert\") pod \"console-6c78c8bd4b-nkth6\" (UID: \"3d2a3bdb-2eed-461e-9683-5cf79273db75\") " pod="openshift-console/console-6c78c8bd4b-nkth6"
Apr 24 21:33:41.868055 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:41.867951 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3d2a3bdb-2eed-461e-9683-5cf79273db75-oauth-serving-cert\") pod \"console-6c78c8bd4b-nkth6\" (UID: \"3d2a3bdb-2eed-461e-9683-5cf79273db75\") " pod="openshift-console/console-6c78c8bd4b-nkth6"
Apr 24 21:33:41.868183 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:41.868063 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3d2a3bdb-2eed-461e-9683-5cf79273db75-console-oauth-config\") pod \"console-6c78c8bd4b-nkth6\" (UID: \"3d2a3bdb-2eed-461e-9683-5cf79273db75\") " pod="openshift-console/console-6c78c8bd4b-nkth6"
Apr 24 21:33:41.868183 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:41.868093 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d2a3bdb-2eed-461e-9683-5cf79273db75-trusted-ca-bundle\") pod \"console-6c78c8bd4b-nkth6\" (UID: \"3d2a3bdb-2eed-461e-9683-5cf79273db75\") " pod="openshift-console/console-6c78c8bd4b-nkth6"
Apr 24 21:33:41.868183 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:41.868150 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3d2a3bdb-2eed-461e-9683-5cf79273db75-console-config\") pod \"console-6c78c8bd4b-nkth6\" (UID: \"3d2a3bdb-2eed-461e-9683-5cf79273db75\") " pod="openshift-console/console-6c78c8bd4b-nkth6"
Apr 24 21:33:41.868322 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:41.868184 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dxz56\" (UniqueName: \"kubernetes.io/projected/3d2a3bdb-2eed-461e-9683-5cf79273db75-kube-api-access-dxz56\") pod \"console-6c78c8bd4b-nkth6\" (UID: \"3d2a3bdb-2eed-461e-9683-5cf79273db75\") " pod="openshift-console/console-6c78c8bd4b-nkth6"
Apr 24 21:33:41.868669 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:41.868642 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3d2a3bdb-2eed-461e-9683-5cf79273db75-oauth-serving-cert\") pod \"console-6c78c8bd4b-nkth6\" (UID: \"3d2a3bdb-2eed-461e-9683-5cf79273db75\") " pod="openshift-console/console-6c78c8bd4b-nkth6"
Apr 24 21:33:41.868788 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:41.868641 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3d2a3bdb-2eed-461e-9683-5cf79273db75-service-ca\") pod \"console-6c78c8bd4b-nkth6\" (UID: \"3d2a3bdb-2eed-461e-9683-5cf79273db75\") " pod="openshift-console/console-6c78c8bd4b-nkth6"
Apr 24 21:33:41.868982 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:41.868939 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d2a3bdb-2eed-461e-9683-5cf79273db75-trusted-ca-bundle\") pod \"console-6c78c8bd4b-nkth6\" (UID: \"3d2a3bdb-2eed-461e-9683-5cf79273db75\") " pod="openshift-console/console-6c78c8bd4b-nkth6"
Apr 24 21:33:41.869054 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:41.868952 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3d2a3bdb-2eed-461e-9683-5cf79273db75-console-config\") pod \"console-6c78c8bd4b-nkth6\" (UID: \"3d2a3bdb-2eed-461e-9683-5cf79273db75\") " pod="openshift-console/console-6c78c8bd4b-nkth6"
Apr 24 21:33:41.870358 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:41.870337 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d2a3bdb-2eed-461e-9683-5cf79273db75-console-serving-cert\") pod \"console-6c78c8bd4b-nkth6\" (UID: \"3d2a3bdb-2eed-461e-9683-5cf79273db75\") " pod="openshift-console/console-6c78c8bd4b-nkth6"
Apr 24 21:33:41.870460 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:41.870352 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3d2a3bdb-2eed-461e-9683-5cf79273db75-console-oauth-config\") pod \"console-6c78c8bd4b-nkth6\" (UID: \"3d2a3bdb-2eed-461e-9683-5cf79273db75\") " pod="openshift-console/console-6c78c8bd4b-nkth6"
Apr 24 21:33:41.878147 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:41.878124 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxz56\" (UniqueName: \"kubernetes.io/projected/3d2a3bdb-2eed-461e-9683-5cf79273db75-kube-api-access-dxz56\") pod \"console-6c78c8bd4b-nkth6\" (UID: \"3d2a3bdb-2eed-461e-9683-5cf79273db75\") " pod="openshift-console/console-6c78c8bd4b-nkth6"
Apr 24 21:33:41.949240 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:41.949199 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c78c8bd4b-nkth6"
Apr 24 21:33:42.069439 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:42.069392 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c78c8bd4b-nkth6"]
Apr 24 21:33:42.072333 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:33:42.072306 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d2a3bdb_2eed_461e_9683_5cf79273db75.slice/crio-99d4c09457aa5bc57edd28617fa59a072c08f9853b0f762e55a75c2af16786bd WatchSource:0}: Error finding container 99d4c09457aa5bc57edd28617fa59a072c08f9853b0f762e55a75c2af16786bd: Status 404 returned error can't find the container with id 99d4c09457aa5bc57edd28617fa59a072c08f9853b0f762e55a75c2af16786bd
Apr 24 21:33:42.684821 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:42.684787 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c78c8bd4b-nkth6" event={"ID":"3d2a3bdb-2eed-461e-9683-5cf79273db75","Type":"ContainerStarted","Data":"70fe43118c017110c390a2eb56f3d80335c3fa64ab35f2236484cfd8cc9adede"}
Apr 24 21:33:42.684821 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:42.684820 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c78c8bd4b-nkth6" event={"ID":"3d2a3bdb-2eed-461e-9683-5cf79273db75","Type":"ContainerStarted","Data":"99d4c09457aa5bc57edd28617fa59a072c08f9853b0f762e55a75c2af16786bd"}
Apr 24 21:33:42.704395 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:42.704346 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6c78c8bd4b-nkth6" podStartSLOduration=1.704328106 podStartE2EDuration="1.704328106s" podCreationTimestamp="2026-04-24 21:33:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:33:42.702661706 +0000 UTC m=+377.887950045" watchObservedRunningTime="2026-04-24 21:33:42.704328106 +0000 UTC m=+377.889616446"
Apr 24 21:33:50.681780 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:50.681748 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-77928"
Apr 24 21:33:51.683667 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:51.683638 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-849fm"
Apr 24 21:33:51.950104 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:51.950001 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6c78c8bd4b-nkth6"
Apr 24 21:33:51.950104 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:51.950065 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6c78c8bd4b-nkth6"
Apr 24 21:33:51.954872 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:51.954849 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6c78c8bd4b-nkth6"
Apr 24 21:33:52.721405 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:52.721375 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6c78c8bd4b-nkth6"
Apr 24 21:33:52.791582 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:33:52.791550 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-dd9f5dbfb-btccq"]
Apr 24 21:34:17.812455 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:34:17.812385 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-dd9f5dbfb-btccq" podUID="215db3af-4b00-43ad-887d-b028bf9befeb" containerName="console" containerID="cri-o://067fdce6ef51a099a64fbc3242e492281a0fda24cdaa796f4630d25052d5b29b" gracePeriod=15
Apr 24 21:34:18.040044 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:34:18.040023 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-dd9f5dbfb-btccq_215db3af-4b00-43ad-887d-b028bf9befeb/console/0.log"
Apr 24 21:34:18.040163 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:34:18.040081 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-dd9f5dbfb-btccq"
Apr 24 21:34:18.172459 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:34:18.172429 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/215db3af-4b00-43ad-887d-b028bf9befeb-service-ca\") pod \"215db3af-4b00-43ad-887d-b028bf9befeb\" (UID: \"215db3af-4b00-43ad-887d-b028bf9befeb\") "
Apr 24 21:34:18.172631 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:34:18.172469 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/215db3af-4b00-43ad-887d-b028bf9befeb-oauth-serving-cert\") pod \"215db3af-4b00-43ad-887d-b028bf9befeb\" (UID: \"215db3af-4b00-43ad-887d-b028bf9befeb\") "
Apr 24 21:34:18.172631 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:34:18.172513 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/215db3af-4b00-43ad-887d-b028bf9befeb-console-config\") pod \"215db3af-4b00-43ad-887d-b028bf9befeb\" (UID: \"215db3af-4b00-43ad-887d-b028bf9befeb\") "
Apr 24 21:34:18.172631 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:34:18.172561 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/215db3af-4b00-43ad-887d-b028bf9befeb-console-oauth-config\") pod \"215db3af-4b00-43ad-887d-b028bf9befeb\" (UID: \"215db3af-4b00-43ad-887d-b028bf9befeb\") "
Apr 24 21:34:18.172631 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:34:18.172615 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/215db3af-4b00-43ad-887d-b028bf9befeb-trusted-ca-bundle\") pod \"215db3af-4b00-43ad-887d-b028bf9befeb\" (UID: \"215db3af-4b00-43ad-887d-b028bf9befeb\") "
Apr 24 21:34:18.172834 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:34:18.172647 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/215db3af-4b00-43ad-887d-b028bf9befeb-console-serving-cert\") pod \"215db3af-4b00-43ad-887d-b028bf9befeb\" (UID: \"215db3af-4b00-43ad-887d-b028bf9befeb\") "
Apr 24 21:34:18.172834 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:34:18.172688 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9l77\" (UniqueName: \"kubernetes.io/projected/215db3af-4b00-43ad-887d-b028bf9befeb-kube-api-access-w9l77\") pod \"215db3af-4b00-43ad-887d-b028bf9befeb\" (UID: \"215db3af-4b00-43ad-887d-b028bf9befeb\") "
Apr 24 21:34:18.172940 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:34:18.172837 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/215db3af-4b00-43ad-887d-b028bf9befeb-service-ca" (OuterVolumeSpecName: "service-ca") pod "215db3af-4b00-43ad-887d-b028bf9befeb" (UID: "215db3af-4b00-43ad-887d-b028bf9befeb"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:34:18.173034 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:34:18.172941 2566 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/215db3af-4b00-43ad-887d-b028bf9befeb-service-ca\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\""
Apr 24 21:34:18.173034 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:34:18.173005 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/215db3af-4b00-43ad-887d-b028bf9befeb-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "215db3af-4b00-43ad-887d-b028bf9befeb" (UID: "215db3af-4b00-43ad-887d-b028bf9befeb"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:34:18.173182 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:34:18.173155 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/215db3af-4b00-43ad-887d-b028bf9befeb-console-config" (OuterVolumeSpecName: "console-config") pod "215db3af-4b00-43ad-887d-b028bf9befeb" (UID: "215db3af-4b00-43ad-887d-b028bf9befeb"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:34:18.173299 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:34:18.173266 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/215db3af-4b00-43ad-887d-b028bf9befeb-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "215db3af-4b00-43ad-887d-b028bf9befeb" (UID: "215db3af-4b00-43ad-887d-b028bf9befeb"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:34:18.174986 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:34:18.174942 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/215db3af-4b00-43ad-887d-b028bf9befeb-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "215db3af-4b00-43ad-887d-b028bf9befeb" (UID: "215db3af-4b00-43ad-887d-b028bf9befeb"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:34:18.175097 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:34:18.175053 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/215db3af-4b00-43ad-887d-b028bf9befeb-kube-api-access-w9l77" (OuterVolumeSpecName: "kube-api-access-w9l77") pod "215db3af-4b00-43ad-887d-b028bf9befeb" (UID: "215db3af-4b00-43ad-887d-b028bf9befeb"). InnerVolumeSpecName "kube-api-access-w9l77". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:34:18.175097 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:34:18.175073 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/215db3af-4b00-43ad-887d-b028bf9befeb-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "215db3af-4b00-43ad-887d-b028bf9befeb" (UID: "215db3af-4b00-43ad-887d-b028bf9befeb"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:34:18.273831 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:34:18.273792 2566 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/215db3af-4b00-43ad-887d-b028bf9befeb-console-config\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\""
Apr 24 21:34:18.273831 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:34:18.273825 2566 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/215db3af-4b00-43ad-887d-b028bf9befeb-console-oauth-config\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\""
Apr 24 21:34:18.273831 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:34:18.273834 2566 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/215db3af-4b00-43ad-887d-b028bf9befeb-trusted-ca-bundle\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\""
Apr 24 21:34:18.274100 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:34:18.273844 2566 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/215db3af-4b00-43ad-887d-b028bf9befeb-console-serving-cert\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\""
Apr 24 21:34:18.274100 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:34:18.273853 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w9l77\" (UniqueName: \"kubernetes.io/projected/215db3af-4b00-43ad-887d-b028bf9befeb-kube-api-access-w9l77\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\""
Apr 24 21:34:18.274100 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:34:18.273862 2566 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/215db3af-4b00-43ad-887d-b028bf9befeb-oauth-serving-cert\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\""
Apr 24 21:34:18.807071 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:34:18.807046 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-dd9f5dbfb-btccq_215db3af-4b00-43ad-887d-b028bf9befeb/console/0.log"
Apr 24 21:34:18.807241 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:34:18.807084 2566 generic.go:358] "Generic (PLEG): container finished" podID="215db3af-4b00-43ad-887d-b028bf9befeb" containerID="067fdce6ef51a099a64fbc3242e492281a0fda24cdaa796f4630d25052d5b29b" exitCode=2
Apr 24 21:34:18.807241 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:34:18.807121 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dd9f5dbfb-btccq" event={"ID":"215db3af-4b00-43ad-887d-b028bf9befeb","Type":"ContainerDied","Data":"067fdce6ef51a099a64fbc3242e492281a0fda24cdaa796f4630d25052d5b29b"}
Apr 24 21:34:18.807241 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:34:18.807149 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-dd9f5dbfb-btccq"
Apr 24 21:34:18.807241 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:34:18.807164 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dd9f5dbfb-btccq" event={"ID":"215db3af-4b00-43ad-887d-b028bf9befeb","Type":"ContainerDied","Data":"2342ffa6c4c4178b69c60743f741423e950511757e48bb7d4dabb01768081392"}
Apr 24 21:34:18.807241 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:34:18.807177 2566 scope.go:117] "RemoveContainer" containerID="067fdce6ef51a099a64fbc3242e492281a0fda24cdaa796f4630d25052d5b29b"
Apr 24 21:34:18.815336 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:34:18.815206 2566 scope.go:117] "RemoveContainer" containerID="067fdce6ef51a099a64fbc3242e492281a0fda24cdaa796f4630d25052d5b29b"
Apr 24 21:34:18.815588 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:34:18.815495 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"067fdce6ef51a099a64fbc3242e492281a0fda24cdaa796f4630d25052d5b29b\": container with ID starting with 067fdce6ef51a099a64fbc3242e492281a0fda24cdaa796f4630d25052d5b29b not found: ID does not exist" containerID="067fdce6ef51a099a64fbc3242e492281a0fda24cdaa796f4630d25052d5b29b"
Apr 24 21:34:18.815588 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:34:18.815522 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"067fdce6ef51a099a64fbc3242e492281a0fda24cdaa796f4630d25052d5b29b"} err="failed to get container status \"067fdce6ef51a099a64fbc3242e492281a0fda24cdaa796f4630d25052d5b29b\": rpc error: code = NotFound desc = could not find container \"067fdce6ef51a099a64fbc3242e492281a0fda24cdaa796f4630d25052d5b29b\": container with ID starting with 067fdce6ef51a099a64fbc3242e492281a0fda24cdaa796f4630d25052d5b29b not found: ID does not exist"
Apr 24 21:34:18.828377 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:34:18.828348 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-dd9f5dbfb-btccq"]
Apr 24 21:34:18.837355 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:34:18.837329 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-dd9f5dbfb-btccq"]
Apr 24 21:34:19.380550 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:34:19.380520 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="215db3af-4b00-43ad-887d-b028bf9befeb" path="/var/lib/kubelet/pods/215db3af-4b00-43ad-887d-b028bf9befeb/volumes"
Apr 24 21:37:25.288084 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:37:25.288053 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kvjv_e5028c4f-ef6b-4051-a2c3-1def0a14889f/ovn-acl-logging/0.log"
Apr 24 21:37:25.290467 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:37:25.290447 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kvjv_e5028c4f-ef6b-4051-a2c3-1def0a14889f/ovn-acl-logging/0.log"
Apr 24 21:37:37.390216 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:37:37.390184 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-7bb74-5899c94ff8-n7d2h"]
Apr 24 21:37:37.392544 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:37:37.390483 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="215db3af-4b00-43ad-887d-b028bf9befeb" containerName="console"
Apr 24 21:37:37.392544 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:37:37.390493 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="215db3af-4b00-43ad-887d-b028bf9befeb" containerName="console"
Apr 24 21:37:37.392544 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:37:37.390559 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="215db3af-4b00-43ad-887d-b028bf9befeb" containerName="console"
Apr 24 21:37:37.393430 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:37:37.393414 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-7bb74-5899c94ff8-n7d2h"
Apr 24 21:37:37.395419 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:37:37.395397 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-7bb74-kube-rbac-proxy-sar-config\""
Apr 24 21:37:37.395528 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:37:37.395482 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-7bb74-serving-cert\""
Apr 24 21:37:37.395528 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:37:37.395492 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 24 21:37:37.395528 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:37:37.395502 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-8t9rb\""
Apr 24 21:37:37.402291 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:37:37.402270 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-7bb74-5899c94ff8-n7d2h"]
Apr 24 21:37:37.494945 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:37:37.494911 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f49ae6ba-9b34-4cec-a9b7-537a3730a194-proxy-tls\") pod \"model-chainer-raw-7bb74-5899c94ff8-n7d2h\" (UID: \"f49ae6ba-9b34-4cec-a9b7-537a3730a194\") " pod="kserve-ci-e2e-test/model-chainer-raw-7bb74-5899c94ff8-n7d2h"
Apr 24 21:37:37.495144 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:37:37.494999 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f49ae6ba-9b34-4cec-a9b7-537a3730a194-openshift-service-ca-bundle\") pod \"model-chainer-raw-7bb74-5899c94ff8-n7d2h\" (UID: \"f49ae6ba-9b34-4cec-a9b7-537a3730a194\") " pod="kserve-ci-e2e-test/model-chainer-raw-7bb74-5899c94ff8-n7d2h"
Apr 24 21:37:37.596135 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:37:37.596095 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f49ae6ba-9b34-4cec-a9b7-537a3730a194-proxy-tls\") pod \"model-chainer-raw-7bb74-5899c94ff8-n7d2h\" (UID: \"f49ae6ba-9b34-4cec-a9b7-537a3730a194\") " pod="kserve-ci-e2e-test/model-chainer-raw-7bb74-5899c94ff8-n7d2h"
Apr 24 21:37:37.596322 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:37:37.596159 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f49ae6ba-9b34-4cec-a9b7-537a3730a194-openshift-service-ca-bundle\") pod \"model-chainer-raw-7bb74-5899c94ff8-n7d2h\" (UID: \"f49ae6ba-9b34-4cec-a9b7-537a3730a194\") " pod="kserve-ci-e2e-test/model-chainer-raw-7bb74-5899c94ff8-n7d2h"
Apr 24 21:37:37.596774 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:37:37.596752 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f49ae6ba-9b34-4cec-a9b7-537a3730a194-openshift-service-ca-bundle\") pod \"model-chainer-raw-7bb74-5899c94ff8-n7d2h\" (UID: \"f49ae6ba-9b34-4cec-a9b7-537a3730a194\") " pod="kserve-ci-e2e-test/model-chainer-raw-7bb74-5899c94ff8-n7d2h"
Apr 24 21:37:37.598674 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:37:37.598650 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f49ae6ba-9b34-4cec-a9b7-537a3730a194-proxy-tls\") pod \"model-chainer-raw-7bb74-5899c94ff8-n7d2h\" (UID: \"f49ae6ba-9b34-4cec-a9b7-537a3730a194\") " pod="kserve-ci-e2e-test/model-chainer-raw-7bb74-5899c94ff8-n7d2h"
Apr 24 21:37:37.704190 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:37:37.704158 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-7bb74-5899c94ff8-n7d2h"
Apr 24 21:37:37.825210 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:37:37.825035 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-7bb74-5899c94ff8-n7d2h"]
Apr 24 21:37:37.827811 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:37:37.827784 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf49ae6ba_9b34_4cec_a9b7_537a3730a194.slice/crio-8f5fea7b6d1d240e0b1304419903542dd411e755920665845b041686a98d5a7d WatchSource:0}: Error finding container 8f5fea7b6d1d240e0b1304419903542dd411e755920665845b041686a98d5a7d: Status 404 returned error can't find the container with id 8f5fea7b6d1d240e0b1304419903542dd411e755920665845b041686a98d5a7d
Apr 24 21:37:38.430944 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:37:38.430906 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-7bb74-5899c94ff8-n7d2h" event={"ID":"f49ae6ba-9b34-4cec-a9b7-537a3730a194","Type":"ContainerStarted","Data":"8f5fea7b6d1d240e0b1304419903542dd411e755920665845b041686a98d5a7d"}
Apr 24 21:37:40.438643 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:37:40.438616 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-7bb74-5899c94ff8-n7d2h" event={"ID":"f49ae6ba-9b34-4cec-a9b7-537a3730a194","Type":"ContainerStarted","Data":"52ddd3eb2394e8b40afbd77aaa27616718d450b9243c39253c167b49ae43a201"}
Apr 24 21:37:40.439027 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:37:40.438670 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-7bb74-5899c94ff8-n7d2h"
Apr 24 21:37:40.456481 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:37:40.456432 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-7bb74-5899c94ff8-n7d2h" podStartSLOduration=1.340625701 podStartE2EDuration="3.456414783s" podCreationTimestamp="2026-04-24 21:37:37 +0000 UTC" firstStartedPulling="2026-04-24 21:37:37.829456158 +0000 UTC m=+613.014744475" lastFinishedPulling="2026-04-24 21:37:39.945245241 +0000 UTC m=+615.130533557" observedRunningTime="2026-04-24 21:37:40.455612946 +0000 UTC m=+615.640901286" watchObservedRunningTime="2026-04-24 21:37:40.456414783 +0000 UTC m=+615.641703122"
Apr 24 21:37:46.449027 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:37:46.448993 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-7bb74-5899c94ff8-n7d2h"
Apr 24 21:37:47.417576 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:37:47.417544 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-7bb74-5899c94ff8-n7d2h"]
Apr 24 21:37:47.417802 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:37:47.417779 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-7bb74-5899c94ff8-n7d2h" podUID="f49ae6ba-9b34-4cec-a9b7-537a3730a194" containerName="model-chainer-raw-7bb74" containerID="cri-o://52ddd3eb2394e8b40afbd77aaa27616718d450b9243c39253c167b49ae43a201" gracePeriod=30
Apr 24 21:37:51.446701 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:37:51.446642 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-7bb74-5899c94ff8-n7d2h" podUID="f49ae6ba-9b34-4cec-a9b7-537a3730a194" containerName="model-chainer-raw-7bb74" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:37:56.446985 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:37:56.446932 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-7bb74-5899c94ff8-n7d2h" podUID="f49ae6ba-9b34-4cec-a9b7-537a3730a194" containerName="model-chainer-raw-7bb74" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:38:01.447456 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:38:01.447417 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-7bb74-5899c94ff8-n7d2h" podUID="f49ae6ba-9b34-4cec-a9b7-537a3730a194" containerName="model-chainer-raw-7bb74" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:38:01.447825 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:38:01.447526 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-7bb74-5899c94ff8-n7d2h"
Apr 24 21:38:06.446922 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:38:06.446880 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-7bb74-5899c94ff8-n7d2h" podUID="f49ae6ba-9b34-4cec-a9b7-537a3730a194" containerName="model-chainer-raw-7bb74" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:38:11.447221 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:38:11.447174 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-7bb74-5899c94ff8-n7d2h" podUID="f49ae6ba-9b34-4cec-a9b7-537a3730a194" containerName="model-chainer-raw-7bb74" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:38:16.446856 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:38:16.446814 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-7bb74-5899c94ff8-n7d2h" podUID="f49ae6ba-9b34-4cec-a9b7-537a3730a194" containerName="model-chainer-raw-7bb74" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:38:17.556568 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:38:17.556538 2566 generic.go:358] "Generic (PLEG): container finished" podID="f49ae6ba-9b34-4cec-a9b7-537a3730a194" containerID="52ddd3eb2394e8b40afbd77aaa27616718d450b9243c39253c167b49ae43a201" exitCode=0
Apr 24 21:38:17.557023 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:38:17.556613 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-7bb74-5899c94ff8-n7d2h" event={"ID":"f49ae6ba-9b34-4cec-a9b7-537a3730a194","Type":"ContainerDied","Data":"52ddd3eb2394e8b40afbd77aaa27616718d450b9243c39253c167b49ae43a201"}
Apr 24 21:38:17.557023 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:38:17.556647 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-7bb74-5899c94ff8-n7d2h" event={"ID":"f49ae6ba-9b34-4cec-a9b7-537a3730a194","Type":"ContainerDied","Data":"8f5fea7b6d1d240e0b1304419903542dd411e755920665845b041686a98d5a7d"}
Apr 24 21:38:17.557023 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:38:17.556657 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f5fea7b6d1d240e0b1304419903542dd411e755920665845b041686a98d5a7d"
Apr 24 21:38:17.560395 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:38:17.560379 2566 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-7bb74-5899c94ff8-n7d2h" Apr 24 21:38:17.611089 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:38:17.611047 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f49ae6ba-9b34-4cec-a9b7-537a3730a194-proxy-tls\") pod \"f49ae6ba-9b34-4cec-a9b7-537a3730a194\" (UID: \"f49ae6ba-9b34-4cec-a9b7-537a3730a194\") " Apr 24 21:38:17.611267 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:38:17.611118 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f49ae6ba-9b34-4cec-a9b7-537a3730a194-openshift-service-ca-bundle\") pod \"f49ae6ba-9b34-4cec-a9b7-537a3730a194\" (UID: \"f49ae6ba-9b34-4cec-a9b7-537a3730a194\") " Apr 24 21:38:17.611472 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:38:17.611447 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f49ae6ba-9b34-4cec-a9b7-537a3730a194-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "f49ae6ba-9b34-4cec-a9b7-537a3730a194" (UID: "f49ae6ba-9b34-4cec-a9b7-537a3730a194"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:38:17.613289 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:38:17.613264 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f49ae6ba-9b34-4cec-a9b7-537a3730a194-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f49ae6ba-9b34-4cec-a9b7-537a3730a194" (UID: "f49ae6ba-9b34-4cec-a9b7-537a3730a194"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:38:17.712300 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:38:17.712216 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f49ae6ba-9b34-4cec-a9b7-537a3730a194-proxy-tls\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\"" Apr 24 21:38:17.712300 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:38:17.712243 2566 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f49ae6ba-9b34-4cec-a9b7-537a3730a194-openshift-service-ca-bundle\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\"" Apr 24 21:38:18.559411 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:38:18.559378 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-7bb74-5899c94ff8-n7d2h" Apr 24 21:38:18.583415 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:38:18.583387 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-7bb74-5899c94ff8-n7d2h"] Apr 24 21:38:18.587614 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:38:18.587589 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-7bb74-5899c94ff8-n7d2h"] Apr 24 21:38:19.380679 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:38:19.380636 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f49ae6ba-9b34-4cec-a9b7-537a3730a194" path="/var/lib/kubelet/pods/f49ae6ba-9b34-4cec-a9b7-537a3730a194/volumes" Apr 24 21:39:17.663499 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:17.663418 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb"] Apr 24 21:39:17.663923 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:17.663756 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f49ae6ba-9b34-4cec-a9b7-537a3730a194" 
containerName="model-chainer-raw-7bb74" Apr 24 21:39:17.663923 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:17.663768 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="f49ae6ba-9b34-4cec-a9b7-537a3730a194" containerName="model-chainer-raw-7bb74" Apr 24 21:39:17.663923 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:17.663812 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="f49ae6ba-9b34-4cec-a9b7-537a3730a194" containerName="model-chainer-raw-7bb74" Apr 24 21:39:17.666630 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:17.666609 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb" Apr 24 21:39:17.668591 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:17.668571 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-21211-kube-rbac-proxy-sar-config\"" Apr 24 21:39:17.668708 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:17.668589 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 21:39:17.668708 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:17.668590 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-8t9rb\"" Apr 24 21:39:17.668865 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:17.668844 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-21211-serving-cert\"" Apr 24 21:39:17.679972 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:17.676135 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb"] Apr 24 21:39:17.698293 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:17.698261 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb\" (UID: \"86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb" Apr 24 21:39:17.698428 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:17.698296 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b-proxy-tls\") pod \"model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb\" (UID: \"86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb" Apr 24 21:39:17.799455 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:17.799424 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb\" (UID: \"86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb" Apr 24 21:39:17.799455 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:17.799461 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b-proxy-tls\") pod \"model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb\" (UID: \"86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb" Apr 24 21:39:17.799686 ip-10-0-131-55 kubenswrapper[2566]: E0424 21:39:17.799592 2566 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-raw-hpa-21211-serving-cert: secret "model-chainer-raw-hpa-21211-serving-cert" not found Apr 24 21:39:17.799686 ip-10-0-131-55 kubenswrapper[2566]: E0424 
21:39:17.799647 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b-proxy-tls podName:86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b nodeName:}" failed. No retries permitted until 2026-04-24 21:39:18.299631757 +0000 UTC m=+713.484920074 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b-proxy-tls") pod "model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb" (UID: "86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b") : secret "model-chainer-raw-hpa-21211-serving-cert" not found Apr 24 21:39:17.800123 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:17.800103 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb\" (UID: \"86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb" Apr 24 21:39:18.303511 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:18.303467 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b-proxy-tls\") pod \"model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb\" (UID: \"86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb" Apr 24 21:39:18.306066 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:18.306034 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b-proxy-tls\") pod \"model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb\" (UID: \"86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb" Apr 24 21:39:18.584451 ip-10-0-131-55 
kubenswrapper[2566]: I0424 21:39:18.584356 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb" Apr 24 21:39:18.703883 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:18.703850 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb"] Apr 24 21:39:18.706743 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:39:18.706716 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86bf6e33_9a0f_4a7b_8f46_856d8cc89f5b.slice/crio-a5a35cf60f7844e80fa6b4c3a6d8a577f96a8a4e306d4cff3e05e2888df4eab1 WatchSource:0}: Error finding container a5a35cf60f7844e80fa6b4c3a6d8a577f96a8a4e306d4cff3e05e2888df4eab1: Status 404 returned error can't find the container with id a5a35cf60f7844e80fa6b4c3a6d8a577f96a8a4e306d4cff3e05e2888df4eab1 Apr 24 21:39:18.708743 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:18.708724 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:39:18.743662 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:18.743633 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb" event={"ID":"86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b","Type":"ContainerStarted","Data":"a5a35cf60f7844e80fa6b4c3a6d8a577f96a8a4e306d4cff3e05e2888df4eab1"} Apr 24 21:39:19.748418 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:19.748382 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb" event={"ID":"86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b","Type":"ContainerStarted","Data":"85115198ea63b95c338fcbec46184af9bc914d79329ee99bd74617d8ab914a2a"} Apr 24 21:39:19.748820 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:19.748478 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb" Apr 24 21:39:19.765390 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:19.765276 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb" podStartSLOduration=2.765258569 podStartE2EDuration="2.765258569s" podCreationTimestamp="2026-04-24 21:39:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:39:19.764419125 +0000 UTC m=+714.949707464" watchObservedRunningTime="2026-04-24 21:39:19.765258569 +0000 UTC m=+714.950546909" Apr 24 21:39:25.756778 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:25.756751 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb" Apr 24 21:39:27.735724 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:27.735694 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb"] Apr 24 21:39:27.736168 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:27.735931 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb" podUID="86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b" containerName="model-chainer-raw-hpa-21211" containerID="cri-o://85115198ea63b95c338fcbec46184af9bc914d79329ee99bd74617d8ab914a2a" gracePeriod=30 Apr 24 21:39:30.755744 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:30.755706 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb" podUID="86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b" containerName="model-chainer-raw-hpa-21211" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:39:35.755105 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:35.755062 
2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb" podUID="86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b" containerName="model-chainer-raw-hpa-21211" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:39:40.755040 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:40.754998 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb" podUID="86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b" containerName="model-chainer-raw-hpa-21211" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:39:40.755402 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:40.755110 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb" Apr 24 21:39:45.755409 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:45.755374 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb" podUID="86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b" containerName="model-chainer-raw-hpa-21211" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:39:50.755422 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:50.755378 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb" podUID="86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b" containerName="model-chainer-raw-hpa-21211" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:39:55.755716 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:55.755676 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb" podUID="86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b" containerName="model-chainer-raw-hpa-21211" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 
21:39:57.866762 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:57.866725 2566 generic.go:358] "Generic (PLEG): container finished" podID="86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b" containerID="85115198ea63b95c338fcbec46184af9bc914d79329ee99bd74617d8ab914a2a" exitCode=0 Apr 24 21:39:57.867147 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:57.866799 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb" event={"ID":"86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b","Type":"ContainerDied","Data":"85115198ea63b95c338fcbec46184af9bc914d79329ee99bd74617d8ab914a2a"} Apr 24 21:39:58.379271 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:58.379237 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb" Apr 24 21:39:58.425263 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:58.425222 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b-proxy-tls\") pod \"86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b\" (UID: \"86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b\") " Apr 24 21:39:58.425431 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:58.425318 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b-openshift-service-ca-bundle\") pod \"86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b\" (UID: \"86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b\") " Apr 24 21:39:58.425662 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:58.425640 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b" (UID: 
"86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:39:58.427241 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:58.427219 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b" (UID: "86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:39:58.526405 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:58.526317 2566 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b-openshift-service-ca-bundle\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\"" Apr 24 21:39:58.526405 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:58.526350 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b-proxy-tls\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\"" Apr 24 21:39:58.871216 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:58.871138 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb" Apr 24 21:39:58.871621 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:58.871144 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb" event={"ID":"86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b","Type":"ContainerDied","Data":"a5a35cf60f7844e80fa6b4c3a6d8a577f96a8a4e306d4cff3e05e2888df4eab1"} Apr 24 21:39:58.871621 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:58.871257 2566 scope.go:117] "RemoveContainer" containerID="85115198ea63b95c338fcbec46184af9bc914d79329ee99bd74617d8ab914a2a" Apr 24 21:39:58.895438 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:58.895412 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb"] Apr 24 21:39:58.898067 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:58.898042 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-21211-64b57f5b4b-hc8gb"] Apr 24 21:39:59.380946 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:39:59.380915 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b" path="/var/lib/kubelet/pods/86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b/volumes" Apr 24 21:42:25.308903 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:42:25.308871 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kvjv_e5028c4f-ef6b-4051-a2c3-1def0a14889f/ovn-acl-logging/0.log" Apr 24 21:42:25.314246 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:42:25.314223 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kvjv_e5028c4f-ef6b-4051-a2c3-1def0a14889f/ovn-acl-logging/0.log" Apr 24 21:44:25.337047 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:44:25.337016 2566 scope.go:117] "RemoveContainer" 
containerID="52ddd3eb2394e8b40afbd77aaa27616718d450b9243c39253c167b49ae43a201" Apr 24 21:47:25.329991 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:47:25.329942 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kvjv_e5028c4f-ef6b-4051-a2c3-1def0a14889f/ovn-acl-logging/0.log" Apr 24 21:47:25.335236 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:47:25.335216 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kvjv_e5028c4f-ef6b-4051-a2c3-1def0a14889f/ovn-acl-logging/0.log" Apr 24 21:48:42.252120 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:42.252086 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-46s9w_5deb9c1a-4ef3-4e20-a3b3-5b3fb0ee1cfd/global-pull-secret-syncer/0.log" Apr 24 21:48:42.457033 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:42.457002 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-vv85h_719c9974-f956-4125-bc24-da51ad2c4d61/konnectivity-agent/0.log" Apr 24 21:48:42.478327 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:42.478293 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-55.ec2.internal_ec74c72a1ae3da2b3b1eef59bb72e15d/haproxy/0.log" Apr 24 21:48:45.874700 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:45.874672 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-5d4ff5bf67-dwkdb_fd000c3f-ff04-44bb-ab20-f064e13433e8/metrics-server/0.log" Apr 24 21:48:46.120475 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:46.120447 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-r2tp8_79217fe6-125e-4107-b954-edf725cbf5a4/node-exporter/0.log" Apr 24 21:48:46.145826 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:46.145759 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-r2tp8_79217fe6-125e-4107-b954-edf725cbf5a4/kube-rbac-proxy/0.log" Apr 24 21:48:46.173920 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:46.173897 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-r2tp8_79217fe6-125e-4107-b954-edf725cbf5a4/init-textfile/0.log" Apr 24 21:48:46.205070 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:46.205044 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-2gdkt_b960a6d8-6c0d-4f69-b153-4c3176aa6145/kube-rbac-proxy-main/0.log" Apr 24 21:48:46.233852 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:46.233832 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-2gdkt_b960a6d8-6c0d-4f69-b153-4c3176aa6145/kube-rbac-proxy-self/0.log" Apr 24 21:48:46.262224 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:46.262204 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-2gdkt_b960a6d8-6c0d-4f69-b153-4c3176aa6145/openshift-state-metrics/0.log" Apr 24 21:48:46.599642 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:46.599617 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5597f7ddb8-xbn5m_dc45bc44-5f31-4395-9780-dbb505c72767/telemeter-client/0.log" Apr 24 21:48:46.634309 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:46.634287 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5597f7ddb8-xbn5m_dc45bc44-5f31-4395-9780-dbb505c72767/reload/0.log" Apr 24 21:48:46.663749 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:46.663725 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5597f7ddb8-xbn5m_dc45bc44-5f31-4395-9780-dbb505c72767/kube-rbac-proxy/0.log" Apr 24 21:48:46.697301 ip-10-0-131-55 
kubenswrapper[2566]: I0424 21:48:46.697277 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c88dc86bd-p22fx_6eb8166d-889c-49d2-b8fa-9193325b23a7/thanos-query/0.log" Apr 24 21:48:46.725773 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:46.725749 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c88dc86bd-p22fx_6eb8166d-889c-49d2-b8fa-9193325b23a7/kube-rbac-proxy-web/0.log" Apr 24 21:48:46.758790 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:46.758767 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c88dc86bd-p22fx_6eb8166d-889c-49d2-b8fa-9193325b23a7/kube-rbac-proxy/0.log" Apr 24 21:48:46.788649 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:46.788629 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c88dc86bd-p22fx_6eb8166d-889c-49d2-b8fa-9193325b23a7/prom-label-proxy/0.log" Apr 24 21:48:46.816927 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:46.816909 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c88dc86bd-p22fx_6eb8166d-889c-49d2-b8fa-9193325b23a7/kube-rbac-proxy-rules/0.log" Apr 24 21:48:46.846665 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:46.846646 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c88dc86bd-p22fx_6eb8166d-889c-49d2-b8fa-9193325b23a7/kube-rbac-proxy-metrics/0.log" Apr 24 21:48:48.966692 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:48.966656 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c78c8bd4b-nkth6_3d2a3bdb-2eed-461e-9683-5cf79273db75/console/0.log" Apr 24 21:48:49.434919 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:49.434888 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ngn69/perf-node-gather-daemonset-nspfr"] Apr 24 21:48:49.435241 
ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:49.435227 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b" containerName="model-chainer-raw-hpa-21211"
Apr 24 21:48:49.435309 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:49.435244 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b" containerName="model-chainer-raw-hpa-21211"
Apr 24 21:48:49.435347 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:49.435310 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="86bf6e33-9a0f-4a7b-8f46-856d8cc89f5b" containerName="model-chainer-raw-hpa-21211"
Apr 24 21:48:49.438215 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:49.438198 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-nspfr"
Apr 24 21:48:49.440129 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:49.440107 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-ngn69\"/\"openshift-service-ca.crt\""
Apr 24 21:48:49.440446 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:49.440430 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-ngn69\"/\"default-dockercfg-b5l6f\""
Apr 24 21:48:49.440497 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:49.440479 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-ngn69\"/\"kube-root-ca.crt\""
Apr 24 21:48:49.447545 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:49.447525 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ngn69/perf-node-gather-daemonset-nspfr"]
Apr 24 21:48:49.523028 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:49.522998 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a9f09f9d-c0bb-412d-9ed0-f22b518f823a-sys\") pod \"perf-node-gather-daemonset-nspfr\" (UID: \"a9f09f9d-c0bb-412d-9ed0-f22b518f823a\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-nspfr"
Apr 24 21:48:49.523187 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:49.523039 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a9f09f9d-c0bb-412d-9ed0-f22b518f823a-lib-modules\") pod \"perf-node-gather-daemonset-nspfr\" (UID: \"a9f09f9d-c0bb-412d-9ed0-f22b518f823a\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-nspfr"
Apr 24 21:48:49.523187 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:49.523066 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a9f09f9d-c0bb-412d-9ed0-f22b518f823a-podres\") pod \"perf-node-gather-daemonset-nspfr\" (UID: \"a9f09f9d-c0bb-412d-9ed0-f22b518f823a\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-nspfr"
Apr 24 21:48:49.523187 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:49.523140 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dwqh\" (UniqueName: \"kubernetes.io/projected/a9f09f9d-c0bb-412d-9ed0-f22b518f823a-kube-api-access-6dwqh\") pod \"perf-node-gather-daemonset-nspfr\" (UID: \"a9f09f9d-c0bb-412d-9ed0-f22b518f823a\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-nspfr"
Apr 24 21:48:49.523296 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:49.523185 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a9f09f9d-c0bb-412d-9ed0-f22b518f823a-proc\") pod \"perf-node-gather-daemonset-nspfr\" (UID: \"a9f09f9d-c0bb-412d-9ed0-f22b518f823a\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-nspfr"
Apr 24 21:48:49.624003 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:49.623947 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a9f09f9d-c0bb-412d-9ed0-f22b518f823a-lib-modules\") pod \"perf-node-gather-daemonset-nspfr\" (UID: \"a9f09f9d-c0bb-412d-9ed0-f22b518f823a\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-nspfr"
Apr 24 21:48:49.624003 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:49.624007 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a9f09f9d-c0bb-412d-9ed0-f22b518f823a-podres\") pod \"perf-node-gather-daemonset-nspfr\" (UID: \"a9f09f9d-c0bb-412d-9ed0-f22b518f823a\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-nspfr"
Apr 24 21:48:49.624232 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:49.624045 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6dwqh\" (UniqueName: \"kubernetes.io/projected/a9f09f9d-c0bb-412d-9ed0-f22b518f823a-kube-api-access-6dwqh\") pod \"perf-node-gather-daemonset-nspfr\" (UID: \"a9f09f9d-c0bb-412d-9ed0-f22b518f823a\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-nspfr"
Apr 24 21:48:49.624232 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:49.624076 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a9f09f9d-c0bb-412d-9ed0-f22b518f823a-proc\") pod \"perf-node-gather-daemonset-nspfr\" (UID: \"a9f09f9d-c0bb-412d-9ed0-f22b518f823a\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-nspfr"
Apr 24 21:48:49.624232 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:49.624100 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a9f09f9d-c0bb-412d-9ed0-f22b518f823a-sys\") pod \"perf-node-gather-daemonset-nspfr\" (UID: \"a9f09f9d-c0bb-412d-9ed0-f22b518f823a\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-nspfr"
Apr 24 21:48:49.624232 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:49.624133 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a9f09f9d-c0bb-412d-9ed0-f22b518f823a-podres\") pod \"perf-node-gather-daemonset-nspfr\" (UID: \"a9f09f9d-c0bb-412d-9ed0-f22b518f823a\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-nspfr"
Apr 24 21:48:49.624232 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:49.624133 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a9f09f9d-c0bb-412d-9ed0-f22b518f823a-lib-modules\") pod \"perf-node-gather-daemonset-nspfr\" (UID: \"a9f09f9d-c0bb-412d-9ed0-f22b518f823a\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-nspfr"
Apr 24 21:48:49.624232 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:49.624172 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a9f09f9d-c0bb-412d-9ed0-f22b518f823a-sys\") pod \"perf-node-gather-daemonset-nspfr\" (UID: \"a9f09f9d-c0bb-412d-9ed0-f22b518f823a\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-nspfr"
Apr 24 21:48:49.624232 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:49.624178 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a9f09f9d-c0bb-412d-9ed0-f22b518f823a-proc\") pod \"perf-node-gather-daemonset-nspfr\" (UID: \"a9f09f9d-c0bb-412d-9ed0-f22b518f823a\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-nspfr"
Apr 24 21:48:49.632001 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:49.631975 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dwqh\" (UniqueName: \"kubernetes.io/projected/a9f09f9d-c0bb-412d-9ed0-f22b518f823a-kube-api-access-6dwqh\") pod \"perf-node-gather-daemonset-nspfr\" (UID: \"a9f09f9d-c0bb-412d-9ed0-f22b518f823a\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-nspfr"
Apr 24 21:48:49.748731 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:49.748641 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-nspfr"
Apr 24 21:48:49.863129 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:49.863104 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ngn69/perf-node-gather-daemonset-nspfr"]
Apr 24 21:48:49.868142 ip-10-0-131-55 kubenswrapper[2566]: W0424 21:48:49.868116 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda9f09f9d_c0bb_412d_9ed0_f22b518f823a.slice/crio-3756f4e58172e5784e5378554a59f68f7a0a0735698602bd6812df9dbadfa956 WatchSource:0}: Error finding container 3756f4e58172e5784e5378554a59f68f7a0a0735698602bd6812df9dbadfa956: Status 404 returned error can't find the container with id 3756f4e58172e5784e5378554a59f68f7a0a0735698602bd6812df9dbadfa956
Apr 24 21:48:49.869792 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:49.869772 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:48:50.196392 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:50.196362 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hwj8d_5b9e2681-cf55-4344-bdcc-7a3176e775c3/dns/0.log"
Apr 24 21:48:50.222271 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:50.222246 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hwj8d_5b9e2681-cf55-4344-bdcc-7a3176e775c3/kube-rbac-proxy/0.log"
Apr 24 21:48:50.343242 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:50.343216 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9lr5f_30105400-c97c-4cc8-ac91-9f6cfe32780b/dns-node-resolver/0.log"
Apr 24 21:48:50.522244 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:50.522165 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-nspfr" event={"ID":"a9f09f9d-c0bb-412d-9ed0-f22b518f823a","Type":"ContainerStarted","Data":"901bc0990047cae19ebe009328c52ed6abcdc55d780c810350aca3e518181e09"}
Apr 24 21:48:50.522244 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:50.522199 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-nspfr" event={"ID":"a9f09f9d-c0bb-412d-9ed0-f22b518f823a","Type":"ContainerStarted","Data":"3756f4e58172e5784e5378554a59f68f7a0a0735698602bd6812df9dbadfa956"}
Apr 24 21:48:50.522455 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:50.522334 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-nspfr"
Apr 24 21:48:50.540066 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:50.540023 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-nspfr" podStartSLOduration=1.54001001 podStartE2EDuration="1.54001001s" podCreationTimestamp="2026-04-24 21:48:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:48:50.538003699 +0000 UTC m=+1285.723292038" watchObservedRunningTime="2026-04-24 21:48:50.54001001 +0000 UTC m=+1285.725298350"
Apr 24 21:48:50.835509 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:50.835434 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-847fd97c69-9sv27_64b8fe7c-275c-42d9-88e4-e27695b15732/registry/0.log"
Apr 24 21:48:50.883717 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:50.883694 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7pgjs_6d8e7b20-2410-4675-a443-408c37cdef11/node-ca/0.log"
Apr 24 21:48:52.086342 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:52.086311 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-fgkjf_117a76ab-9714-4cfb-a82c-ed5796386584/serve-healthcheck-canary/0.log"
Apr 24 21:48:52.500662 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:52.500638 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6276r_a273a8f0-e8c5-4a44-a5c8-ae8cd3910e9c/kube-rbac-proxy/0.log"
Apr 24 21:48:52.524412 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:52.524392 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6276r_a273a8f0-e8c5-4a44-a5c8-ae8cd3910e9c/exporter/0.log"
Apr 24 21:48:52.549950 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:52.549929 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6276r_a273a8f0-e8c5-4a44-a5c8-ae8cd3910e9c/extractor/0.log"
Apr 24 21:48:54.679255 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:54.679231 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-l2j7h_c0e0ec12-0c87-401b-b01c-c370097ce9cd/manager/0.log"
Apr 24 21:48:54.712597 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:54.712575 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-77928_1df53da7-4cdf-4e85-8366-118c8499eab3/server/0.log"
Apr 24 21:48:54.808905 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:54.808879 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-849fm_7a306542-2ddf-404d-b60d-11970d6e6b42/manager/0.log"
Apr 24 21:48:56.535245 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:48:56.535210 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-nspfr"
Apr 24 21:49:00.633851 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:49:00.633825 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-64swc_abed56c4-528e-496e-b85c-a6fe11c4f6e3/kube-multus-additional-cni-plugins/0.log"
Apr 24 21:49:00.666162 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:49:00.666107 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-64swc_abed56c4-528e-496e-b85c-a6fe11c4f6e3/egress-router-binary-copy/0.log"
Apr 24 21:49:00.696508 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:49:00.696487 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-64swc_abed56c4-528e-496e-b85c-a6fe11c4f6e3/cni-plugins/0.log"
Apr 24 21:49:00.726971 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:49:00.726939 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-64swc_abed56c4-528e-496e-b85c-a6fe11c4f6e3/bond-cni-plugin/0.log"
Apr 24 21:49:00.754136 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:49:00.754118 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-64swc_abed56c4-528e-496e-b85c-a6fe11c4f6e3/routeoverride-cni/0.log"
Apr 24 21:49:00.783305 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:49:00.783284 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-64swc_abed56c4-528e-496e-b85c-a6fe11c4f6e3/whereabouts-cni-bincopy/0.log"
Apr 24 21:49:00.815147 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:49:00.815128 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-64swc_abed56c4-528e-496e-b85c-a6fe11c4f6e3/whereabouts-cni/0.log"
Apr 24 21:49:01.300023 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:49:01.299993 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sslzj_c4524175-6c2f-4026-ac93-751748e5a1c4/kube-multus/0.log"
Apr 24 21:49:01.402272 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:49:01.402248 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-5b7z5_9f38edd3-fb52-42bc-b164-d84e78cffcc0/network-metrics-daemon/0.log"
Apr 24 21:49:01.434818 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:49:01.434795 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-5b7z5_9f38edd3-fb52-42bc-b164-d84e78cffcc0/kube-rbac-proxy/0.log"
Apr 24 21:49:02.364560 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:49:02.364527 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kvjv_e5028c4f-ef6b-4051-a2c3-1def0a14889f/ovn-controller/0.log"
Apr 24 21:49:02.387396 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:49:02.387374 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kvjv_e5028c4f-ef6b-4051-a2c3-1def0a14889f/ovn-acl-logging/0.log"
Apr 24 21:49:02.393751 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:49:02.393704 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kvjv_e5028c4f-ef6b-4051-a2c3-1def0a14889f/ovn-acl-logging/1.log"
Apr 24 21:49:02.413385 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:49:02.413367 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kvjv_e5028c4f-ef6b-4051-a2c3-1def0a14889f/kube-rbac-proxy-node/0.log"
Apr 24 21:49:02.437323 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:49:02.437288 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kvjv_e5028c4f-ef6b-4051-a2c3-1def0a14889f/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 21:49:02.462835 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:49:02.462813 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kvjv_e5028c4f-ef6b-4051-a2c3-1def0a14889f/northd/0.log"
Apr 24 21:49:02.486880 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:49:02.486862 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kvjv_e5028c4f-ef6b-4051-a2c3-1def0a14889f/nbdb/0.log"
Apr 24 21:49:02.511840 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:49:02.511818 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kvjv_e5028c4f-ef6b-4051-a2c3-1def0a14889f/sbdb/0.log"
Apr 24 21:49:02.622723 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:49:02.622638 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7kvjv_e5028c4f-ef6b-4051-a2c3-1def0a14889f/ovnkube-controller/0.log"
Apr 24 21:49:04.332566 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:49:04.332541 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-kwjx6_7799825c-48bb-465a-adc5-2d2c43a525df/network-check-target-container/0.log"
Apr 24 21:49:05.281055 ip-10-0-131-55 kubenswrapper[2566]: I0424 21:49:05.281027 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-cqnxb_6fcca08a-b4ba-4f45-862a-1e503776cfe8/iptables-alerter/0.log"